Science.gov

Sample records for augmented reality engineering

  1. Usability Engineering: Domain Analysis Activities for Augmented Reality Systems

    DTIC Science & Technology

    2002-01-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured...management principles and techniques, formal and semiformal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system...originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented

  2. Confronting an Augmented Reality

    ERIC Educational Resources Information Center

    Munnerley, Danny; Bacon, Matt; Wilson, Anna; Steele, James; Hedberg, John; Fitzgerald, Robert

    2012-01-01

    How can educators make use of augmented reality technologies and practices to enhance learning and why would we want to embrace such technologies anyway? How can an augmented reality help a learner confront, interpret and ultimately comprehend reality itself? In this article, we seek to initiate a discussion that focuses on these questions, and…

  3. Augmented Reality in astrophysics

    NASA Astrophysics Data System (ADS)

    Vogt, Frédéric P. A.; Shingles, Luke J.

    2013-09-01

    Augmented Reality consists of merging live images with virtual layers of information. The rapid growth in the popularity of smartphones and tablets over recent years has provided a large base of potential users of Augmented Reality technology, and virtual layers of information can now be attached to a wide variety of physical objects. In this article, we explore the potential of Augmented Reality for astrophysical research with two distinct experiments: (1) Augmented Posters and (2) Augmented Articles. We demonstrate that the emerging technology of Augmented Reality can already be used and implemented without expert knowledge using currently available apps. Our experiments highlight the potential of Augmented Reality to improve the communication of scientific results in the field of astrophysics. We also present feedback gathered from the Australian astrophysics community that reveals evidence of some interest in this technology by astronomers who experimented with Augmented Posters. In addition, we discuss possible future trends for Augmented Reality applications in astrophysics, and explore the current limitations associated with the technology. This Augmented Article, the first of its kind, is designed to allow the reader to directly experiment with this technology.

  4. Augmented reality system

    NASA Astrophysics Data System (ADS)

    Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng

    2010-08-01

    In recent years, Augmented Reality (AR)[1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in Virtual Reality (VR) fields, such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment and the arts. AR enhances the display of the real environment with specific interactive functions and object recognition, and can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering and remote robot control. AR has many advantages over VR. The system developed here combines sensors, software and imaging algorithms to make the augmentation feel real and present to users. The imaging algorithms include a gray-level method, an image binarization method, and a white balance method to achieve accurate image recognition and overcome the effects of lighting.
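
    As an illustration of the imaging steps named in this abstract, the minimal Python sketch below shows a gray-world white balance and a gray-level binarization; the function names, luminance weights, and threshold are illustrative assumptions rather than the authors' implementation.

    ```python
    import numpy as np

    def white_balance_gray_world(rgb: np.ndarray) -> np.ndarray:
        """Scale each channel so its mean matches the overall mean (gray-world assumption)."""
        means = rgb.reshape(-1, 3).mean(axis=0)        # per-channel means
        gain = means.mean() / np.maximum(means, 1e-6)  # per-channel gains
        return np.clip(rgb * gain, 0, 255).astype(np.uint8)

    def binarize(rgb: np.ndarray, threshold: int = 128) -> np.ndarray:
        """Convert to gray level and threshold to a binary image for marker recognition."""
        gray = rgb @ np.array([0.299, 0.587, 0.114])   # luminance weights (assumed)
        return (gray > threshold).astype(np.uint8) * 255

    # Usage: balanced = white_balance_gray_world(frame); mask = binarize(balanced)
    ```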

  5. Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality.

    PubMed

    Grubert, Jens; Langlotz, Tobias; Zollmann, Stefanie; Regenbrecht, Holger

    2016-03-17

    Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. While it has been researched for decades, Augmented Reality has more recently moved out of the research labs and into the field. While most applications are used sporadically and for one particular task only, current and future scenarios will provide a continuous and multi-purpose user experience. Therefore, in this paper, we present the concept of Pervasive Augmented Reality, aiming to provide such an experience by sensing the user's current context and adapting the AR system based on the changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality, which classifies context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges for future research directions in Pervasive Augmented Reality.

  6. A Tracker Alignment Framework for Augmented Reality

    DTIC Science & Technology

    2003-01-01

    Yohan Baillot and Simon J. Julier, ITT Advanced Engineering & Sciences, 2560 Huntington Ave. ...with as few as three measurements. Almost all Augmented Reality (AR) systems use a tracking system to capture motion of objects in...

  7. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  8. Augmented Reality Binoculars.

    PubMed

    Oskiper, Taragay; Sizintsev, Mikhail; Branzoi, Vlad; Samarasekera, Supun; Kumar, Rakesh

    2015-05-01

    In this paper we present an augmented reality binocular system to allow long range high precision augmentation of live telescopic imagery with aerial and terrain based synthetic objects, vehicles, people and effects. The inserted objects must appear stable in the display and must not jitter and drift as the user pans around and examines the scene with the binoculars. The design of the system is based on using two different cameras with wide field of view and narrow field of view lenses enclosed in a binocular shaped shell. Using the wide field of view gives us context and enables us to recover the 3D location and orientation of the binoculars much more robustly, whereas the narrow field of view is used for the actual augmentation as well as to increase precision in tracking. We present our navigation algorithm that uses the two cameras in combination with an inertial measurement unit and global positioning system in an extended Kalman filter and provides jitter free, robust and real-time pose estimation for precise augmentation. We have demonstrated successful use of our system as part of information sharing example as well as a live simulated training system for observer training, in which fixed and rotary wing aircrafts, ground vehicles, and weapon effects are combined with real world scenes.
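
    The navigation algorithm summarized above fuses camera measurements with an inertial measurement unit and GPS in an extended Kalman filter. The following Python skeleton is a minimal sketch of that generic predict/update cycle under assumed simplifications (a small pose state, caller-supplied motion and measurement models); it is not the authors' filter.

    ```python
    import numpy as np

    class PoseEKF:
        """Generic extended Kalman filter skeleton for camera/IMU/GPS pose fusion."""

        def __init__(self, dim: int = 6):
            self.x = np.zeros(dim)   # pose state, e.g. position + small-angle orientation
            self.P = np.eye(dim)     # state covariance

        def predict(self, f, F: np.ndarray, Q: np.ndarray):
            """Propagate the state with an IMU-driven motion model f and its Jacobian F."""
            self.x = f(self.x)
            self.P = F @ self.P @ F.T + Q

        def update(self, z: np.ndarray, h, H: np.ndarray, R: np.ndarray):
            """Correct the state with a camera or GPS measurement z, model h and Jacobian H."""
            y = z - h(self.x)                      # innovation
            S = H @ self.P @ H.T + R               # innovation covariance
            K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(len(self.x)) - K @ H) @ self.P
    ```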

  9. Augmented Virtual Reality Laboratory

    NASA Technical Reports Server (NTRS)

    Tully-Hanson, Benjamin

    2015-01-01

    Until recently, real-time motion tracking hardware has for the most part been too cost-prohibitive for research to take place regularly. With the release of the Microsoft Kinect in November 2010, researchers now have access to a device that, for a few hundred dollars, is capable of providing red-green-blue (RGB), depth, and skeleton data. It is also capable of tracking multiple people in real time. For its originally intended purpose, gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real-world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.

  10. Augmented Reality Comes to Physics

    ERIC Educational Resources Information Center

    Buesing, Mark; Cook, Michael

    2013-01-01

    Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as…

  11. Augmented Reality Comes to Physics

    NASA Astrophysics Data System (ADS)

    Buesing, Mark; Cook, Michael

    2013-04-01

    Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as Tagwhat and Star Chart (a must for astronomy class). The yellow line marking first downs in a televised football game and the enhanced puck that makes televised hockey easier to follow both use augmented reality to do the job.

  12. Vertical Vergence Calibration for Augmented Reality Displays

    DTIC Science & Technology

    2006-03-01

    Mark A. Livingston and Adam Lederer, Virtual Reality... Keywords: augmented reality, head-mounted display, vergence. ...Many augmented reality (AR...

  13. Augmented Reality Tower Technology Assessment

    NASA Technical Reports Server (NTRS)

    Reisman, Ronald J.; Brown, David M.

    2009-01-01

    Augmented Reality technology may help improve Air Traffic Control Tower efficiency and safety during low-visibility conditions. This paper presents the assessments of five off-duty controllers who 'shadow-controlled' with an augmented reality prototype in their own facility. Initial studies indicated unanimous agreement that this technology is potentially beneficial, though the prototype used in the study was not adequate for operational use. Some controllers agreed that augmented reality technology improved situational awareness, had potential to benefit clearance, control, and coordination tasks and duties and could be very useful for acquiring aircraft and weather information, particularly aircraft location, heading, and identification. The strongest objections to the prototype used in this study were directed at aircraft registration errors, unacceptable optical transparency, insufficient display performance in sunlight, inadequate representation of the static environment and insufficient symbology.

  14. Wireless Augmented Reality Prototype (WARP)

    NASA Technical Reports Server (NTRS)

    Devereaux, A. S.

    1999-01-01

    Initiated in January, 1997, under NASA's Office of Life and Microgravity Sciences and Applications, the Wireless Augmented Reality Prototype (WARP) is a means to leverage recent advances in communications, displays, imaging sensors, biosensors, voice recognition and microelectronics to develop a hands-free, tetherless system capable of real-time personal display and control of computer system resources. Using WARP, an astronaut may efficiently operate and monitor any computer-controllable activity inside or outside the vehicle or station. The WARP concept is a lightweight, unobtrusive heads-up display with a wireless wearable control unit. Connectivity to the external system is achieved through a high-rate radio link from the WARP personal unit to a base station unit installed into any system PC. The radio link has been specially engineered to operate within the high-interference, high-multipath environment of a space shuttle or space station module. Through this virtual terminal, the astronaut will be able to view and manipulate imagery, text or video, using voice commands to control the terminal operations. WARP's hands-free access to computer-based instruction texts, diagrams and checklists replaces juggling manuals and clipboards, and tetherless computer system access allows free motion throughout a cabin while monitoring and operating equipment.

  15. A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education

    NASA Astrophysics Data System (ADS)

    Mejías Borrero, A.; Andújar Márquez, J. M.

    2012-10-01

    Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real world, which can generate specific problems in laboratory classes. Current proposals for virtual labs (VL) and remote labs (RL) neither cover the new needs properly nor contribute remarkable improvement over traditional labs, except that they favor distance training. Therefore, online teaching and learning in lab practices demand a further step beyond current VL and RL. This paper poses a new reality and new teaching/learning concepts in the field of lab practices in engineering. The developed augmented reality-based lab system (augmented remote lab, ARL) enables teachers and students to work remotely (Internet/intranet) in current CL, including virtual elements which interact with real ones. An educational experience was conducted to assess the developed ARL with the participation of a group of 10 teachers and another group of 20 students. Both groups completed lab practices covering content from the subjects Digital Systems and Robotics and Industrial Automation, which belong to the second year of the new degree in Electronic Engineering (adapted to the European Higher Education Area). The labs were carried out by means of three different possibilities: CL, VL and ARL. After completion, both groups were asked to fill in questionnaires aimed at measuring the improvement contributed by ARL relative to CL and VL. Except in some specific questions, the opinion of teachers and students was rather similar and positive regarding the use and possibilities of ARL. Although the results are still preliminary and need further study, they seem to show that ARL remarkably improves on the possibilities of current VL and RL.

  16. Augmented reality building operations tool

    SciTech Connect

    Brackney, Larry J.

    2014-09-09

    A method (700) for providing an augmented reality operations tool to a mobile client (642) positioned in a building (604). The method (700) includes, with a server (660), receiving (720) from the client (642) an augmented reality request for building system equipment (612) managed by an energy management system (EMS) (620). The method (700) includes transmitting (740) a data request for the equipment (612) to the EMS (620) and receiving (750) building management data (634) for the equipment (612). The method (700) includes generating (760) an overlay (656) with an object created based on the building management data (634), which may be sensor data, diagnostic procedures, or the like. The overlay (656) is configured for concurrent display on a display screen (652) of the client (642) with a real-time image of the building equipment (612). The method (700) includes transmitting (770) the overlay (656) to the client (642).
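
    The claimed method is essentially a request/overlay pipeline between the mobile client, the server, and the EMS. The hypothetical Python sketch below illustrates that flow; the names (Overlay, ems.get_data, the sensor fields) are illustrative assumptions, not the patented implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Overlay:
        equipment_id: str
        label: str   # text to draw over the live camera image of the equipment

    def handle_ar_request(equipment_id: str, ems) -> Overlay:
        """Server-side step: fetch EMS data for the equipment and build an overlay object."""
        data = ems.get_data(equipment_id)                     # e.g. sensor readings (assumed API)
        label = f"{data['name']}: {data['temperature']} °C"   # assumed data fields
        return Overlay(equipment_id=equipment_id, label=label)
    ```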

  17. Augmenting Your Own Reality: Student Authoring of Science-Based Augmented Reality Games

    ERIC Educational Resources Information Center

    Klopfer, Eric; Sheldon, Josh

    2010-01-01

    Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent…

  18. Webizing mobile augmented reality content

    NASA Astrophysics Data System (ADS)

    Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun

    2014-01-01

    This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.

  19. Augmented Reality for Maintenance and Repair (ARMAR)

    DTIC Science & Technology

    2007-08-01

    The purpose of this research, Augmented Reality for Maintenance and Repair (ARMAR), was to research the design and development of experimental augmented reality systems for maintenance job aiding. The goal was to explore and evaluate the feasibility of developing prototype adaptive augmented reality systems that can be used to investigate how real-time computer graphics, overlaid on and registered with the actual equipment being maintained, can...

  20. Developing a New Medical Augmented Reality System.

    DTIC Science & Technology

    1996-05-01

    Augmented reality is a technique for combining supplementary imagery such that it appears as part of the scene and can be used for guidance, training...and locational aids. In the medical domain, augmented reality can be used to combine medical imagery to the physician’s view of a patient to help...the physician establish a direct relation between the imagery and the patient. This project report will examine medical augmented reality systems for

  1. Eyekon: Distributed Augmented Reality for Soldier Teams

    DTIC Science & Technology

    2003-06-01

    TOPIC: Information Superiority/Information Operations and Information Age... Abstract: The battlefield is a place of violence ruled by...

  2. Enhancing Education through Mobile Augmented Reality

    ERIC Educational Resources Information Center

    Joan, D. R. Robert

    2015-01-01

    In this article, the author discusses Mobile Augmented Reality and how it can enhance education. The aim of the present study was to give some general information about mobile augmented reality, which helps to boost education. The purpose of the current study reveals the mobile networks that are used on the institution campus as well…

  3. Information Filtering for Mobile Augmented Reality

    DTIC Science & Technology

    2000-10-01

    Augmented reality is a potentially powerful paradigm for annotating the environment with computer-generated material. These benefits will be even...greater when augmented reality systems become mobile and wearable. However, to minimize the problem of clutter and maximize the effectiveness of the

  4. Urban Terrain Modeling for Augmented Reality Applications

    DTIC Science & Technology

    2001-01-01

    Augmented reality (AR) systems have arguably some of the most stringent requirements of any kind of three-dimensional synthetic graphic systems. AR...environment for a mobile augmented reality system. We review, describe and compare the effectiveness of a number of different modeling paradigms

  5. Information Filtering for Mobile Augmented Reality

    DTIC Science & Technology

    2002-07-02

    Augmented Reality (AR) has the potential to revolutionise the way in which information is delivered to a user. By tracking the user's position and...on the problem of developing mobile augmented reality systems which can be worn by an individual user operating in a large, complicated environment

  6. Augmented Reality for Close Quarters Combat

    SciTech Connect

    2013-09-20

    Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  7. Augmented Reality for Close Quarters Combat

    ScienceCinema

    None

    2016-07-12

    Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  8. Augmented reality-assisted skull base surgery.

    PubMed

    Cabrilo, I; Sarrafzadeh, A; Bijlenga, P; Landis, B N; Schaller, K

    2014-12-01

    Neuronavigation is widely considered as a valuable tool during skull base surgery. Advances in neuronavigation technology, with the integration of augmented reality, present advantages over traditional point-based neuronavigation. However, this development has not yet made its way into routine surgical practice, possibly due to a lack of acquaintance with these systems. In this report, we illustrate the usefulness and easy application of augmented reality-based neuronavigation through a case example of a patient with a clivus chordoma. We also demonstrate how augmented reality can help throughout all phases of a skull base procedure, from the verification of neuronavigation accuracy to intraoperative image-guidance.

  9. Augmented Reality in Architecture: Rebuilding Archeological Heritage

    NASA Astrophysics Data System (ADS)

    de la Fuente Prieto, J.; Castaño Perea, E.; Labrador Arroyo, F.

    2017-02-01

    With the development of augmented reality in recent years and the appearance of new mobile terminals and online storage, we find the possibility of using a powerful tool for communicating architecture. This paper analyzes the relationship between Augmented Reality and Architecture. Firstly, it connects the theoretical frameworks of both disciplines through the concept of Representation. Secondly, it describes the milestones and possibilities of Augmented Reality in the particular field of archaeological reconstruction. Lastly, having reviewed the technology developed, we approach the same analysis from a critical point of view, assessing its suitability to the discipline that concerns us: architecture, and archaeology within it.

  10. Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory

    ERIC Educational Resources Information Center

    Andujar, J. M.; Mejias, A.; Marquez, M. A.

    2011-01-01

    Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and are unsuited to completely non-classroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…

  11. The Development of Mobile Augmented Reality

    DTIC Science & Technology

    2012-01-01

    This chapter provides a high-level overview of fifteen years of augmented reality research that was sponsored by the U.S. Office of Naval Research...years by a number of university and industrial research laboratories. It laid the groundwork for the development of many commercial mobile augmented reality (AR) applications that are currently available for smartphones and tablets. Furthermore, it helped shape a number of ongoing research activities in mobile AR.

  12. Location-Based Learning through Augmented Reality

    ERIC Educational Resources Information Center

    Chou, Te-Lien; Chanlin, Lih-Juan

    2014-01-01

    A context-aware and mixed-reality exploring tool can not only effectively provide an information-rich environment to users, but also allow them to quickly utilize useful resources and enhance environmental awareness. This study integrates Augmented Reality (AR) technology into smartphones to create a stimulating learning experience at a university…

  13. ARSC: Augmented Reality Student Card--An Augmented Reality Solution for the Education Field

    ERIC Educational Resources Information Center

    El Sayed, Neven A. M.; Zayed, Hala H.; Sharawy, Mohamed I.

    2011-01-01

    Augmented Reality (AR) is the technology of adding virtual objects to real scenes through enabling the addition of missing information in real life. As the lack of resources is a problem that can be solved through AR, this paper presents and explains the usage of AR technology: we introduce the Augmented Reality Student Card (ARSC) as an application of…

  14. Formwork application optimization by using augmented reality

    NASA Astrophysics Data System (ADS)

    Diaconu, R.; Petruse, R.; Brindasu, P. D.

    2016-11-01

    Using the PLM (Product Lifecycle Management) principle on the formwork case study, after determining the functions and the technical solutions, the application must be made as optimal as possible in order to assure productivity and provide the necessary information as quickly as possible. The concept is to create a comprehensive management system for the formwork based on augmented reality. Given the rate at which information technology is developing, augmented reality is becoming one of the most widely applicable visualization instruments. Already used in the construction domain, augmented reality can also be applied to formwork design and management. The solution will be applied to the construction of the product, its transportation and storage. The usage of this concept will help reduce, or even eliminate, human and technical errors and can offer a precise status of a specific required formwork from stock.

  15. Understanding the Conics through Augmented Reality

    ERIC Educational Resources Information Center

    Salinas, Patricia; Pulido, Ricardo

    2017-01-01

    This paper discusses the production of a digital environment to foster the learning of conics through augmented reality. The name conic refers to curves obtained by the intersection of a plane with a right circular conical surface. The environment gives students the opportunity to interact with the cone and the plane as virtual objects in real…

  16. Intelligent Augmented Reality Training for Motherboard Assembly

    ERIC Educational Resources Information Center

    Westerfield, Giles; Mitrovic, Antonija; Billinghurst, Mark

    2015-01-01

    We investigate the combination of Augmented Reality (AR) with Intelligent Tutoring Systems (ITS) to assist with training for manual assembly tasks. Our approach combines AR graphics with adaptive guidance from the ITS to provide a more effective learning experience. We have developed a modular software framework for intelligent AR training…

  17. The Educational Possibilities of Augmented Reality

    ERIC Educational Resources Information Center

    Cabero, Julio; Barroso, Julio

    2016-01-01

    A large number of emergent technologies have been gaining strong momentum in recent years. One of these emergent technologies is Augmented Reality (AR), which will surely have a high level of penetration into all our educational centers, including universities, in the next 3 to 5 years, as a number of different reports have already highlighted.…

  18. Personalized augmented reality for anatomy education.

    PubMed

    Ma, Meng; Fallavollita, Pascal; Seelbach, Ina; Von Der Heide, Anna Maria; Euler, Ekkehard; Waschke, Jens; Navab, Nassir

    2016-05-01

    Anatomy education is a challenging but vital element in forming future medical professionals. In this work, a personalized and interactive augmented reality system is developed to facilitate education. This system behaves as a "magic mirror" which allows personalized in-situ visualization of anatomy on the user's body. Real-time volume visualization of a CT dataset creates the illusion that the user can look inside their body. The system comprises an RGB-D sensor as a real-time tracking device to detect the user moving in front of a display. In addition, the magic mirror system shows text information, medical images, and 3D models of organs that the user can interact with. Through the participation of 7 clinicians and 72 students, two user studies were designed to respectively assess the precision and acceptability of the magic mirror system for education. The results of the first study demonstrated that the average precision of the augmented reality overlay on the user body was 0.96 cm, while the results of the second study indicate 86.1% approval for the educational value of the magic mirror, and 91.7% approval for the augmented reality capability of displaying organs in three dimensions. The usefulness of this unique type of personalized augmented reality technology has been demonstrated in this paper.

  19. CARE: Creating Augmented Reality in Education

    ERIC Educational Resources Information Center

    Latif, Farzana

    2012-01-01

    This paper explores how Augmented Reality using mobile phones can enhance teaching and learning in education. It specifically examines its application in two cases, where it is identified that the agility of mobile devices and the ability to overlay context specific resources offers opportunities to enhance learning that would not otherwise exist.…

  20. Design Principles for Augmented Reality Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt

    2014-01-01

    Augmented reality is an emerging technology that utilizes mobile, context-aware devices (e.g., smartphones, tablets) that enable participants to interact with digital information embedded within the physical environment. This overview of design principles focuses on specific strategies that instructional designers can use to develop AR learning…

  1. Get Real: Augmented Reality for the Classroom

    ERIC Educational Resources Information Center

    Mitchell, Rebecca; DeBay, Dennis

    2012-01-01

    Kids love augmented reality (AR) simulations because they are like real-life video games. AR simulations allow students to learn content while collaborating face to face and interacting with a multimedia-enhanced version of the world around them. Although the technology may seem advanced, AR software makes it easy to develop content-based…

  2. The Augmented REality Sandtable (ARES)

    DTIC Science & Technology

    2015-10-01

    ...end state is an augmented sand table platform that supports a variety of research, training, and operational needs. In recent years, several high-tech... [table comparing related systems, e.g. Petrasova (2014), GIS-Based Environmental Modeling with Tangible Interaction and Dynamic Visualization; Fig. 6: tank combat game application] ...Based on user requirements, a squad-sized version of ARES (7 × 4 ft) is...

  3. Augmented Reality Mentor for Training Maintenance Procedures: Interim Assessment

    DTIC Science & Technology

    2014-08-01

    ARI Research Note 2014-04. Louise... ...Representative and Subject Matter POC: Dr. William R. Bickley. Abstract: The Augmented Reality Mentor is a 2-yr advanced...

  4. Opportunistic tangible user interfaces for augmented reality.

    PubMed

    Henderson, Steven; Feiner, Steven

    2010-01-01

    Opportunistic Controls are a class of user interaction techniques that we have developed for augmented reality (AR) applications to support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. By leveraging characteristics of these affordances to provide passive haptics that ease gesture input, Opportunistic Controls simplify gesture recognition, and provide tangible feedback to the user. In this approach, 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons can be mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of two user studies. In the first, participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique. In the second, participants proposed and demonstrated user interfaces incorporating Opportunistic Controls for two domains, allowing us to gain additional insights into how user interfaces featuring Opportunistic Controls might be designed.

  5. Spacecraft 3D Augmented Reality Mobile App

    NASA Technical Reports Server (NTRS)

    Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.

    2013-01-01

    The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.

  6. Augmented reality in surgical procedures

    NASA Astrophysics Data System (ADS)

    Samset, E.; Schmalstieg, D.; Vander Sloten, J.; Freudenthal, A.; Declerck, J.; Casciaro, S.; Rideng, Ø.; Gersak, B.

    2008-02-01

    Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stay-time by allowing faster recovery and offers substantially improved cost-effectiveness for the hospital and the society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand co-ordination is not based on direct vision, but more predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision support tools for MIT by augmenting visual and sensorial feedback. We will present tools based on novel concepts in visualization, robotics and haptics providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver-tumors, laparoscopic liver surgery and minimally invasive cardiac surgery will be presented. Demonstrators were developed with the aim to provide a seamless workflow for the clinical user conducting image-guided therapy.

  7. Vision-based augmented reality system

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Wang, Yongtian; Shi, Qi; Yan, Dayuan

    2003-04-01

    The most promising aspect of augmented reality lies in its ability to integrate the virtual world of the computer with the real world of the user, so that users can interact with real-world subjects and objects directly. This paper presents an experimental augmented reality system with a video see-through head-mounted device to display virtual objects as if they were lying on the table together with real objects. In order to overlay virtual objects on the real world at the right position and orientation, accurate calibration and registration are most important. A vision-based method is used to estimate the CCD camera's external parameters by tracking 4 known points with different colors. It achieves sufficient accuracy for non-critical applications such as gaming, annotation and so on.
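
    A calibration of this kind, estimating the camera's external parameters from four known reference points located by their colors, can be sketched with OpenCV's solvePnP as below; the marker layout, pixel coordinates, and intrinsic matrix are assumed values for illustration, not those of the described system.

    ```python
    import numpy as np
    import cv2

    # Known 3D positions of the four colored reference points (metres, assumed layout).
    object_points = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]], dtype=np.float32)
    # Their detected 2D image locations in pixels (e.g. from color segmentation).
    image_points = np.array([[320, 240], [420, 238], [424, 330], [318, 332]], dtype=np.float32)
    # Intrinsic camera matrix (assumed pre-calibrated).
    K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float32)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    # rvec/tvec give the camera's orientation and position, used to register virtual objects.
    ```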

  8. Augmented reality visualization for thoracoscopic spine surgery

    NASA Astrophysics Data System (ADS)

    Sauer, Frank; Vogt, Sebastian; Khamene, Ali; Heining, Sandro; Euler, Ekkehard; Schneberger, Marc; Zuerl, Konrad; Mutschler, Wolf

    2006-03-01

    We are developing an augmented reality (AR) image guidance system in which information derived from medical images is overlaid onto a video view of the patient. The centerpiece of the system is a head-mounted display custom fitted with two miniature color video cameras that capture the stereo view of the scene. Medical graphics is overlaid onto the video view and appears firmly anchored in the scene, without perceivable time lag or jitter. We have been testing the system for different clinical applications. In this paper we discuss minimally invasive thoracoscopic spine surgery as a promising new orthopedic application. In the standard approach, the thoracoscope - a rigid endoscope - provides visual feedback for the minimally invasive procedure of removing a damaged disc and fusing the two neighboring vertebrae. The navigation challenges are twofold. From a global perspective, the correct vertebrae on the spine have to be located with the inserted instruments. From a local perspective, the actual spine procedure has to be performed precisely. Visual feedback from the thoracoscope provides only limited support for both of these tasks. In the augmented reality approach, we give the surgeon additional anatomical context for the navigation. Before the surgery, we derive a model of the patient's anatomy from a CT scan, and during surgery we track the location of the surgical instruments in relation to patient and model. With this information, we can help the surgeon in both the global and local navigation, providing a global map and 3D information beyond the local 2D view of the thoracoscope. Augmented reality visualization is a particularly intuitive method of displaying this information to the surgeon. To adapt our augmented reality system to this application, we had to add an external optical tracking system, which works now in combination with our head-mounted tracking camera. The surgeon's feedback to the initial phantom experiments is very positive.

  9. Data Distribution for Mobile Augmented Reality in Simulation and Training

    DTIC Science & Technology

    2003-01-01

    The Battlefield Augmented Reality System (BARS) is a mobile augmented reality system that displays head-up battlefield intelligence information to a...constraints, we present a robust event-based data distribution mechanism for mobile augmented reality and virtual environments. It is based on replicated...its name, a plan of its interior, icons to represent reported hazard locations, and the names of adjacent streets. The full power of mobile augmented

  10. Telescopic multi-resolution augmented reality

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Frenchi, Christopher; Szu, Harold

    2014-05-01

    To ensure a self-consistent scaling approximation, the underlying microscopic fluctuation components can naturally influence macroscopic means, which may give rise to emergent observable phenomena. In this paper, we describe a consistent macroscopic (cm-scale), mesoscopic (micron-scale), and microscopic (nano-scale) approach to introduce Telescopic Multi-Resolution (TMR) into current Augmented Reality (AR) visualization technology. We propose to couple TMR-AR by introducing an energy-matter interaction engine framework that is based on known physics, biology, and chemistry principles. An immediate payoff of TMR-AR is a self-consistent approximation of the interaction between microscopic observables and their direct effect on the macroscopic system that is driven by real-world measurements. Such an interdisciplinary approach enables us not only to achieve multi-scale, telescopic visualization of real and virtual information but also to conduct thought experiments through AR. As a result of the consistency, this framework allows us to explore a large dimensionality parameter space of measured and unmeasured regions. Towards this direction, we explore how to build learnable libraries of biological, physical, and chemical mechanisms. Fusing analytical sensors with TMR-AR libraries provides a robust framework to optimize testing and evaluation through data-driven or virtual synthetic simulations. Visualizing mechanisms of interactions requires identification of observable image features that can indicate the presence of information in multiple spatial and temporal scales of analog data. The AR methodology was originally developed to enhance pilot training as well as 'make believe' entertainment industries in a user-friendly digital environment. We believe TMR-AR can someday help us conduct thought experiments scientifically, to be pedagogically visualized in zoom-in-and-out, consistent, multi-scale approximations.

  11. Augmented assessment as a means to augmented reality.

    PubMed

    Bergeron, Bryan

    2006-01-01

    Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation, highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object oriented programming.

  12. A telescope with augmented reality functions

    NASA Astrophysics Data System (ADS)

    Hou, Qichao; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian

    2016-10-01

    This study introduces a telescope with virtual reality (VR) and augmented reality (AR) functions. In this telescope, information on the micro-display screen is integrated into the reticle of the telescope through a beam splitter and is then received by the observer. The design and analysis of the telescope optical system with AR and VR capability is accomplished and the opto-mechanical structure is designed. Finally, a proof-of-concept prototype is fabricated and demonstrated. The telescope has an exit pupil diameter of 6 mm at an eye relief of 19 mm, a 6° field of view, 5 to 8 times visual magnification, and a 30° field of view for the virtual image.

  13. Augmented reality for breast tumors visualization.

    PubMed

    Ghaderi, Mohammad Ali; Heydarzadeh, Mehrdad; Nourani, Mehrdad; Gupta, Gopal; Tamil, Lakshman

    2016-08-01

    3D visualization of breast tumors has been shown to be effective by previous studies. In this paper, we introduce a new augmented reality application that can help doctors and surgeons to have a more accurate visualization of breast tumors; this system uses a marker-based image-processing technique to render a 3D model of the tumors on the body. The model can be created by experts using a combination of breast 3D mammography. We have tested the system using an Android smartphone and a head-mounted device. This proof of concept can be useful for oncologists to perform more effective screening, and for surgeons to plan surgery.

  14. Human Performance Assessments when Using Augmented Reality for Navigation

    DTIC Science & Technology

    2006-06-01

    Human performance executing search and rescue type of navigation is one area that can benefit from augmented reality technology when the proper...landmarks. We briefly report on an experiment that demonstrated the benefits of augmented reality in a search and rescue task. Specifically, 120...participants, equally divided by gender, were tested in speed and accuracy using augmented reality in a search and rescue task. Accuracy performance was

  15. Augmented reality and training for airway management procedures.

    PubMed

    Davis, Larry; Ha, Yonggang; Frolich, Seth; Martin, Glenn; Meyer, Catherine; Pettitt, Beth; Norfleet, Jack; Lin, Kuo-Chi; Rolland, Jannick P

    2002-01-01

    Augmented reality is often used for interactive, three-dimensional visualization within the medical community. To this end, we present the integration of an augmented reality system that will be used to train military medics in airway management. The system demonstrates how a head-mounted projective display can be integrated with a desktop PC to create an augmented reality visualization. Furthermore, the system, which uses a lightweight optical tracker, demonstrates the low cost and the portability of the application.

  16. Augmenting your own reality: student authoring of science-based augmented reality games.

    PubMed

    Klopfer, Eric; Sheldon, Josh

    2010-01-01

    Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent games, TimeLab 2100, players role-play citizens of the early 22nd century when global climate change is out of control. Through AR, they see their community as it might be nearly one hundred years in the future. TimeLab and other similar AR games balance location specificity and portability--they are games that are tied to a location and games that are movable from place to place. Focusing students on developing their own AR games provides the best of both virtual and physical worlds: a more portable solution that deeply connects young people to their own surroundings. A series of initiatives has focused on technical and pedagogical solutions to supporting students authoring their own games.

  17. A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education

    ERIC Educational Resources Information Center

    Borrero, A. Mejias; Marquez, J. M. Andujar

    2012-01-01

    Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real…

  18. Augmenting a Child's Reality: Using Educational Tablet Technology

    ERIC Educational Resources Information Center

    Tanner, Patricia; Karas, Carly; Schofield, Damian

    2014-01-01

    This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…

  19. The Local Games Lab ABQ: Homegrown Augmented Reality

    ERIC Educational Resources Information Center

    Holden, Christopher

    2014-01-01

    Experiments in the use of augmented reality games formerly required extensive material resources and expertise to implement above and beyond what might be possible within the usual educational contexts. Currently, the more common availability of hardware in these contexts and the existence of easy-to-use, general purpose augmented reality design…

  20. Potential Use of Augmented Reality in LIS Education

    ERIC Educational Resources Information Center

    Wójcik, Magdalena

    2016-01-01

    The subject of this article is the use of augmented reality technology in library and information science education. The aim is to determine the scope and potential uses of augmented reality in the education of information professionals. In order to determine the scope and forms of potential use of AR technology in LIS education a two-step…

  1. On Location Learning: Authentic Applied Science with Networked Augmented Realities

    ERIC Educational Resources Information Center

    Rosenbaum, Eric; Klopfer, Eric; Perry, Judy

    2007-01-01

    The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is…

  2. Temporal Coherence Strategies for Augmented Reality Labeling.

    PubMed

    Madsen, Jacob Boesen; Tatzqern, Markus; Madsen, Claus B; Schmalstieg, Dieter; Kalkofen, Denis

    2016-04-01

    Temporal coherence of annotations is an important factor in augmented reality user interfaces and for information visualization. In this paper, we empirically evaluate four different techniques for annotation. Based on these findings, we follow up with subjective evaluations in a second experiment. Results show that presenting annotations in object space or image space leads to a significant difference in task performance. Furthermore, there is a significant interaction between rendering space and update frequency of annotations. Participants improve significantly in locating annotations, when annotations are presented in object space, and view management update rate is limited. In a follow-up experiment, participants appear to be more satisfied with limited update rate in comparison to a continuous update rate of the view management system.

  3. Visualizing Sea Level Rise with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Kintisch, E. S.

    2013-12-01

    Looking Glass is an application on the iPhone that visualizes in 3-D future scenarios of sea level rise, overlaid on live camera imagery in situ. Using a technology known as augmented reality, the app allows a layperson user to explore various scenarios of sea level rise using a visual interface. Then the user can see, in an immersive, dynamic way, how those scenarios would affect a real place. The first part of the experience activates users' cognitive, quantitative thinking process, teaching them how global sea level rise, tides and storm surge contribute to flooding; the second allows an emotional response to a striking visual depiction of possible future catastrophe. This project represents a partnership between a science journalist, MIT, and the Rhode Island School of Design, and the talk will touch on lessons this project provides on structuring and executing such multidisciplinary efforts on future design projects.

  4. Image-processing with augmented reality (AR)

    NASA Astrophysics Data System (ADS)

    Babaei, Hossein R.; Mohurutshe, Pagiel L.; Habibi Lashkari, Arash

    2013-03-01

    In this project, the aim is to discuss and articulate the intent to create an image-based Android application. The basis of this study is real-time image detection and processing. It is a new, convenient measure that allows users to gain information on imagery right on the spot. Past studies have revealed attempts to create image-based applications, but these have only gone as far as creating image finders that work with images already stored within some form of database. The Android platform is rapidly spreading around the world and provides by far the most interactive and technical platform for smartphones, which is why it was important to base the study and research on it. Augmented Reality is what allows the user to manipulate the data and add enhanced features (video, GPS tags) to the image taken.

  5. Augmented reality in neurosurgery: a systematic review.

    PubMed

    Meola, Antonio; Cutolo, Fabrizio; Carbone, Marina; Cagnazzo, Federico; Ferrari, Mauro; Ferrari, Vincenzo

    2016-05-07

    Neuronavigation has become an essential neurosurgical tool in pursuing minimal invasiveness and maximal safety, even though it has several technical limitations. Augmented reality (AR) neuronavigation is a significant advance, providing a real-time updated 3D virtual model of anatomical details, overlaid on the real surgical field. Currently, only a few AR systems have been tested in a clinical setting. The aim is to review such devices. We performed a PubMed search of reports restricted to human studies of in vivo applications of AR in any neurosurgical procedure using the search terms "Augmented reality" and "Neurosurgery." Eligibility assessment was performed independently by two reviewers in an unblinded standardized manner. The systems were qualitatively evaluated on the basis of the following: neurosurgical subspecialty of application, pathology of treated lesions and lesion locations, real data source, virtual data source, tracking modality, registration technique, visualization processing, display type, and perception location. Eighteen studies were included during the period 1996 to September 30, 2015. The AR systems were grouped by the real data source: microscope (8), hand- or head-held cameras (4), direct patient view (2), endoscope (1), X-ray fluoroscopy (1), and head-mounted display (1). A total of 195 lesions were treated: 75 (38.46 %) were neoplastic, 77 (39.48 %) neurovascular, and 1 (0.51 %) hydrocephalus, and 42 (21.53 %) were undetermined. Current literature confirms that AR is a reliable and versatile tool when performing minimally invasive approaches in a wide range of neurosurgical diseases, although prospective randomized studies are not yet available and technical improvements are needed.

  6. Augmented Reality as a Countermeasure for Sleep Deprivation.

    PubMed

    Baumeister, James; Dorrlan, Jillian; Banks, Siobhan; Chatburn, Alex; Smith, Ross T; Carskadon, Mary A; Lushington, Kurt; Thomas, Bruce H

    2016-04-01

    Sleep deprivation is known to have serious deleterious effects on executive functioning and job performance. Augmented reality has an ability to place pertinent information at the fore, guiding visual focus and reducing instructional complexity. This paper presents a study to explore how spatial augmented reality instructions impact procedural task performance on sleep deprived users. The user study was conducted to examine performance on a procedural task at six time points over the course of a night of total sleep deprivation. Tasks were provided either by spatial augmented reality-based projections or on an adjacent monitor. The results indicate that participant errors significantly increased with the monitor condition when sleep deprived. The augmented reality condition exhibited a positive influence with participant errors and completion time having no significant increase when sleep deprived. The results of our study show that spatial augmented reality is an effective sleep deprivation countermeasure under laboratory conditions.

  7. Augmented Reality vs Virtual Reality for 3D Object Manipulation.

    PubMed

    Krichenbauer, Max; Yamamoto, Goshiro; Taketomi, Takafumi; Sandor, Christian; Kato, Hirokazu

    2017-01-25

    Virtual Reality (VR) Head-Mounted Displays (HMDs) are on the verge of becoming commodity hardware available to the average user and feasible to use as a tool for 3D work. Some HMDs include front-facing cameras, enabling Augmented Reality (AR) functionality. Apart from avoiding collisions with the environment, interaction with virtual objects may also be affected by seeing the real environment. However, whether these effects are positive or negative has not yet been studied extensively. For most tasks it is unknown whether AR has any advantage over VR. In this work we present the results of a user study in which we compared user performance measured in task completion time on a 9 degrees of freedom object selection and transformation task performed either in AR or VR, both with a 3D input device and a mouse. Our results show faster task completion time in AR over VR. When using a 3D input device, a purely VR environment increased task completion time by 22.5% on average compared to AR (p < 0.024). Surprisingly, a similar effect occurred when using a mouse: users were about 17.3% slower in VR than in AR (p < 0.04). Mouse and 3D input device produced similar task completion times in each condition (AR or VR) respectively. We further found no differences in reported comfort.

  8. Basic Perception in Head-worn Augmented Reality Displays

    DTIC Science & Technology

    2012-01-01

    Head-worn displays have been an integral part of augmented reality since the inception of the field. However, due to numerous difficulties with...functions of the human visual system when using head-worn augmented reality displays. In particular, we look at loss of visual acuity and contrast (and...designing using such unique hardware, the perceptual capabilities of users suffer when looking at either the virtual or real portions of the augmented

  9. Augmented reality for underground pipe inspection and maintenance

    NASA Astrophysics Data System (ADS)

    Lawson, Shaun W.; Pretlove, John R. G.

    1998-12-01

    The University of Surrey is engaged in developing augmented reality systems and teleoperation techniques for enhanced visual analysis and task performance in hostile environments. One particular current project in the UK is addressing the development of stereo vision systems, augmented reality, image processing techniques and specialist robotic vehicles which may be used for the future examination and maintenance of underground sewage pipes. This paper describes the components of the stereo vision and augmented reality system and illustrates some preliminary results of the use of the stereo robotic system mounted on a mobile laboratory vehicle and calibrated using a pin-hole camera model.

  10. SmartG: Spontaneous Malaysian Augmented Reality Tourist Guide

    NASA Astrophysics Data System (ADS)

    Kasinathan, Vinothini; Mustapha, Aida; Subramaniam, Tanabalan

    2016-11-01

    In an effort to attract higher tourist expenditure along with higher tourist arrivals, this paper proposes a travel application called SmartG, an acronym for Spontaneous Malaysian Augmented Reality Tourist Guide, which operates by making recommendations to the user based on the travel objective and individual budget constraints. The application relies on augmented reality technology, whereby a three-dimensional model is presented to the user based on input from the real-world environment. User testing returned favorable feedback on the concept of using augmented reality in promoting Malaysian tourism.

  11. 3D augmented reality with integral imaging display

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-06-01

    In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.

  12. Augmented Reality Marker Hiding with Texture Deformation.

    PubMed

    Kawai, Norihiko; Sato, Tomokazu; Nakashima, Yuta; Yokoya, Naokazu

    2016-10-19

    Augmented reality (AR) marker hiding is a technique to visually remove AR markers in a real-time video stream. A conventional approach transforms a background image with a homography matrix calculated on the basis of a camera pose and overlays the transformed image on an AR marker region in a real-time frame, assuming that the AR marker is on a planar surface. However, this approach may cause discontinuities in textures around the boundary between the marker and its surrounding area when the planar surface assumption is not satisfied. This paper proposes a method for AR marker hiding without discontinuities around texture boundaries even under nonplanar background geometry without measuring it. To do this, our method estimates the dense motion in the marker's background by analyzing the motion of sparse feature points around it, together with a smooth motion assumption, and deforms the background image accordingly. Our experiments demonstrate the effectiveness of the proposed method in various environments with different background geometries and textures.
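
    As an illustration of the conventional planar approach that the abstract contrasts with, the following minimal Python/OpenCV sketch hides a marker by warping a previously captured background view into the current frame with a homography and pasting it over the marker region. Here the homography is estimated from four assumed corner correspondences rather than from a tracked camera pose, and all images and coordinates are synthetic placeholders.

        import cv2
        import numpy as np

        # Stand-ins for a captured background view and the current video frame.
        background = np.full((480, 640, 3), 180, dtype=np.uint8)
        frame = np.full((480, 640, 3), 120, dtype=np.uint8)

        # Hypothetical marker-corner correspondences: background image vs. current frame.
        corners_bg = np.float32([[120, 80], [360, 80], [360, 320], [120, 320]])
        corners_frame = np.float32([[130, 95], [350, 90], [365, 330], [125, 340]])

        # Homography mapping background coordinates into the current frame.
        H, _ = cv2.findHomography(corners_bg, corners_frame)

        # Warp the background into the current view and paste it over the marker region.
        warped = cv2.warpPerspective(background, H, (frame.shape[1], frame.shape[0]))
        mask = np.zeros(frame.shape[:2], dtype=np.uint8)
        cv2.fillConvexPoly(mask, corners_frame.astype(np.int32), 255)
        frame[mask > 0] = warped[mask > 0]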

  13. Directing driver attention with augmented reality cues

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Gavin, Patrick; Lee, John D.; Dawson, Jeffrey D.; Vecera, Shaun; Rizzo, Matthew

    2013-01-01

    This simulator study evaluated the effects of augmented reality (AR) cues designed to direct the attention of experienced drivers to roadside hazards. Twenty-seven healthy middle-aged licensed drivers with a range of attention capacity participated in a 54 mile (1.5 hour) drive in an interactive fixed-base driving simulator. Each participant received AR cues to potential roadside hazards in six simulated straight (9 mile long) rural roadway segments. Drivers were evaluated on response time for detecting a potentially hazardous event, detection accuracy for target (hazard) and non-target objects, and headway with respect to the hazards. Results showed no negative outcomes associated with interference. AR cues did not impair perception of non-target objects, including for drivers with lower attentional capacity. Results showed near significant response time benefits for AR cued hazards. AR cueing increased response rate for detecting pedestrians and warning signs but not vehicles. AR system false alarms and misses did not impair driver responses to potential hazards. PMID:24436635

  14. Videometric head tracker for augmented reality applications

    NASA Astrophysics Data System (ADS)

    Janin, Adam L.; Zikan, Karel; Mizell, David; Banner, Mike; Sowizral, Henry A.

    1995-12-01

    For the past three years, we have been developing augmented reality technology for application to a variety of touch labor tasks in aircraft manufacturing and assembly. The system would be worn by factory workers to provide them with better-quality information for performing their tasks than was previously available. Using a see-through head-mounted display (HMD) whose optics are set at a focal length of about 18 in., the display and its associated head tracking system can be used to superimpose and stabilize graphics on the surface of a work piece. This technology would obviate many expensive marking systems now used in aerospace manufacturing. The most challenging technical issue with respect to factory applications of AR is head position and orientation tracking. It requires high-accuracy, long-range tracking in a high-noise environment. The approach we have chosen uses a head-mounted miniature video camera. The user's wearable computer system utilizes the camera to find fiducial markings that have been placed on known coordinates on or near the work piece. The system then computes the user's position and orientation relative to the fiducial marks. It is referred to as a 'videometric' head tracker. In this paper, we describe the steps we took and the results we obtained in the process of prototyping our videometric head tracker, beginning with analytical and simulation results, and continuing through the working prototypes.
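
    As a rough illustration of the videometric idea, the sketch below recovers a camera (head) pose from known fiducial coordinates and their detected image locations using OpenCV's solvePnP. The fiducial layout, pixel detections, and camera intrinsics are assumed values for the example; the paper's own estimation pipeline is not reproduced here.

        import cv2
        import numpy as np

        # Hypothetical fiducial positions on the work piece (object frame, metres)
        # and their detected pixel locations in the head-mounted camera image.
        object_points = np.float32([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                                    [0.5, 0.3, 0.0], [0.0, 0.3, 0.0]])
        image_points = np.float32([[210, 310], [450, 305], [455, 180], [215, 175]])

        # Assumed pinhole intrinsics (fx, fy, cx, cy) with negligible lens distortion.
        K = np.float32([[800, 0, 320], [0, 800, 240], [0, 0, 1]])
        dist = np.zeros(5)

        # Pose of the work piece in the camera frame; inverting it gives the
        # camera (head) position in work-piece coordinates.
        ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
        R, _ = cv2.Rodrigues(rvec)
        head_position = (-R.T @ tvec).ravel()
        print("estimated head position relative to the work piece:", head_position)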

  15. Evaluating User Experience of Augmented Reality Eyeglasses.

    PubMed

    Gamberini, Luciano; Orso, Valeria; Beretta, Andrea; Jacucci, Giulio; Spagnolli, Anna; Rimondi, Romina

    2015-01-01

    Augmented reality based applications have been experimented with in various contexts. Typically, the interaction is supported by handheld devices, which, in specific scenarios, may hinder the interaction and spoil the experience of use, as the user is forced to hold the device and to keep her eyes on it at all times. The recent launch on the market of light-weight, unobtrusive head-mounted displays may change this circumstance. Nevertheless, investigations are needed to understand if such head-worn devices effectively outperform handheld devices in terms of comfort and pleasant experience of use. Here we present two experiments aimed at assessing the comfort of wearing a head-worn, see-through AR viewer in both a controlled and a natural setting. Besides the comfort of wearing the device, aspects related to the user experience were also investigated in the field evaluation. Our findings suggest that the head-mounted display examined is comfortable to wear regardless of the context of use. Interestingly, in the field trials, participants did not express concern for the impression they would have made on other people and the experience of use was overall pleasant. Possible issues related to visual fatigue emerged.

  16. Virtual reality, augmented reality…I call it i-Reality

    PubMed Central

    2015-01-01

    The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients. PMID:28293571

  17. Virtual reality, augmented reality…I call it i-Reality.

    PubMed

    Grossmann, Rafael J

    2015-01-01

    The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients.

  18. Hands in space: gesture interaction with augmented-reality interfaces.

    PubMed

    Billinghurst, Mark; Piumsomboon, Tham; Huidong Bai

    2014-01-01

    Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.

  19. Graphical user interface concepts for tactical augmented reality

    NASA Astrophysics Data System (ADS)

    Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve

    2010-04-01

    Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program. Our approach to achieve the objectives of ULTRA-Vis, called iLeader, incorporates a full color 40° field of view (FOV) see-thru holographic waveguide integrated with sensors for full position and head tracking to provide an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark-up the 3D battle-space with symbologic identification of graphical control measures, friendly force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational awareness graphical representations are highly intuitive, nondisruptive, and always tactically relevant. We used best human-factors practices, system engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.

  20. Mobile Augmented Reality: Applications and Human Factors Evaluations

    DTIC Science & Technology

    2006-06-01

    increased the difficulty in acquiring and maintaining situation awareness (SA). Augmented reality (AR) has the potential to meet some of these new...intuitive and unambiguous. In the training applications, the virtual OPFOR must appear and behave realistically. We discuss the development of our augmented ... reality system and the human factors testing we have performed. We apply the system to two military needs: situation awareness during operations and training.

  1. Improving Robotic Operator Performance Using Augmented Reality

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles K.; Pace, John W.

    2007-01-01

    The Special Purpose Dexterous Manipulator (SPDM) is a two-armed robot that functions as an extension to the end effector of the Space Station Robotics Manipulator System (SSRMS), currently in use on the International Space Station (ISS). Crew training for the SPDM is accomplished using a robotic hardware simulator, which performs most of SPDM functions under normal static Earth gravitational forces. Both the simulator and SPDM are controlled from a standard robotic workstation using a laptop for the user interface and three monitors for camera views. Most operations anticipated for the SPDM involve the manipulation, insertion, and removal of any of several types of Orbital Replaceable Unit (ORU), modules which control various ISS functions. Alignment tolerances for insertion of the ORU into its receptacle are 0.25 inch and 0.5 degree from nominal values. The pre-insertion alignment task must be performed within these tolerances by using available video camera views of the intrinsic features of the ORU and receptacle, without special registration markings. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit periods of viewing, a successful ORU insertion operation may require an extended period of time. This study explored the feasibility of using augmented reality (AR) to assist SPDM operations. Geometric graphical symbols were overlaid on one of the workstation monitors to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment. Twelve skilled subjects performed eight ORU insertion tasks using the simulator with and without the AR symbols in a repeated measures experimental design. Results indicated that using the AR symbols reduced pre-insertion alignment error for all subjects and reduced the time to complete pre-insertion alignment for most subjects.

  2. Visual Environment for Designing Interactive Learning Scenarios with Augmented Reality

    ERIC Educational Resources Information Center

    Mota, José Miguel; Ruiz-Rube, Iván; Dodero, Juan Manuel; Figueiredo, Mauro

    2016-01-01

    Augmented Reality (AR) technology allows the inclusion of virtual elements on a vision of actual physical environment for the creation of a mixed reality in real time. This kind of technology can be used in educational settings. However, the current AR authoring tools present several drawbacks, such as, the lack of a mechanism for tracking the…

  3. Augmented Reality, the Future of Contextual Mobile Learning

    ERIC Educational Resources Information Center

    Sungkur, Roopesh Kevin; Panchoo, Akshay; Bhoyroo, Nitisha Kirtee

    2016-01-01

    Purpose: This study aims to show the relevance of augmented reality (AR) in mobile learning for the 21st century. With AR, any real-world environment can be augmented by providing users with accurate digital overlays. AR is a promising technology that has the potential to encourage learners to explore learning materials from a totally new…

  4. Augmented reality aiding collimator exchange at the LHC

    NASA Astrophysics Data System (ADS)

    Martínez, Héctor; Fabry, Thomas; Laukkanen, Seppo; Mattila, Jouni; Tabourot, Laurent

    2014-11-01

    Novel Augmented Reality techniques have the potential to have a large positive impact on the way remote maintenance operations are carried out in hazardous areas, e.g. areas where radiation doses that imply careful planning and optimization of maintenance operations are present. This paper describes an Augmented Reality strategy, system and implementation for aiding the remote collimator exchange in the LHC, currently the world's largest and highest-energy particle accelerator. The proposed system relies on marker detection and multi-modal augmentation in real-time. A database system has been used to ensure flexibility. The system has been tested in a mock-up facility, showing real time performance and great potential for future use in the LHC. The technical-scientific difficulties identified during the development of the system and the proposed solutions described in this paper may help the development of future Augmented Reality systems for remote handling in scientific facilities.

  5. Augmented reality for biomedical wellness sensor systems

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Szu, Harold

    2013-05-01

    Due to the commercial movie and gaming industries, Augmented Reality (AR) technology has matured. By definition of AR, both artificial and real humans can be simultaneously present and realistically interact with one another. With the help of physics and physiology, we can build the AR tool together with real human day-night webcam inputs through a simple interaction of heat transfer (getting hot), action and reaction (walking or falling), as well as the physiology (sweating due to activity). Knowing the person's age, weight and 3D coordinates of joints in the body, we deduce the force, the torque, and the energy expenditure during real human movements and apply them to an AR human model. We wish to support the physics-physiology AR version, PP-AR, as a BMW surveillance tool for senior home alone (SHA). The functionality is to record senior walking and hand movements inside a home environment. Besides the fringe benefit of enabling more visits from grandchildren through AR video games, the PP-AR surveillance tool may serve as a means to screen patients in the home for potential falls at points around the house. Moreover, we anticipate PP-AR may help analyze the behavior history of SHA, e.g. enhancing the Smartphone SHA Ubiquitous Care Program, by discovering early symptoms of candidate Alzheimer-like midnight excursions, or Parkinson-like trembling motion when performing challenging muscular joint movements. Using a set of coordinates corresponding to a set of 3D positions representing human joint locations, we compute the Kinetic Energy (KE) generated by each body segment over time. The Work is then calculated, and converted into calories. Using common graphics rendering pipelines, one could invoke AR technology to provide more information about patients to caretakers. Alerts to caretakers can be prompted by a patient's departure from their personal baseline, and the patient's time ordered joint information can be loaded to a graphics viewer allowing for high
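
    A minimal sketch of the kinetic-energy step described above, assuming per-segment masses, a fixed capture rate, and toy joint trajectories (none of which come from the paper): velocities are differenced from tracked 3D joint positions, kinetic energy is summed per frame, and the total is converted to kilocalories as a crude proxy for energy expenditure.

        import numpy as np

        def segment_kinetic_energy(joint_positions, masses, dt):
            """Kinetic energy per frame from tracked 3D joint positions.

            joint_positions: array (frames, joints, 3) of positions in metres.
            masses: assumed mass (kg) attributed to each joint/segment.
            dt: time step between frames in seconds.
            """
            velocities = np.diff(joint_positions, axis=0) / dt      # (frames-1, joints, 3)
            speed_sq = np.sum(velocities ** 2, axis=2)              # |v|^2 per joint
            return 0.5 * np.sum(masses * speed_sq, axis=1)          # sum over joints

        # Toy example: 3 frames, 2 joints, 30 Hz capture, assumed segment masses.
        positions = np.array([[[0.0, 0, 0], [0.0, 1, 0]],
                              [[0.1, 0, 0], [0.1, 1, 0]],
                              [[0.3, 0, 0], [0.3, 1, 0]]])
        ke = segment_kinetic_energy(positions, masses=np.array([5.0, 3.0]), dt=1 / 30)
        print("energy expenditure ~", ke.sum() / 4184, "kcal")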

  6. Potential perils of peri-Pokémon perambulation: the dark reality of augmented reality?

    PubMed Central

    Joseph, Bellal; Armstrong, David G.

    2016-01-01

    Recently, the layering of augmented reality information on top of smartphone applications has created unprecedented user engagement and popularity. One augmented reality-based entertainment application, Pokémon Go (Pokémon Company, Tokyo, Japan) has become the most rapidly downloaded in history. This technology holds tremendous promise to promote ambulatory activity. However, there exists the obvious potential for distraction-related morbidity. We report two cases, presenting simultaneously to our trauma center, with injuries sustained secondary to gameplay with this augmented reality-based application. PMID:27713831

  7. Potential perils of peri-Pokémon perambulation: the dark reality of augmented reality?

    PubMed

    Joseph, Bellal; Armstrong, David G

    2016-10-01

    Recently, the layering of augmented reality information on top of smartphone applications has created unprecedented user engagement and popularity. One augmented reality-based entertainment application, Pokémon Go (Pokémon Company, Tokyo, Japan) has become the most rapidly downloaded in history. This technology holds tremendous promise to promote ambulatory activity. However, there exists the obvious potential for distraction-related morbidity. We report two cases, presenting simultaneously to our trauma center, with injuries sustained secondary to gameplay with this augmented reality-based application.

  8. Gunner Goggles: Implementing Augmented Reality into Medical Education.

    PubMed

    Wang, Leo L; Wu, Hao-Hua; Bilici, Nadir; Tenney-Soeiro, Rebecca

    2016-01-01

    There is evidence that both smartphone and tablet integration into medical education has been lacking. At the same time, there is a niche for augmented reality (AR) to improve this process through the enhancement of textbook learning. Gunner Goggles is an attempt to enhance textbook learning in shelf exam preparatory review with augmented reality. Here we describe our initial prototype and detail the process by which augmented reality was implemented into our textbook through Layar. We describe the unique functionalities of our textbook pages upon augmented reality implementation, which includes links, videos and 3D figures, and surveyed 24 third year medical students for their impression of the technology. Upon demonstrating an initial prototype textbook chapter, 100% (24/24) of students felt that augmented reality improved the quality of our textbook chapter as a learning tool. Of these students, 92% (22/24) agreed that their shelf exam review was inadequate and 19/24 (79%) felt that a completed Gunner Goggles product would have been a viable alternative to their shelf exam review. Thus, while students report interest in the integration of AR into medical education test prep, future investigation into how the use of AR can improve performance on exams is warranted.

  9. Towards Robot teaching based on Virtual and Augmented Reality Concepts

    NASA Astrophysics Data System (ADS)

    Ennakr, Said; Domingues, Christophe; Benchikh, Laredj; Otmane, Samir; Mallem, Malik

    2009-03-01

    A complex system is a system made up of a great number of entities in local and simultaneous interaction. Its design requires the collaboration of engineers from various complementary specialties, which makes it necessary to invent new design methods. Indeed, industry currently loses much time between the moment a product model is designed and the moment it is serially produced on factory lines. This production is generally ensured by automated and, more often, robotized means. A lead time is thus needed to develop the automation and robot operations for a new product model. In this context we launched a study based on the principle of mechatronic design in Augmented Reality-Virtual Reality. This new approach will bring solutions to problems encountered in many application areas, as well as to problems caused by the distance separating vehicle design offices from their production sites, and will minimize the discrepancies between the design model and the real prototype.

  10. Soldier-worn augmented reality system for tactical icon visualization

    NASA Astrophysics Data System (ADS)

    Roberts, David; Menozzi, Alberico; Clipp, Brian; Russler, Patrick; Cook, James; Karl, Robert; Wenger, Eric; Church, William; Mauger, Jennifer; Volpe, Chris; Argenta, Chris; Wille, Mark; Snarski, Stephen; Sherrill, Todd; Lupo, Jasper; Hobson, Ross; Frahm, Jan-Michael; Heinly, Jared

    2012-06-01

    This paper describes the development and demonstration of a soldier-worn augmented reality system testbed that provides intuitive 'heads-up' visualization of tactically-relevant geo-registered icons. Our system combines a robust soldier pose estimation capability with a helmet mounted see-through display to accurately overlay geo-registered iconography (i.e., navigation waypoints, blue forces, aircraft) on the soldier's view of reality. Applied Research Associates (ARA), in partnership with BAE Systems and the University of North Carolina - Chapel Hill (UNC-CH), has developed this testbed system in Phase 2 of the DARPA ULTRA-Vis (Urban Leader Tactical, Response, Awareness, and Visualization) program. The ULTRA-Vis testbed system functions in unprepared outdoor environments and is robust to numerous magnetic disturbances. We achieve accurate and robust pose estimation through fusion of inertial, magnetic, GPS, and computer vision data acquired from helmet kit sensors. Icons are rendered on a high-brightness, 40°×30° field of view see-through display. The system incorporates an information management engine to convert CoT (Cursor-on-Target) external data feeds into mil-standard icons for visualization. The user interface provides intuitive information display to support soldier navigation and situational awareness of mission-critical tactical information.

  11. Augmented reality three-dimensional display with light field fusion.

    PubMed

    Xie, Songlin; Wang, Peng; Sang, Xinzhu; Li, Chengyu

    2016-05-30

    A video see-through augmented reality three-dimensional display method is presented. The system that is used for dense viewpoint augmented reality presentation fuses the light fields of the real scene and the virtual model naturally. Inherently benefiting from the rich information of the light field, depth sense and occlusion can be handled without a priori depth information of the real scene. A series of processes are proposed to optimize the augmented reality performance. Experimental results show that the reconstructed fused 3D light field on the autostereoscopic display is well presented. The virtual model is naturally integrated into the real scene with consistency between binocular parallax and monocular depth cues.
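
    The occlusion handling described above relies on depth derived from the light field. As a simplified stand-in (not the authors' light-field pipeline), the sketch below shows the underlying idea as a per-pixel depth test between a real view and a rendered virtual view, assuming depth maps are available for both.

        import numpy as np

        def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
            """Keep virtual pixels only where the virtual surface is closer to
            the camera than the real scene, so real objects occlude correctly."""
            virt_in_front = virt_depth < real_depth
            out = real_rgb.copy()
            out[virt_in_front] = virt_rgb[virt_in_front]
            return out

        # Toy 2x2 example: the virtual object is in front only in the top row.
        real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)
        virt_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)
        real_depth = np.array([[2.0, 2.0], [0.5, 0.5]])
        virt_depth = np.ones((2, 2))
        print(composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth)[..., 0])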

  12. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks

    ERIC Educational Resources Information Center

    Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco

    2015-01-01

    The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…

  13. Frames of Reference in Mobile Augmented Reality Displays

    ERIC Educational Resources Information Center

    Mou, Weimin; Biocca, Frank; Owen, Charles B.; Tang, Arthur; Xiao, Fan; Lim, Lynette

    2004-01-01

    In 3 experiments, the authors investigated spatial updating in augmented reality environments. Participants learned locations of virtual objects on the physical floor. They were turned to appropriate facing directions while blindfolded before making pointing judgments (e.g., "Imagine you are facing X. Point to Y"). Experiments manipulated the…

  14. Learning Physics through Play in an Augmented Reality Environment

    ERIC Educational Resources Information Center

    Enyedy, Noel; Danish, Joshua A.; Delacruz, Girlie; Kumar, Melissa

    2012-01-01

    The Learning Physics through Play Project (LPP) engaged 6-8-year old students (n = 43) in a series of scientific investigations of Newtonian force and motion including a series of augmented reality activities. We outline the two design principles behind the LPP curriculum: 1) the use of socio-dramatic, embodied play in the form of participatory…

  15. Improving Situational Awareness on Submarines Using Augmented Reality

    DTIC Science & Technology

    2008-09-01

    Excerpts from the report's table of contents: Provide a Shared Contact Picture in Control; Disadvantages of the Proposed Display System; Disadvantage of an Augmented Reality System; AR Display Devices can be Considered Cumbersome. ...point and the subjects displayed complete cooperation with me. A disadvantage of conducting the research under these conditions was that control

  16. Augmented Reality Games: Using Technology on a Budget

    ERIC Educational Resources Information Center

    Annetta, Leonard; Burton, Erin Peters; Frazier, Wendy; Cheng, Rebecca; Chmiel, Margaret

    2012-01-01

    As smartphones become more ubiquitous among adolescents, there is increasing potential for these as a tool to engage students in science instruction through innovative learning environments such as augmented reality (AR). Aligned with the National Science Education Standards (NRC 1996) and integrating the three dimensions of "A Framework for K-12…

  17. The Design of Immersive English Learning Environment Using Augmented Reality

    ERIC Educational Resources Information Center

    Li, Kuo-Chen; Chen, Cheng-Ting; Cheng, Shein-Yung; Tsai, Chung-Wei

    2016-01-01

    The study uses augmented reality (AR) technology to integrate virtual objects into the real learning environment for language learning. The English AR classroom is constructed using the system prototyping method and evaluated by semi-structured in-depth interviews. According to the flow theory by Csikszenmihalyi in 1975 along with the immersive…

  18. Using Augmented Reality Tools to Enhance Children's Library Services

    ERIC Educational Resources Information Center

    Meredith, Tamara R.

    2015-01-01

    Augmented reality (AR) has been used and documented for a variety of commercial and educational purposes, and the proliferation of mobile devices has increased the average person's access to AR systems and tools. However, little research has been done in the area of using AR to supplement traditional library services, specifically for patrons aged…

  19. Augmented Reality in Education--Cases, Places and Potentials

    ERIC Educational Resources Information Center

    Bower, Matt; Howe, Cathie; McCredie, Nerida; Robinson, Austin; Grover, David

    2014-01-01

    Augmented Reality is poised to profoundly transform Education as we know it. The capacity to overlay rich media onto the real world for viewing through web-enabled devices such as phones and tablet devices means that information can be made available to students at the exact time and place of need. This has the potential to reduce cognitive…

  20. Effect on Academic Procrastination after Introducing Augmented Reality

    ERIC Educational Resources Information Center

    Bendicho, Peña Fabiani; Mora, Carlos Efren; Añorbe-Díaz, Beatriz; Rivero-Rodríguez, Pedro

    2017-01-01

    Students suffer academic procrastination while dealing with frequent deadlines and working under pressure. This causes to delay their coursework and may affect their academic progress, despite feeling worse. Triggering students' motivation, like introducing technologies, helps to reduce procrastination. In this context, Augmented Reality has been…

  1. Augmented Reality and Mobile Learning: The State of the Art

    ERIC Educational Resources Information Center

    FitzGerald, Elizabeth; Ferguson, Rebecca; Adams, Anne; Gaved, Mark; Mor, Yishay; Thomas, Rhodri

    2013-01-01

    In this paper, the authors examine the state of the art in augmented reality (AR) for mobile learning. Previous work in the field of mobile learning has included AR as a component of a wider toolkit but little has been done to discuss the phenomenon in detail or to examine in a balanced fashion its potential for learning, identifying both positive…

  2. Current Status, Opportunities and Challenges of Augmented Reality in Education

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Lee, Silvia Wen-Yu; Chang, Hsin-Yi; Liang, Jyh-Chong

    2013-01-01

    Although augmented reality (AR) has gained much research attention in recent years, the term AR was given different meanings by varying researchers. In this article, we first provide an overview of definitions, taxonomies, and technologies of AR. We argue that viewing AR as a concept rather than a type of technology would be more productive for…

  3. Sensors for Location-Based Augmented Reality the Example of Galileo and Egnos

    NASA Astrophysics Data System (ADS)

    Pagani, Alain; Henriques, José; Stricker, Didier

    2016-06-01

    Augmented Reality has long been approached from the point of view of Computer Vision and Image Analysis only. However, many more sensors can be used, in particular for location-based Augmented Reality scenarios. This paper reviews the various sensors that can be used for location-based Augmented Reality. It then presents and discusses several examples of the usage of Galileo and EGNOS in conjunction with Augmented Reality.
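
    One basic step in any such location-based setup is turning a GNSS fix (e.g. from Galileo or EGNOS) into a local offset for placing an annotation relative to the user. The sketch below is a minimal flat-earth approximation with hypothetical coordinates; it is an illustration, not part of the reviewed systems.

        import math

        EARTH_RADIUS = 6378137.0  # metres, WGS-84 equatorial radius

        def geodetic_to_local_en(user_lat, user_lon, poi_lat, poi_lon):
            """Approximate east/north offset (metres) of a point of interest
            relative to the user; adequate for nearby AR annotations."""
            d_lat = math.radians(poi_lat - user_lat)
            d_lon = math.radians(poi_lon - user_lon)
            north = d_lat * EARTH_RADIUS
            east = d_lon * EARTH_RADIUS * math.cos(math.radians(user_lat))
            return east, north

        # Hypothetical fixes: user position and a point of interest ~100 m away.
        east, north = geodetic_to_local_en(49.6116, 6.1319, 49.6122, 6.1332)
        bearing = math.degrees(math.atan2(east, north)) % 360
        print(f"annotation offset: {east:.1f} m east, {north:.1f} m north, bearing {bearing:.0f} deg")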

  4. Augmented reality based real-time subcutaneous vein imaging system

    PubMed Central

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-01-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed. PMID:27446690

  5. Augmented reality based real-time subcutaneous vein imaging system.

    PubMed

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-07-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed.
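
    A minimal sketch of the stereo step described above: matched vein points from the two calibrated cameras are triangulated into 3D with OpenCV. The projection matrices and point matches are assumed placeholder values; the paper's vein segmentation, epipolar matching, and structured-light surface reconstruction are not shown.

        import cv2
        import numpy as np

        # Assumed 3x4 projection matrices of the two calibrated NIR cameras
        # (normalized intrinsics, 10 cm horizontal baseline for the second camera).
        P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

        # Hypothetical matched vein centreline points (one per column: row 0 = x, row 1 = y).
        pts_left = np.array([[0.10, 0.12], [0.05, 0.06]])
        pts_right = np.array([[0.08, 0.10], [0.05, 0.06]])

        # Linear triangulation; the result is in homogeneous coordinates.
        points_4d = cv2.triangulatePoints(P1, P2, pts_left, pts_right)
        points_3d = (points_4d[:3] / points_4d[3]).T
        print(points_3d)   # 3D vein points, ready to project back onto the skin surface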

  6. The Effect of an Augmented Reality Enhanced Mathematics Lesson on Student Achievement and Motivation

    ERIC Educational Resources Information Center

    Estapa, Anne; Nadolny, Larysa

    2015-01-01

    The purpose of the study was to assess student achievement and motivation during a high school augmented reality mathematics activity focused on dimensional analysis. Included in this article is a review of the literature on the use of augmented reality in mathematics and the combination of print with augmented reality, also known as interactive…

  7. Event-Based Data Distribution for Mobile Augmented Reality and Virtual Environments

    DTIC Science & Technology

    2004-04-01

    demonstrated in the Battlefield Augmented Reality System (BARS) situation awareness system, composed of several mobile augmented reality systems, immersive...connectivity and their bandwidth can be highly constrained. This paper presents a robust event-based data distribution mechanism for mobile augmented ... reality and virtual environments. It is based on replicated databases, pluggable networking protocols, and communication channels. The mechanism is

  8. An Event-Based Data Distribution Mechanism for Collaborative Mobile Augmented Reality and Virtual Environments

    DTIC Science & Technology

    2003-01-01

    mechanism in the Battlefield Augmented Reality System (BARS) situation awareness system, which is composed of several mobile augmented reality systems...connectivity and their bandwidth can be highly constrained. In this paper we present a robust event based data distribution mechanism for mobile augmented ... reality and virtual environments. It is based on replicated databases, pluggable networking protocols, and communication channels. We demonstrate the

  9. An Analysis of Engagement in a Combination Indoor/Outdoor Augmented Reality Educational Game

    ERIC Educational Resources Information Center

    Folkestad, James; O'Shea, Patrick

    2011-01-01

    This paper describes the results of a qualitative analysis of video captured during a dual indoor/outdoor Augmented Reality experience. Augmented Reality is the layering of virtual information on top of the physical world. This Augmented Reality experience asked students to interact with the San Diego Museum of Art and the Botanical Gardens in San…

  10. Use of augmented reality in aircraft maintenance operations

    NASA Astrophysics Data System (ADS)

    De Marchi, L.; Ceruti, A.; Testoni, N.; Marzani, A.; Liverani, A.

    2014-03-01

    This paper illustrates a Human-Machine Interface based on Augmented Reality (AR) conceived to provide maintenance operators with the results of an impact detection methodology. In particular, the implemented tool dynamically interacts with a head-portable visualization device, allowing the inspector to see the estimated impact position on the structure. The impact detection methodology combines the signals collected by a network of piezosensors bonded on the structure to be monitored. Then a signal processing algorithm is applied to compensate the acquired guided waves for dispersion. The compensated waveforms yield a robust estimation of the guided waves' difference in distance of propagation (DDOP), used to feed hyperbolic algorithms for impact location determination. The output of the impact methodology is passed to an AR visualization technology that is meant to support the inspector during the on-field inspection/diagnosis as well as the maintenance operations. The inspector, in fact, can see interactively in real time the impact data directly on the surface of the structure. Here the proposed approach is tested on the engine cowling of a Cessna 150 general aviation airplane. Preliminary results confirm the feasibility of the method and its exploitability in maintenance practice.
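
    To make the localization step concrete, here is a minimal sketch of hyperbolic positioning from difference-in-distance-of-propagation (DDOP) measurements, solved as a nonlinear least-squares problem with SciPy. Sensor positions and DDOP values are hypothetical, and the dispersion-compensation stage is not shown.

        import numpy as np
        from scipy.optimize import least_squares

        # Assumed piezo-sensor positions on the panel (metres, panel coordinates).
        sensors = np.array([[0.0, 0.0], [0.6, 0.0], [0.6, 0.4], [0.0, 0.4]])

        def residuals(p, sensors, ddop):
            """Hyperbolic residuals: predicted DDOP of sensors 1..n with respect
            to sensor 0, minus the measured DDOP."""
            d = np.linalg.norm(sensors - p, axis=1)
            return (d[1:] - d[0]) - ddop

        # Hypothetical measured DDOP values (metres) for sensors 1..3 vs sensor 0,
        # e.g. obtained from compensated time delays multiplied by group velocity.
        ddop_measured = np.array([0.12, 0.05, -0.07])

        sol = least_squares(residuals, x0=[0.3, 0.2], args=(sensors, ddop_measured))
        print("estimated impact position (m):", sol.x)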

  11. Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges

    NASA Astrophysics Data System (ADS)

    Cherukuru, N. W.; Calhoun, R.

    2016-06-01

    Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find possible applications in aviation safety and atmospheric data visualization, as well as in weather education and public outreach. As a proof of concept, we used the lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.

  12. Fourier holographic display for augmented reality using holographic optical element

    NASA Astrophysics Data System (ADS)

    Li, Gang; Lee, Dukho; Jeong, Youngmo; Lee, Byoungho

    2016-03-01

    A method for realizing three-dimensional see-through augmented reality in a Fourier holographic display is proposed. A holographic optical element (HOE) with the function of a Fourier lens is adopted in the system. The Fourier hologram configuration causes the real scene located behind the lens to be distorted. In the proposed method, since the HOE is transparent and functions as a lens only under the Bragg-matched condition, there is no distortion when people observe the real scene through the lens HOE (LHOE). Furthermore, two optical characteristics of the recording material are measured to confirm the feasibility of using the LHOE in the proposed see-through augmented reality holographic display. The results are verified experimentally.
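
    In a Fourier holographic display the hologram plane and the image plane are related by a Fourier transform, so a hologram pattern can be derived from an inverse FFT of the desired image field. The sketch below illustrates that relationship numerically with a synthetic target and a simple phase-only encoding; it is an illustration of the general configuration, not the authors' optical design or the LHOE recording process.

        import numpy as np

        # Synthetic target amplitude to appear in the focal plane of the lens HOE;
        # a random phase spreads the energy across the hologram plane.
        target = np.zeros((256, 256))
        target[96:160, 96:160] = 1.0
        field = target * np.exp(1j * 2 * np.pi * np.random.rand(*target.shape))

        # Hologram field: inverse Fourier transform of the desired image field.
        hologram = np.fft.ifft2(np.fft.ifftshift(field))
        phase_hologram = np.angle(hologram)          # phase-only pattern, e.g. for an SLM

        # Numerical reconstruction: Fourier transform of the phase-only hologram.
        recon = np.abs(np.fft.fftshift(np.fft.fft2(np.exp(1j * phase_hologram)))) ** 2
        print("hologram size:", phase_hologram.shape, "reconstruction peak:", recon.max())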

  13. Spatial augmented reality based high accuracy human face projection

    NASA Astrophysics Data System (ADS)

    Li, Dong; Xie, Jinghui; Li, Yufeng; Weng, Dongdong; Liu, Yue

    2015-08-01

    This paper discusses the imaging principles and the technical difficulties of spatial augmented reality-based human face projection. A novel geometry correction method is proposed to realize fast, high-accuracy face model projection. Using a depth camera to reconstruct the projected object, the relative position from the rendered model to the projector can be obtained and the initial projection image is generated. Then the projection image is warped using Bezier interpolation to guarantee that the projected texture matches the object surface. The proposed method follows a simple process flow and can achieve accurate perceptual registration of virtual and real objects. In addition, this method performs well even when the reconstructed model is not exactly the same as the rendered virtual model, which extends its application area in spatial augmented reality-based human face projection.
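
    The Bezier warping step can be pictured as evaluating a Bezier surface patch over the projector image: a grid of control points encodes how the regular projector grid must bend so that the projected texture lands on the face correctly. The sketch below evaluates such a patch with Bernstein polynomials over an assumed 4x4 control grid; the grid values are hypothetical, and the depth-camera registration that would produce them is not shown.

        import numpy as np
        from math import comb

        def bezier_patch(control_points, u, v):
            """Evaluate a Bezier surface patch at (u, v) in [0, 1]^2.

            control_points: array (n+1, m+1, 2) of 2-D control points describing
            the warped projector grid.
            """
            n, m = control_points.shape[0] - 1, control_points.shape[1] - 1
            bu = np.array([comb(n, i) * u**i * (1 - u)**(n - i) for i in range(n + 1)])
            bv = np.array([comb(m, j) * v**j * (1 - v)**(m - j) for j in range(m + 1)])
            return np.einsum("i,j,ijk->k", bu, bv, control_points)

        # Hypothetical 4x4 control grid: a regular 640x480 grid, gently bent inside.
        ctrl = np.stack(np.meshgrid(np.linspace(0, 640, 4), np.linspace(0, 480, 4),
                                    indexing="ij"), axis=-1).astype(float)
        ctrl[1:3, 1:3, 0] += 15.0   # push the interior control points sideways

        # Where the centre of the rendered image should be drawn after correction.
        print(bezier_patch(ctrl, 0.5, 0.5))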

  14. ChemPreview: an augmented reality-based molecular interface.

    PubMed

    Zheng, Min; Waller, Mark P

    2017-05-01

    Human-computer interfaces make computational science more comprehensible and impactful. Complex 3D structures such as proteins or DNA are magnified by digital representations and displayed on two-dimensional monitors. Augmented reality has recently opened another door to access the virtual three-dimensional world. Herein, we present an augmented reality application called ChemPreview with the potential to manipulate bio-molecular structures at an atomistic level. ChemPreview is available at https://github.com/wallerlab/chem-preview/releases, and is built on top of the Meta 1 platform https://www.metavision.com/. ChemPreview can be used to interact with a protein in an intuitive way using natural hand gestures, thereby making it appealing to computational chemists or structural biologists. The ability to manipulate atoms in the real world could eventually provide new and more efficient ways of extracting structural knowledge, or designing new molecules in silico.

  15. Use of display technologies for augmented reality enhancement

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2016-06-01

    Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be in the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book or taped to a machine to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications that range from simple programmable projections in the work space to 3D displays and head-mounted gear. This paper will review where some of these tools are today and some of the pros and cons being considered for the future worker environment.

  16. Augmented reality assisted surgery: a urologic training tool

    PubMed Central

    Dickey, Ryan M; Srikishen, Neel; Lipshultz, Larry I; Spiess, Philippe E; Carrion, Rafael E; Hakky, Tariq S

    2016-01-01

    Augmented reality is widely used in aeronautics and is a developing concept within surgery. In this pilot study, we developed an application for use on Google Glass® optical head-mounted display to train urology residents in how to place an inflatable penile prosthesis. We use the phrase Augmented Reality Assisted Surgery to describe this novel application of augmented reality in the setting of surgery. The application demonstrates the steps of the surgical procedure of inflatable penile prosthesis placement. It also contains software that allows for detection of interest points using a camera feed from the optical head-mounted display to enable faculty to interact with residents during placement of the penile prosthesis. Urology trainees and faculty who volunteered to take part in the study were given time to experience the technology in the operative or perioperative setting and asked to complete a feedback survey. From 30 total participants using a 10-point scale, educational usefulness was rated 8.6, ease of navigation was rated 7.6, likelihood to use was rated 7.4, and distraction in operating room was rated 4.9. When stratified between trainees and faculty, trainees found the technology more educationally useful, and less distracting. Overall, 81% of the participants want this technology in their residency program, and 93% see this technology in the operating room in the future. Further development of this technology is warranted before full release, and further studies are necessary to better characterize the effectiveness of Augmented Reality Assisted Surgery in urologic surgical training. PMID:26620455

  17. Augmented Reality at the Tactical and Operational Levels of War

    DTIC Science & Technology

    2015-10-24

    Augmented reality technologies have reached a stage where commercially viable systems will soon enter the consumer market . These technologies merge...specific applications. Soon, low-cost, adaptive AR systems will enter the consumer market place. As commercially viable AR systems become available...many market analysts predict that AR systems will become regular features of most people’s daily lives and work environments. Just as the regular use

  18. Augmented reality assisted surgery: a urologic training tool.

    PubMed

    Dickey, Ryan M; Srikishen, Neel; Lipshultz, Larry I; Spiess, Philippe E; Carrion, Rafael E; Hakky, Tariq S

    2016-01-01

    Augmented reality is widely used in aeronautics and is a developing concept within surgery. In this pilot study, we developed an application for use on Google Glass ® optical head-mounted display to train urology residents in how to place an inflatable penile prosthesis. We use the phrase Augmented Reality Assisted Surgery to describe this novel application of augmented reality in the setting of surgery. The application demonstrates the steps of the surgical procedure of inflatable penile prosthesis placement. It also contains software that allows for detection of interest points using a camera feed from the optical head-mounted display to enable faculty to interact with residents during placement of the penile prosthesis. Urology trainees and faculty who volunteered to take part in the study were given time to experience the technology in the operative or perioperative setting and asked to complete a feedback survey. From 30 total participants using a 10-point scale, educational usefulness was rated 8.6, ease of navigation was rated 7.6, likelihood to use was rated 7.4, and distraction in operating room was rated 4.9. When stratified between trainees and faculty, trainees found the technology more educationally useful, and less distracting. Overall, 81% of the participants want this technology in their residency program, and 93% see this technology in the operating room in the future. Further development of this technology is warranted before full release, and further studies are necessary to better characterize the effectiveness of Augmented Reality Assisted Surgery in urologic surgical training.

  19. Quantification of Visual Capabilities Using Augmented Reality Displays

    DTIC Science & Technology

    2006-10-01

    completed this task. No subject reported any color-blindness. The set of color samples were again presented in a randomly permuted order. As these sets of...Teleoperators and Virtual Environments, 14(5):550–562, October 2005. [7] Joseph L. Gabbard, J. Edward Swan II, Deborah Hix, Robert S. Schulman, John Lucas, and...Catherine A. Zanbaka, J. Edward Swan II, and Harvey S. Smallman. Objective measures for the effectiveness of augmented reality. In IEEE Virtual Re

  20. In vitro cardiac catheter navigation via augmented reality surgical guidance

    NASA Astrophysics Data System (ADS)

    Linte, Cristian A.; Moore, John; Wiles, Andrew; Lo, Jennifer; Wedlake, Chris; Peters, Terry M.

    2009-02-01

    Catheter-driven cardiac interventions have emerged in response to the need to reduce the invasiveness associated with traditional cut-and-sew techniques. Catheter manipulation is traditionally performed under real-time fluoroscopy imaging, resulting in an overall trade-off of procedure invasiveness for radiation exposure of both the patient and clinical staff. Our approach to reducing and potentially eliminating the use of fluoroscopy in the operating room entails the use of multi-modality imaging and magnetic tracking technologies, wrapped together into an augmented reality environment for enhanced intra-procedure visualization and guidance. Here we performed an in vitro study in which a catheter was guided to specific targets located on the endocardial atrial surface of a beating heart phantom. "Therapy delivery" was modeled in the context of a blinded procedure, mimicking a beating-heart, intracardiac intervention. The users navigated the tip of a magnetically tracked Freezor 5 CRYOCATH catheter to the specified targets. Procedure accuracy was determined as the distance between the tracked catheter tip and the tracked surgical target at the time of contact, and it was assessed under three different guidance modalities: endoscopic, augmented reality, and ultrasound image guidance. The overall RMS targeting accuracy achieved under augmented reality guidance averaged 1.1 mm. This guidance modality shows significant improvements in both procedure accuracy and duration over ultrasound image guidance alone, while maintaining an overall targeting accuracy comparable to that achieved under endoscopic guidance.

  1. Flexible augmented reality architecture applied to environmental management

    NASA Astrophysics Data System (ADS)

    Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo

    2003-05-01

    Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacities of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D Model of the location and the multimedia geo-referenced database.

  2. Contextualized Interdisciplinary Learning in Mainstream Schools Using Augmented Reality-Based Technology: A Dream or Reality?

    ERIC Educational Resources Information Center

    Ong, Alex

    2010-01-01

    The use of augmented reality (AR) tools, where virtual objects such as tables and graphs can be displayed and be interacted with in real scenes created from imaging devices, in mainstream school curriculum is uncommon, as they are potentially costly and sometimes bulky. Thus, such learning tools are mainly applied in tertiary institutions, such as…

  3. Augmented Reality Environments in Learning, Communicational and Professional Contexts in Higher Education

    ERIC Educational Resources Information Center

    Martín Gutiérrez, Jorge; Meneses Fernández, María Dolores

    2014-01-01

    This paper explores educational and professional uses of augmented learning environments concerned with issues of training and entertainment. We analyze the state-of-the-art research of some scenarios based on augmented reality. Some examples for the purpose of education and simulation are described. These applications show that augmented reality can…

  4. FlyAR: augmented reality supported micro aerial vehicle navigation.

    PubMed

    Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard

    2014-04-01

    Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context automatic flight path planning and autonomous flying is often applied but so far cannot fully replace the human in the loop, supervising the flight on-site to assure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle’s position between 2D map positions and the physical environment, and the complicated depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user’s view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints supporting the user in understanding the spatial relationship of virtual waypoints in the physical world and investigate the effect of these visualization techniques on the spatial understanding.

  5. Live texturing of augmented reality characters from colored drawings.

    PubMed

    Magnenat, Stéphane; Ngo, Dat Tien; Zünd, Fabio; Ryffel, Mattia; Noris, Gioacchino; Rothlin, Gerhard; Marra, Alessia; Nitti, Maurizio; Fua, Pascal; Gross, Markus; Sumner, Robert W

    2015-11-01

    Coloring books capture the imagination of children and provide them with one of their earliest opportunities for creative expression. However, given the proliferation and popularity of digital devices, real-world activities like coloring can seem unexciting, and children become less engaged in them. Augmented reality holds unique potential to impact this situation by providing a bridge between real-world activities and digital enhancements. In this paper, we present an augmented reality coloring book App in which children color characters in a printed coloring book and inspect their work using a mobile device. The drawing is detected and tracked, and the video stream is augmented with an animated 3-D version of the character that is textured according to the child's coloring. This is possible thanks to several novel technical contributions. We present a texturing process that applies the captured texture from a 2-D colored drawing to both the visible and occluded regions of a 3-D character in real time. We develop a deformable surface tracking method designed for colored drawings that uses a new outlier rejection algorithm for real-time tracking and surface deformation recovery. We present a content creation pipeline to efficiently create the 2-D and 3-D content. And, finally, we validate our work with two user studies that examine the quality of our texturing algorithm and the overall App experience.

  6. Fast Markerless Tracking for Augmented Reality in Planar Environment

    NASA Astrophysics Data System (ADS)

    Basori, Ahmad Hoirul; Afif, Fadhil Noer; Almazyad, Abdulaziz S.; AbuJabal, Hamza Ali S.; Rehman, Amjad; Alkawaz, Mohammed Hazim

    2015-12-01

    Markerless tracking for augmented reality should not only be accurate but also fast enough to provide seamless synchronization between real and virtual elements. Currently reported methods show that vision-based tracking is accurate but requires high computational power. This paper proposes a real-time hybrid method for tracking unknown environments in markerless augmented reality. The proposed method combines a vision-based approach with accelerometer and gyroscope sensors as a camera pose predictor. To align the augmentation relative to camera motion, the tracking method substitutes feature-based camera estimation with inertial sensors fused by a complementary filter, providing a more dynamic response. The proposed method tracked unknown environments with faster processing time than available feature-based approaches. Moreover, it can sustain its estimation in situations where feature-based tracking loses its track. The sensor-based tracking performed the task at about 22.97 FPS, up to five times faster than the feature-based tracking method used for comparison. Therefore, the proposed method can be used to track unknown environments without depending on the number of features in the scene, while requiring lower computational cost.
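
    The sensor fusion described above can be illustrated with a minimal single-axis complementary-filter sketch; the filter gain, sample rate, and sensor conventions below are assumptions and do not come from the paper.

    # Minimal single-axis complementary filter sketch of the fusion idea described
    # above; alpha, sample rate, and sensor readings are assumed example values.
    import math

    def complementary_filter(gyro_rate, accel_x, accel_z, prev_angle, dt, alpha=0.98):
        """Fuse a gyro rate (rad/s) with an accelerometer tilt estimate (rad)."""
        gyro_angle = prev_angle + gyro_rate * dt        # integrate angular rate (drifts over time)
        accel_angle = math.atan2(accel_x, accel_z)      # gravity-based tilt (noisy but drift-free)
        return alpha * gyro_angle + (1.0 - alpha) * accel_angle

    # Example: 100 Hz updates with a slow rotation and a fixed gravity reading.
    angle = 0.0
    for _ in range(100):
        angle = complementary_filter(gyro_rate=0.1, accel_x=0.17, accel_z=0.98,
                                     prev_angle=angle, dt=0.01)
    print(round(angle, 3))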

  7. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  8. Simulation and augmented reality in endovascular neurosurgery: lessons from aviation.

    PubMed

    Mitha, Alim P; Almekhlafi, Mohammed A; Janjua, Major Jameel J; Albuquerque, Felipe C; McDougall, Cameron G

    2013-01-01

    Endovascular neurosurgery is a discipline strongly dependent on imaging. Therefore, technology that improves how much useful information we can garner from a single image has the potential to dramatically assist decision making during endovascular procedures. Furthermore, education in an image-enhanced environment, especially with the incorporation of simulation, can improve the safety of the procedures and give interventionalists and trainees the opportunity to study or perform simulated procedures before the intervention, much like what is practiced in the field of aviation. Here, we examine the use of simulators in the training of fighter pilots and discuss how similar benefits can compensate for current deficiencies in endovascular training. We describe the types of simulation used for endovascular procedures, including virtual reality, and discuss the relevant data on its utility in training. Finally, the benefit of augmented reality during endovascular procedures is discussed, along with future computerized image enhancement techniques.

  9. Application of Virtual, Augmented, and Mixed Reality to Urology

    PubMed Central

    2016-01-01

    Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected. PMID:27706017

  10. Application of Virtual, Augmented, and Mixed Reality to Urology.

    PubMed

    Hamacher, Alaric; Kim, Su Jin; Cho, Sung Tae; Pardeshi, Sunil; Lee, Seung Hyun; Eun, Sung-Jong; Whangbo, Taeg Keun

    2016-09-01

    Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected.

  11. Beyond laser safety glasses: augmented reality in optics laboratories.

    PubMed

    Quercioli, Franco

    2017-02-01

    Blocking visibility of a laser beam after a pair of safety goggles have been worn is always an unpleasant experience. Working blindly is hard, sometimes even dangerous, and safety could be again at risk. A safe, clear view of the laser beam path would be highly desirable. This paper presents a technique for laboratory laser safety, using a smartphone's camera and display, in conjunction with an augmented reality headset to allow clear viewing of laser experiments without any risk of laser eye injury. Use of the technique is demonstrated, and strengths and weaknesses of the solution are discussed.

  12. Development of microprojection system of mixed and augmented reality

    NASA Astrophysics Data System (ADS)

    Rudakova, M. S.

    2016-08-01

    The paper deals with the development and design of a microprojection system for mixed and augmented reality, built so that an observer can simultaneously see the information on the microdisplay and the surrounding space as the background. A combiner based on planar waveguide screens with a composite structure of prism elements was developed. Simulation results obtained with the TracePro software are presented, and the main problems encountered in the development of such systems are discussed.

  13. An augmented reality simulator for ultrasound guided needle placement training.

    PubMed

    Magee, D; Zhu, Y; Ratnalingam, R; Gardner, P; Kessel, D

    2007-10-01

    Details are presented of a low-cost augmented-reality system for the simulation of ultrasound-guided needle insertion procedures (tissue biopsy, abscess drainage, nephrostomy, etc.) for interventional radiology education and training. The system comprises physical elements (a mannequin, a mock ultrasound probe and a needle) and software elements (generating virtual ultrasound anatomy and allowing data collection). These two sets of elements are linked by a pair of magnetic 3D position sensors. Virtual anatomic images are generated based on anatomic data derived from full-body CT scans of live humans. Details of the novel aspects of this system are presented, including image generation, registration and calibration.

  14. A Survey of Mobile and Wireless Technologies for Augmented Reality Systems (Preprint)

    DTIC Science & Technology

    2008-02-01

    A survey of mobile and wireless technologies for augmented reality systems. George Papagiannakis*, Gurminder Singh**, Nadia...performance of the current task, enabling an "Augmented Reality" (AR), Caudell et al [2]. Although AR was meant to include mobility, it was not until..."The Columbia Touring Machine" by Feiner et al [3] that the first outdoor Mobile Augmented Reality System (MARS) was created. Around the same time

  15. Effectiveness of Occluded Object Representations at Displaying Ordinal Depth Information in Augmented Reality

    DTIC Science & Technology

    2013-03-01

    Effectiveness of Occluded Object Representations at Displaying Ordinal Depth Information in Augmented Reality. Mark A. Livingston, Naval Research Laboratory...effectively impossible with all icon styles, whereas in the case of partial overlap, the Ground Plane had a clear advantage. Keywords: Augmented reality, human

  16. Transforming Fleet Network Operations with Collaborative Decision Support and Augmented Reality Technologies

    DTIC Science & Technology

    2004-03-01

    Transforming Fleet Network Operations with Collaborative Decision Support and Augmented Reality Technologies, by John J. Fay, March 2004. Thesis Advisor: Alex...management for distributed sea-based forces using existing technologies. Combining a collaborative tool, Decision Support System (DSS), and Augmented Reality (AR

  17. Towards Determination of Visual Requirements for Augmented Reality Displays and Virtual Environments for the Airport Tower

    DTIC Science & Technology

    2006-06-01

    The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect... augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14°, 28°...fields of view much greater than 47° are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.

  18. Human Factors Issues in the Use of Virtual and Augmented Reality for Military Purposes - USA

    DTIC Science & Technology

    2005-12-01

    Human Factors Issues in the Use of Virtual and Augmented Reality for Military Purposes – USA. Stephen Golberg, US Army Research Institute.

  19. Attention and Trust Biases in the Design of Augmented Reality Displays

    DTIC Science & Technology

    2000-04-01

    Attention and Trust Biases in the Design of Augmented Reality Displays. Michelle Yeh and Christopher D. Wickens, Technical Report ARL-00-3/FED-LAB-00-1...efficient presentation of data through a more "invisible" interface using techniques of augmented reality, in which supplementary information

  20. Architecture and Key Techniques of Augmented Reality Maintenance Guiding System for Civil Aircrafts

    NASA Astrophysics Data System (ADS)

    Hong, Zhou; Wenhua, Lu

    2017-01-01

    Augmented reality technology is introduced into the maintenance field to strengthen the information available in real-world scenarios by integrating virtual maintenance-assistance information with the real scene. This can lower the difficulty of maintenance, reduce maintenance errors, and improve the maintenance efficiency and quality of civil aviation crews. An architecture for an augmented reality virtual maintenance guiding system is proposed, based on the definition of augmented reality and an analysis of the characteristics of augmented reality virtual maintenance. Key techniques involved, such as standardization and organization of maintenance data, 3D registration, modeling of maintenance guidance information, and virtual maintenance man-machine interaction, are elaborated, and solutions are given.

  1. The status of augmented reality in laparoscopic surgery as of 2016.

    PubMed

    Bernhardt, Sylvain; Nicolau, Stéphane A; Soler, Luc; Doignon, Christophe

    2017-04-01

    This article establishes a comprehensive review of all the different methods proposed by the literature concerning augmented reality in intra-abdominal minimally invasive surgery (also known as laparoscopic surgery). A solid background of surgical augmented reality is first provided in order to support the survey. Then, the various methods of laparoscopic augmented reality as well as their key tasks are categorized in order to better grasp the current landscape of the field. Finally, the various issues gathered from these reviewed approaches are organized in order to outline the remaining challenges of augmented reality in laparoscopic surgery.

  2. archAR: an archaeological augmented reality experience

    NASA Astrophysics Data System (ADS)

    Wiley, Bridgette; Schulze, Jürgen P.

    2015-03-01

    We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.

  3. On Location Learning: Authentic Applied Science with Networked Augmented Realities

    NASA Astrophysics Data System (ADS)

    Rosenbaum, Eric; Klopfer, Eric; Perry, Judy

    2007-02-01

    The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is played across a university campus where players take on the roles of doctors, medical technicians, and public health experts to contain a disease outbreak. Players can interact with virtual characters and employ virtual diagnostic tests and medicines. They are challenged to identify the source and prevent the spread of an infectious disease that can spread among real and/or virtual characters according to an underlying model. In this paper, we report on data from three high school classes who played the game. We investigate students' perception of the authenticity of the game in terms of their personal embodiment in the game, their experience playing different roles, and their understanding of the dynamic model underlying the game.

  4. Utilization of the Space Vision System as an Augmented Reality System For Mission Operations

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles

    2003-01-01

    Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to

  5. Learning Anatomy via Mobile Augmented Reality: Effects on Achievement and Cognitive Load

    ERIC Educational Resources Information Center

    Küçük, Sevda; Kapakin, Samet; Göktas, Yüksel

    2016-01-01

    Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allow users to interact with the…

  6. A Mobile Service Oriented Multiple Object Tracking Augmented Reality Architecture for Education and Learning Experiences

    ERIC Educational Resources Information Center

    Rattanarungrot, Sasithorn; White, Martin; Newbury, Paul

    2014-01-01

    This paper describes the design of our service-oriented architecture to support mobile multiple object tracking augmented reality applications applied to education and learning scenarios. The architecture is composed of a mobile multiple object tracking augmented reality client, a web service framework, and dynamic content providers. Tracking of…

  7. Mobile Augmented Reality in Supporting Peer Assessment: An Implementation in a Fundamental Design Course

    ERIC Educational Resources Information Center

    Lan, Chung-Hsien; Chao, Stefan; Kinshuk; Chao, Kuo-Hung

    2013-01-01

    This study presents a conceptual framework for supporting mobile peer assessment by incorporating augmented reality technology to eliminate limitation of reviewing and assessing. According to the characteristics of mobile technology and augmented reality, students' work can be shown in various ways by considering the locations and situations. This…

  8. Augmented Reality Trends in Education: A Systematic Review of Research and Applications

    ERIC Educational Resources Information Center

    Bacca, Jorge; Baldiris, Silvia; Fabregat, Ramon; Graf, Sabine; Kinshuk

    2014-01-01

    In recent years, there has been an increasing interest in applying Augmented Reality (AR) to create unique educational settings. So far, however, there is a lack of review studies with focus on investigating factors such as: the uses, advantages, limitations, effectiveness, challenges and features of augmented reality in educational settings.…

  9. Social Augmented Reality: Enhancing Context-Dependent Communication and Informal Learning at Work

    ERIC Educational Resources Information Center

    Pejoska, Jana; Bauters, Merja; Purma, Jukka; Leinonen, Teemu

    2016-01-01

    Our design proposal of social augmented reality (SoAR) grows from the observed difficulties of practical applications of augmented reality (AR) in workplace learning. In our research we investigated construction workers doing physical work in the field and analyzed the data using qualitative methods in various workshops. The challenges related to…

  10. Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders

    PubMed Central

    Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe

    2015-01-01

    Augmented Reality is a new technological system that introduces virtual content into the real world so that both share the same representation, enhancing the user's sensory perception of reality in real time. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that establish Augmented Reality as a new technique useful for psychology. PMID:26339283

  11. Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders.

    PubMed

    Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe

    2015-01-01

    Augmented Reality is a new technological system that introduces virtual content into the real world so that both share the same representation, enhancing the user's sensory perception of reality in real time. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that establish Augmented Reality as a new technique useful for psychology.

  12. Computer vision and augmented reality in gastrointestinal endoscopy

    PubMed Central

    Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M.

    2015-01-01

    Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy—which relies on the integration of high-definition video data with pathologic correlates—requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175

  13. Pose Estimation for Augmented Reality: A Hands-On Survey.

    PubMed

    Marchand, Eric; Uchiyama, Hideaki; Spindler, Fabien

    2016-12-01

    Augmented reality (AR) allows virtual objects to be seamlessly inserted into an image sequence. In order to accomplish this goal, it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. The solution to this problem can be framed as a pose estimation or, equivalently, a camera localization process. This paper aims to present a brief but almost self-contained introduction to the most important approaches dedicated to vision-based camera localization, along with a survey of several extensions proposed in recent years. For most of the presented approaches, we also provide links to code for short examples. This should allow readers to easily bridge the gap between theoretical aspects and practical implementations.
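
    As an illustration of the kind of vision-based pose estimation such a survey covers, the following minimal sketch recovers a camera pose from four known 2D-3D correspondences with OpenCV's solvePnP; the point coordinates and intrinsics are made-up example values, not data from the paper.

    # A minimal sketch of pose estimation from known 2D-3D correspondences.
    # All numeric values are assumed example values.
    import numpy as np
    import cv2

    # Four corners of a 10 cm square marker in its own coordinate frame (metres).
    object_points = np.array([[0, 0, 0], [0.1, 0, 0],
                              [0.1, 0.1, 0], [0, 0.1, 0]], dtype=np.float32)
    # Where those corners were detected in the image (pixels).
    image_points = np.array([[320, 240], [420, 238],
                             [424, 342], [318, 344]], dtype=np.float32)
    # Pinhole intrinsics (fx, fy, cx, cy) with no lens distortion assumed.
    K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float32)
    dist = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
    if ok:
        R, _ = cv2.Rodrigues(rvec)   # rotation matrix relating the marker frame to the camera
        print("camera translation (m):", tvec.ravel())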

  14. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobiles still takes a long time and provides lower accuracy than inspection by human individuals using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient, goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s with better decisions regarding the evaluated samples than what can be achieved by untrained users.
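
    One common similarity measure for comparing image patches, normalized cross-correlation, can be sketched as follows; this only illustrates the general idea behind patch matching, and the file names and decision threshold are assumptions rather than values from the paper.

    # Minimal sketch: compare a captured hologram patch against a reference patch
    # with normalized cross-correlation. File names and threshold are assumptions.
    import cv2

    reference = cv2.imread("reference_patch.png", cv2.IMREAD_GRAYSCALE)
    captured = cv2.imread("captured_patch.png", cv2.IMREAD_GRAYSCALE)
    captured = cv2.resize(captured, (reference.shape[1], reference.shape[0]))

    score = cv2.matchTemplate(captured, reference, cv2.TM_CCOEFF_NORMED)[0, 0]
    print("similarity:", score)
    print("match" if score > 0.7 else "no match")   # threshold chosen arbitrarily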

  15. Real-time tomographic holography for augmented reality

    PubMed Central

    Galeotti, John M.; Siegel, Mel; Stetten, George

    2011-01-01

    The concept and instantiation of Real-Time Tomographic Holography (RTTH) for augmented reality is presented. RTTH enables natural hand-eye coordination to guide invasive medical procedures without requiring tracking or a head-mounted device. It places a real-time virtual image of an object's cross-section into its actual location, without noticeable viewpoint dependence (e.g. parallax error). The virtual image is viewed through a flat narrow-band Holographic Optical Element with optical power that generates an in-situ virtual image (within 1 m of the HOE) from a small SLM display without obscuring a direct view of the physical world. Rigidly fixed upon a medical ultrasound probe, an RTTH device could show the scan in its actual location inside the patient, even as the probe was moved relative to the patient. PMID:20634827

  16. Pose tracking for augmented reality applications in outdoor archaeological sites

    NASA Astrophysics Data System (ADS)

    Younes, Georges; Asmar, Daniel; Elhajj, Imad; Al-Harithy, Howayda

    2017-01-01

    In recent years, agencies around the world have invested huge amounts of effort toward digitizing many aspects of the world's cultural heritage. Of particular importance is the digitization of outdoor archaeological sites. In the spirit of valorization of this digital information, many groups have developed virtual or augmented reality (AR) computer applications themed around a particular archaeological object. The problem of pose tracking in outdoor AR applications is addressed. Different positional systems are analyzed, resulting in the selection of a monocular camera-based user tracker. The limitations that challenge this technique from map generation, scale, anchoring, to lighting conditions are analyzed and systematically addressed. Finally, as a case study, our pose tracking system is implemented within an AR experience in the Byblos Roman theater in Lebanon.

  17. Computer vision and augmented reality in gastrointestinal endoscopy.

    PubMed

    Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M

    2015-08-01

    Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy-which relies on the integration of high-definition video data with pathologic correlates-requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy.

  18. Simulating low-cost cameras for augmented reality compositing.

    PubMed

    Klein, Georg; Murray, David W

    2010-01-01

    Video see-through Augmented Reality adds computer graphics to the real world in real time by overlaying graphics onto a live video feed. To achieve a realistic integration of the virtual and real imagery, the rendered images should have a similar appearance and quality to those produced by the video camera. This paper describes a compositing method which models the artifacts produced by a small low-cost camera, and adds these effects to an ideal pinhole image produced by conventional rendering methods. We attempt to model and simulate each step of the imaging process, including distortions, chromatic aberrations, blur, Bayer masking, noise, sharpening, and color-space compression, all while requiring only an RGBA image and an estimate of camera velocity as inputs.
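
    A minimal sketch of the general idea, degrading an ideal rendered frame with blur and sensor noise, is shown below; the paper's full pipeline (distortion, chromatic aberration, Bayer masking, sharpening, colour-space compression) is not reproduced, and the parameter values are assumptions.

    # Minimal sketch: add a few camera-like artifacts to an ideal rendered image.
    # Blur and noise parameters are assumed example values.
    import numpy as np
    import cv2

    def degrade(render_rgb, blur_sigma=1.2, noise_sigma=6.0):
        """render_rgb: uint8 HxWx3 image produced by an ideal pinhole renderer."""
        img = cv2.GaussianBlur(render_rgb, (0, 0), blur_sigma)      # optical / defocus blur
        noise = np.random.normal(0.0, noise_sigma, img.shape)       # additive sensor noise
        img = np.clip(img.astype(np.float32) + noise, 0, 255)
        return img.astype(np.uint8)

    ideal = np.full((120, 160, 3), 200, dtype=np.uint8)   # stand-in for a rendered frame
    cv2.imwrite("degraded.png", degrade(ideal))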

  19. An augmented reality assistance platform for eye laser surgery.

    PubMed

    Ee Ping Ong; Lee, Jimmy Addison; Jun Cheng; Beng Hai Lee; Guozhen Xu; Laude, Augustinus; Teoh, Stephen; Tock Han Lim; Wong, Damon W K; Jiang Liu

    2015-08-01

    This paper presents a novel augmented reality assistance platform for eye laser surgery. The aims of the proposed system are to assist eye doctors in pre-planning and to provide guidance and protection during laser surgery. We developed algorithms to automatically register multi-modal images, detect macula and optic disc regions, and demarcate these as areas protected from laser surgery. The doctor will then be able to plan the laser treatment pre-surgery using the registered images and segmented regions. Thereafter, during live surgery, the system will automatically register and track the slit lamp video frames on the registered retina images, send appropriate warnings when the laser is near protected areas, and disable the laser function when it points into the protected areas. The proposed system prototype can help doctors to speed up laser surgery with confidence, without fearing that they may unintentionally fire the laser into the protected areas.

  20. Augmented Reality Imaging System: 3D Viewing of a Breast Cancer

    PubMed Central

    Douglas, David B.; Boone, John M.; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Objective To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. Methods A case of breast cancer imaged using contrast-enhanced breast CT (Computed Tomography) was viewed with the augmented reality imaging, which uses a head display unit (HDU) and joystick control interface. Results The augmented reality system demonstrated 3D viewing of the breast mass with head position tracking, stereoscopic depth perception, focal point convergence and the use of a 3D cursor and joy-stick enabled fly through with visualization of the spiculations extending from the breast cancer. Conclusion The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be further researched to determine the utility in clinical practice. PMID:27774517

  1. Augmented Reality Implementation in Watch Catalog as e-Marketing Based on Mobile Aplication

    NASA Astrophysics Data System (ADS)

    Adrianto, D.; Luwinda, F. A.; Yesmaya, V.

    2017-01-01

    Augmented Reality is one of the important methods for providing users with a better interactive user interface. In this research, Augmented Reality in a mobile application is applied to provide users with useful information related to a watch catalogue. This research focuses on the design and implementation of an application using Augmented Reality. The process model used in this research is Extreme Programming, which has several steps: planning, design, coding, and testing. The result of this research is an Augmented Reality application based on Android. The research concludes that implementing Android-based Augmented Reality in the watch catalogue helps customers collect useful information related to a specific watch.

  2. Training for planning tumour resection: augmented reality and human factors.

    PubMed

    Abhari, Kamyar; Baxter, John S H; Chen, Elvis C S; Khan, Ali R; Peters, Terry M; de Ribaupierre, Sandrine; Eagleson, Roy

    2015-06-01

    Planning surgical interventions is a complex task, demanding a high degree of perceptual, cognitive, and sensorimotor skills to reduce intra- and post-operative complications. This process requires spatial reasoning to coordinate between the preoperatively acquired medical images and patient reference frames. In the case of neurosurgical interventions, traditional approaches to planning tend to focus on providing a means for visualizing medical images, but rarely support transformation between different spatial reference frames. Thus, surgeons often rely on their previous experience and intuition as their sole guide to perform mental transformations. In the case of junior residents, this may lead to longer operation times or increased chance of error under additional cognitive demands. In this paper, we introduce a mixed augmented-/virtual-reality system to facilitate training for planning a common neurosurgical procedure, brain tumour resection. The proposed system is designed and evaluated with human factors explicitly in mind, alleviating the difficulty of mental transformation. Our results indicate that, compared to conventional planning environments, the proposed system greatly improves the nonclinicians' performance, independent of the sensorimotor tasks performed. Furthermore, the use of the proposed system by clinicians resulted in a significant reduction in time to perform clinically relevant tasks. These results demonstrate the role of mixed-reality systems in assisting residents to develop necessary spatial reasoning skills needed for planning brain tumour resection, improving patient outcomes.

  3. Invisible waves and hidden realms: augmented reality and experimental art

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia

    2012-03-01

    Augmented reality is a way of both altering the visible and revealing the invisible. It offers new opportunities for artistic exploration through virtual interventions in real space. In this paper, the author describes the implementation of two art installations using different AR technologies, one using optical marker tracking on mobile devices and one integrating stereoscopic projections into the physical environment. The first artwork, De Ondas y Abejas (The Waves and the Bees), is based on the widely publicized (but unproven) hypothesis of a link between cellphone radiation and the phenomenon of bee colony collapse disorder. Using an Android tablet, viewers search out small fiducial markers in the shape of electromagnetic waves hidden throughout the gallery, which reveal swarms of bees scattered on the floor. The piece also creates a generative soundscape based on electromagnetic fields. The second artwork, Urban Fauna, is a series of animations in which features of the urban landscape become plants and animals. Surveillance cameras become flocks of birds while miniature cellphone towers, lampposts, and telephone poles grow like small seedlings in time-lapse animation. The animations are presented as small stereoscopic projections, integrated into the physical space of the gallery. These two pieces explore the relationship between nature and technology through the visualization of invisible forces and hidden alternate realities.

  4. Augmented reality simulator for training in two-dimensional echocardiography.

    PubMed

    Weidenbach, M; Wick, C; Pieper, S; Quast, K J; Fox, T; Grunst, G; Redel, D A

    2000-02-01

    In two-dimensional echocardiography the sonographer must synthesize multiple tomographic slices into a mental three-dimensional (3D) model of the heart. Computer graphics and virtual reality environments are ideal to visualize complex 3D spatial relationships. In augmented reality (AR) applications, real and virtual image data are linked, to increase the information content. In the presented AR simulator a 3D surface model of the human heart is linked with echocardiographic volume data sets. The 3D echocardiographic data sets are registered with the heart model to establish spatial and temporal congruence. The heart model, together with an animated ultrasound sector represents a reference scenario, which displays the currently selected two-dimensional echocardiographic cutting plane calculated from the volume data set. Modifications of the cutting plane within the echocardiographic data are transferred and visualized simultaneously and in real time within the reference scenario. The trainee can interactively explore the 3D heart model and the registered 3D echocardiographic data sets by an animated ultrasound probe, whose position is controlled by an electromagnetic tracking system. The tracking system is attached to a dummy transducer and placed on a plastic puppet to give a realistic impression of a two-dimensional echocardiographic examination.

  5. 3D Tracking Based Augmented Reality for Cultural Heritage Data Management

    NASA Astrophysics Data System (ADS)

    Battini, C.; Landi, G.

    2015-02-01

    The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations; augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation, its conservation and during the restoration processes. The application of digital-imaging solutions for various feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks, allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving and complex three-dimensional models can be interactively visualised and explored on applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that will allow interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.

  6. Applied Operations Research: Augmented Reality in an Industrial Environment

    NASA Technical Reports Server (NTRS)

    Cole, Stuart K.

    2015-01-01

    Augmented reality is the application of computer generated data or graphics onto a real world view. Its use provides the operator additional information or heightened situational awareness. While advancements have been made in automation and diagnostics of high value critical equipment (HVCE) to improve readiness, reliability and maintenance, the need for assistance and support to Operations and Maintenance staff persists. AR can improve the human machine interface where computer capabilities maximize the human experience and analysis capabilities. NASA operates multiple facilities with complex ground based HVCE in support of national aeronautics and space exploration, and the need exists to improve operational support and close a gap related to capability sustainment where key and experienced staff consistently rotate work assignments and reach their expiration of term of service. The initiation of an AR capability to augment and improve human abilities and training experience in the industrial environment requires planning and establishment of a goal and objectives for the systems and specific applications. This paper explored the use of AR in support of Operations staff in real-time operation of HVCE and its maintenance. The results include identification of specific goals and objectives, and of challenges related to availability and computer system infrastructure.

  7. Registration using natural features for augmented reality systems.

    PubMed

    Yuan, M L; Ong, S K; Nee, A Y C

    2006-01-01

    Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix is used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration for either indoor or outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method to estimate the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process. Some indoor and outdoor experiments have
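
    The feature-tracking and projective-estimation steps can be illustrated with a minimal sketch that tracks corners between two frames with OpenCV's pyramidal KLT tracker and fits a projective transform; RANSAC-based findHomography stands in here for the paper's normalization and Levenberg-Marquardt refinement, and the file names are assumptions.

    # Minimal sketch: track natural features between frames and fit a projective
    # transform. Frame file names are assumed placeholders.
    import cv2

    prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    pts_prev = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=8)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts_prev, None)

    good_prev = pts_prev[status.ravel() == 1]
    good_curr = pts_curr[status.ravel() == 1]
    if len(good_prev) >= 6:                     # at least six tracked features, as in the paper
        H, inliers = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)
        print("projective transform between frames:\n", H)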

  8. D3D augmented reality imaging system: proof of concept in mammography

    PubMed Central

    Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Purpose The purpose of this article is to present images from simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called “depth 3-dimensional (D3D) augmented reality”. Materials and methods A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data was used and no statistical analysis was performed. Results The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, 3D cursor, and joystick fly-through. Conclusion The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice. PMID:27563261

  9. An Augmented-Reality Edge Enhancement Application for Google Glass

    PubMed Central

    Hwang, Alex D.; Peli, Eli

    2014-01-01

    Purpose Google Glass provides a platform that can be easily extended to include a vision enhancement tool. We have implemented an augmented vision system on Glass, which overlays enhanced edge information over the wearer's real world view, to provide contrast-improved central vision to the Glass wearers. The enhanced central vision can be naturally integrated with scanning. Methods Google Glass's camera lens distortions were corrected using image warping. Since the camera and virtual display are horizontally separated by 16 mm, and the camera aiming and virtual display projection angle are off by 10°, the warped camera image had to go through a series of 3D transformations to minimize parallax errors before the final projection to the Glass' see-through virtual display. All image processes were implemented to achieve near real-time performance. The impacts of the contrast enhancements were measured for three normal vision subjects, with and without a diffuser film to simulate vision loss. Results For all three subjects, significantly improved contrast sensitivity was achieved when the subjects used the edge enhancements with a diffuser film. The performance boost is limited by the Glass camera's performance. The authors assume this accounts for why performance improvements were observed only with the diffuser filter condition (simulating low vision). Conclusions Improvements were measured with simulated visual impairments. With the benefit of see-through augmented reality edge enhancement, a natural visual scanning process is possible, suggesting that the device may provide better visual function in a cosmetically and ergonomically attractive format for patients with macular degeneration. PMID:24978871
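
    The edge-overlay idea can be sketched minimally as follows; the lens-distortion correction and parallax-minimizing 3D transformations described above are omitted, and the edge thresholds and file names are assumptions.

    # Minimal sketch: extract strong edges from a camera frame and overlay them.
    # Canny thresholds and the input file name are assumed example values.
    import cv2

    frame = cv2.imread("camera_frame.png")                      # stand-in for a live frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)                            # binary edge map

    overlay = frame.copy()
    overlay[edges > 0] = (255, 255, 255)                        # draw edges as bright contours
    cv2.imwrite("edge_enhanced.png", overlay)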

  10. Augmented Reality-Based Navigation System for Wrist Arthroscopy: Feasibility

    PubMed Central

    Zemirline, Ahmed; Agnus, Vincent; Soler, Luc; Mathoulin, Christophe L.; Liverneaux, Philippe A.; Obdeijn, Miryam

    2013-01-01

    Purpose In video surgery, and more specifically in arthroscopy, one of the major problems is positioning the camera and instruments within the anatomic environment. The concept of computer-guided video surgery has already been used in ear, nose, and throat (ENT), gynecology, and even in hip arthroscopy. These systems, however, rely on optical or mechanical sensors, which turn out to be restricting and cumbersome. The aim of our study was to develop and evaluate the accuracy of a navigation system based on electromagnetic sensors in video surgery. Methods We used an electromagnetic localization device (Aurora, Northern Digital Inc., Ontario, Canada) to track the movements in space of both the camera and the instruments. We have developed a dedicated application in the Python language, using the VTK library for the graphic display and the OpenCV library for camera calibration. Results A prototype has been designed and evaluated for wrist arthroscopy. It allows display of the theoretical position of instruments onto the arthroscopic view with useful accuracy. Discussion The augmented reality view represents valuable assistance when surgeons want to position the arthroscope or locate their instruments. It makes the maneuver more intuitive, increases comfort, saves time, and enhances concentration. PMID:24436832

  11. Concealed target detection using augmented reality with SIRE radar

    NASA Astrophysics Data System (ADS)

    Saponaro, Philip; Kambhamettu, Chandra; Ranney, Kenneth; Sullivan, Anders

    2013-05-01

    The Synchronous Impulse Reconstruction (SIRE) forward-looking radar, developed by the U.S. Army Research Laboratory (ARL), can detect concealed targets using ultra-wideband synthetic aperture technology. The SIRE radar has been mounted on a Ford Expedition and combined with other sensors, including a pan/tilt/zoom camera, to test its capabilities of concealed target detection in a realistic environment. Augmented Reality (AR) can be used to combine the SIRE radar image with the live camera stream into one view, which provides the user with information that is quicker to assess and easier to understand than each separated. In this paper we present an AR system which utilizes a global positioning system (GPS) and inertial measurement unit (IMU) to overlay a SIRE radar image onto a live video stream. We describe a method for transforming 3D world points in the UTM coordinate system onto the video stream by calibrating for the intrinsic parameters of the camera. This calibration is performed offline to save computation time and achieve real time performance. Since the intrinsic parameters are affected by the zoom of the camera, we calibrate at eleven different zooms and interpolate. We show the results of a real time transformation of the SAR imagery onto the video stream. Finally, we quantify both the 2D error and 3D residue associated with our transformation and show that the amount of error is reasonable for our application.
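
    Once the camera pose and intrinsics are known, projecting world points onto the video frame can be sketched as below; the pose is taken as given here (in the paper it comes from GPS/IMU data and the offline, zoom-dependent intrinsic calibration), and all numeric values are assumptions.

    # Minimal sketch: project known 3D world points into a camera frame and mark
    # them on the image. Pose, intrinsics, and point coordinates are assumed values.
    import numpy as np
    import cv2

    # Radar detections in a local world frame (metres), e.g. offsets from the vehicle.
    world_points = np.array([[5.0, 0.0, 30.0],
                             [-3.0, 0.5, 42.0]], dtype=np.float32)
    K = np.array([[900, 0, 640], [0, 900, 360], [0, 0, 1]], dtype=np.float32)
    dist = np.zeros(5)
    rvec = np.zeros(3)                    # camera orientation (identity for this sketch)
    tvec = np.zeros(3)                    # camera position in the world frame

    pixels, _ = cv2.projectPoints(world_points, rvec, tvec, K, dist)
    frame = np.zeros((720, 1280, 3), dtype=np.uint8)   # stand-in for a video frame
    for (u, v) in pixels.reshape(-1, 2):
        cv2.circle(frame, (int(u), int(v)), 8, (0, 0, 255), 2)   # mark detections
    cv2.imwrite("overlay.png", frame)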

  12. Realistic real-time outdoor rendering in augmented reality.

    PubMed

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a unique technique to achieve realistic real-time outdoor rendering, while taking into account the interaction between sky colours and objects in AR systems with respect to shadows in any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique has significantly improved the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems.

  13. An augmented reality binocular system (ARBS) for air traffic controllers

    NASA Astrophysics Data System (ADS)

    Fulbrook, Jim E.; Ruffner, John W.; Labbe, Roger

    2008-04-01

    The primary means by which air traffic tower controllers obtain information is through direct out-the-window viewing, although a considerable amount of time is spent looking at electronic displays and other information sources inside the tower cab. The Air Force Research Laboratory sponsored the development of a prototype Augmented Reality Binocular System (ARBS) that enhances tower controller performance, situation awareness, and safety. The ARBS is composed of a virtual binocular (VB) that displays real-time imagery from high resolution telephoto cameras and sensors mounted on pan/tilt units (PTUs). The selected PTU tracks to the movement of the VB, which has an inertial heading and elevation sensor. Relevant airfield situation text and graphic depictions that identify airfield features are overlaid on the imagery. In addition, the display is capable of labeling and tracking vehicles on which an Automatic Dependent Surveillance - Broadcast (ADS-B) system has been installed. The ARBS provides air traffic controllers and airfield security forces with the capability to orient toward, observe, and conduct continuous airfield operations and surveillance/security missions from any number of viewing aspects in limited visibility conditions. In this paper, we describe the ARBS in detail, discuss the results of a Usability Test of the prototype ARBS, and discuss ideas for follow-on efforts to develop the ARBS to a fieldable level.

  14. Augmented reality enabling intelligence exploitation at the edge

    NASA Astrophysics Data System (ADS)

    Kase, Sue E.; Roy, Heather; Bowman, Elizabeth K.; Patton, Debra

    2015-05-01

    Today's Warfighters need to make quick decisions while interacting in densely populated environments comprised of friendly, hostile, and neutral host nation locals. However, there is a gap in the real-time processing of big data streams for edge intelligence. We introduce a big data processing pipeline called ARTEA that ingests, monitors, and performs a variety of analytics including noise reduction, pattern identification, and trend and event detection in the context of an area of operations (AOR). Results of the analytics are presented to the Soldier via an augmented reality (AR) device, Google Glass (Glass). Non-intrusive AR devices such as Glass can visually communicate contextually relevant alerts to the Soldier based on the current mission objectives, time, location, and observed or sensed activities. This real-time processing and AR presentation approach to knowledge discovery flattens the intelligence hierarchy, enabling the edge Soldier to act as a vital and active participant in the analysis process. We report preliminary observations testing ARTEA and Glass in a document exploitation and person of interest scenario simulating edge Soldier participation in the intelligence process in disconnected deployment conditions.

  15. L-split marker for augmented reality in aircraft assembly

    NASA Astrophysics Data System (ADS)

    Han, Pengfei; Zhao, Gang

    2016-04-01

    In order to improve the performance of conventional square markers widely used by marker-based augmented reality systems in aircraft assembly environments, an L-split marker is proposed. Every marker consists of four separate L-shaped parts, and each of them contains partial information about the marker. Geometric features of the L-shape, which are more discriminative than the symmetrical square shape adopted by conventional markers, are used to detect the proposed markers from the camera images effectively. The marker is split into four separate parts in order to improve robustness to occlusion and curvature to some extent. The registration process can be successfully completed as long as three parts are detected (up to about 80% of the area could be occluded). Moreover, when the marker is attached to nonplanar surfaces, its curvature can be roughly analyzed from every part's normal direction, which can be obtained since each part's six corners have been explicitly determined in the preceding detection process. Based on the marker design, new detection and recognition algorithms are proposed and detailed. The experimental results show that the marker and the algorithms are effective.
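
    Since each L-shaped part's six corners are known both in the marker plane and in the image, its pose and surface normal can in principle be recovered with a standard perspective-n-point solve. The sketch below illustrates that idea with OpenCV's generic solvePnP (assuming OpenCV 4); the corner layout and function names are hypothetical, and this is not the authors' detection or recognition algorithm.

        import cv2
        import numpy as np

        # Hypothetical 3D layout (metres, z = 0 in the marker plane) of one L-part's six corners.
        L_PART_CORNERS_3D = np.array([[0.00, 0.00, 0], [0.04, 0.00, 0], [0.04, 0.01, 0],
                                      [0.01, 0.01, 0], [0.01, 0.04, 0], [0.00, 0.04, 0]],
                                     dtype=np.float64)

        def part_normal(image_corners, camera_matrix, dist_coeffs=None):
            """Estimate one L-part's pose from its six detected corners and return its surface normal."""
            ok, rvec, _tvec = cv2.solvePnP(L_PART_CORNERS_3D,
                                           np.asarray(image_corners, dtype=np.float64),
                                           camera_matrix, dist_coeffs)
            if not ok:
                return None
            R, _ = cv2.Rodrigues(rvec)
            return R @ np.array([0.0, 0.0, 1.0])   # marker-plane normal in camera coordinates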

  16. Development of a mobile borehole investigation software using augmented reality

    NASA Astrophysics Data System (ADS)

    Son, J.; Lee, S.; Oh, M.; Yun, D. E.; Kim, S.; Park, H. D.

    2015-12-01

    Augmented reality (AR) is one of the fastest-developing technologies in the smartphone and IT areas. While various applications have been developed using AR, only a few geological applications take advantage of it. In this study, a smartphone application to manage boreholes using AR has been developed. The application consists of three major modules: an AR module, a map module and a data management module. The AR module calculates the orientation of the device and uses it to display nearby boreholes distributed in three dimensions. This module shows the boreholes in a transparent layer on a live camera screen so the user can find them and understand the overall characteristics of the underground geology. The map module displays the boreholes on a 2D map to show their distribution and the location of the user. The data management module uses the SQLite library, which is well suited to mobile platforms, and Binary XML is adopted to allow additional customized data to be stored. The application is able to provide underground information in an intuitive and refined form and to decrease the time and equipment required for geological field investigations.
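
    A minimal sketch of what such an AR module has to compute, assuming a flat-earth approximation is acceptable at field scales: the bearing and distance from the user's GPS fix to a borehole, and the horizontal screen position of that borehole given the device azimuth and camera field of view. All names and the simple linear angle-to-pixel mapping are illustrative, not taken from the application described.

        import math

        EARTH_RADIUS_M = 6371000.0   # mean Earth radius; flat-earth approximation for short ranges

        def bearing_and_distance(user_lat, user_lon, bh_lat, bh_lon):
            """Bearing (deg clockwise from north) and ground distance (m) from the user to a borehole."""
            north = math.radians(bh_lat - user_lat) * EARTH_RADIUS_M
            east = math.radians(bh_lon - user_lon) * EARTH_RADIUS_M * math.cos(math.radians(user_lat))
            return math.degrees(math.atan2(east, north)) % 360.0, math.hypot(north, east)

        def screen_x(bearing, device_azimuth, horizontal_fov_deg, screen_width_px):
            """Horizontal screen position of a borehole, or None if it lies outside the camera's view."""
            offset = (bearing - device_azimuth + 540.0) % 360.0 - 180.0   # signed angle in [-180, 180)
            if abs(offset) > horizontal_fov_deg / 2.0:
                return None
            return screen_width_px / 2.0 * (1.0 + offset / (horizontal_fov_deg / 2.0))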

  17. A spatially augmented reality sketching interface for architectural daylighting design.

    PubMed

    Sheng, Yu; Yapo, Theodore C; Young, Christopher; Cutler, Barbara

    2011-01-01

    We present an application of interactive global illumination and spatially augmented reality to architectural daylight modeling that allows designers to explore alternative designs and new technologies for improving the sustainability of their buildings. Images of a model in the real world, captured by a camera above the scene, are processed to construct a virtual 3D model. To achieve interactive rendering rates, we use a hybrid rendering technique, leveraging radiosity to simulate the interreflectance between diffuse patches and shadow volumes to generate per-pixel direct illumination. The rendered images are then projected on the real model by four calibrated projectors to help users study the daylighting illumination. The virtual heliodon is a physical design environment in which multiple designers, a designer and a client, or a teacher and students can gather to experience animated visualizations of the natural illumination within a proposed design by controlling the time of day, season, and climate. Furthermore, participants may interactively redesign the geometry and materials of the space by manipulating physical design elements and see the updated lighting simulation.
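
    For readers unfamiliar with the radiosity half of such a hybrid renderer, the sketch below shows the basic gather iteration for diffuse interreflection. It assumes precomputed form factors and omits the shadow-volume direct-illumination pass, so it is only a schematic of the general technique, not the system described above.

        import numpy as np

        def solve_radiosity(emission, reflectance, form_factors, n_iters=100):
            """Jacobi gather iteration for per-patch radiosity: B = E + rho * (F @ B)."""
            B = np.array(emission, dtype=float)
            for _ in range(n_iters):
                B = emission + reflectance * (form_factors @ B)
            return B

    Here form_factors[i, j] is the form factor from patch i to patch j and reflectance is the per-patch diffuse albedo; in a pipeline like the one described, direct sunlight computed with shadow volumes would enter through the emission term.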

  18. Realistic Real-Time Outdoor Rendering in Augmented Reality

    PubMed Central

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are limited to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows, through their effects on virtual objects in the AR system, is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480

  19. Intraoperative surgical photoacoustic microscopy (IS-PAM) using augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Changho; Han, Seunghoon; Kim, Sehui; Jeon, Mansik; Kim, Jeehyun; Kim, Chulhong

    2014-03-01

    We have developed an intraoperative surgical photoacoustic microscopy (IS-PAM) system by integrating an optical resolution photoacoustic microscope (OR-PAM) and a conventional surgical microscope. Based on the common optical path in the OR-PAM and microscope system, we can acquire the PAM and microscope images at the same time. Furthermore, by utilizing a mini-sized beam projector, 2D PAM images are back-projected onto the microscope view plane as augmented reality. Thus, both the conventional microscopic and 2D cross-sectional PAM images are displayed on the plane through an eyepiece lens of the microscope. In our method, an additional image display tool is not required to show the PAM image, which potentially offers significant convenience to surgeons, who do not need to shift their gaze during surgery. In order to demonstrate the performance of our IS-PAM system, we first successfully monitored needle intervention in phantoms. Moreover, we successfully guided needle insertion into mouse skin in vivo by visualizing surrounding blood vessels from the PAM images and the magnified skin surfaces from the conventional microscopic images simultaneously.

  20. Augmented Reality Image Guidance in Minimally Invasive Prostatectomy

    NASA Astrophysics Data System (ADS)

    Cohen, Daniel; Mayer, Erik; Chen, Dongbin; Anstee, Ann; Vale, Justin; Yang, Guang-Zhong; Darzi, Ara; Edwards, Philip 'Eddie'

    This paper presents our work aimed at providing augmented reality (AR) guidance of robot-assisted laparoscopic surgery (RALP) using the da Vinci system. There is a good clinical case for guidance due to the significant rate of complications and steep learning curve for this procedure. Patients who were due to undergo robotic prostatectomy for organ-confined prostate cancer underwent preoperative 3T MRI scans of the pelvis. These were segmented and reconstructed to form 3D images of pelvic anatomy. The reconstructed image was successfully overlaid onto screenshots of the recorded surgery post-procedure. Surgeons who perform minimally-invasive prostatectomy took part in a user-needs analysis to determine the potential benefits of an image guidance system after viewing the overlaid images. All surgeons stated that the development would be useful at key stages of the surgery and could help to improve the learning curve of the procedure and improve functional and oncological outcomes. Establishing the clinical need in this way is a vital early step in development of an AR guidance system. We have also identified relevant anatomy from preoperative MRI. Further work will be aimed at automated registration to account for tissue deformation during the procedure, using a combination of transrectal ultrasound and stereoendoscopic video.

  1. Shape recognition and pose estimation for mobile Augmented Reality.

    PubMed

    Hagbi, Nate; Bergig, Oriel; El-Sana, Jihad; Billinghurst, Mark

    2011-10-01

    Nestor is a real-time recognition and camera pose estimation system for planar shapes. The system allows shapes that carry contextual meanings for humans to be used as Augmented Reality (AR) tracking targets. The user can teach the system new shapes in real time. New shapes can be shown to the system frontally, or they can be automatically rectified according to previously learned shapes. Shapes can be automatically assigned virtual content by classification according to a shape class library. Nestor performs shape recognition by analyzing contour structures and generating projective-invariant signatures from their concavities. The concavities are further used to extract features for pose estimation and tracking. Pose refinement is carried out by minimizing the reprojection error between sample points on each image contour and its library counterpart. Sample points are matched by evolving an active contour in real time. Our experiments show that the system provides stable and accurate registration, and runs at interactive frame rates on a Nokia N95 mobile phone.
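
    Pose refinement by minimizing reprojection error can be sketched, in a simplified planar form, as a nonlinear least-squares fit of a homography to matched contour points. The snippet below uses SciPy for the optimization; it ignores Nestor's active-contour matching and camera model details, and all names are illustrative.

        import numpy as np
        from scipy.optimize import least_squares

        def reprojection_residuals(h, model_pts, image_pts):
            """Residuals between matched image samples and library contour points mapped by a homography."""
            H = np.append(h, 1.0).reshape(3, 3)                 # fix H[2, 2] = 1 to remove scale ambiguity
            homog = np.hstack([model_pts, np.ones((len(model_pts), 1))])
            proj = homog @ H.T
            return (proj[:, :2] / proj[:, 2:3] - image_pts).ravel()

        def refine_homography(H0, model_pts, image_pts):
            """Refine an initial homography so library points reproject onto their matched image points."""
            h0 = (H0 / H0[2, 2]).ravel()[:8]
            result = least_squares(reprojection_residuals, h0, args=(model_pts, image_pts))
            return np.append(result.x, 1.0).reshape(3, 3)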

  2. Time-lapse microscopy using smartphone with augmented reality markers.

    PubMed

    Baek, Dongyoub; Cho, Sungmin; Yun, Kyungwon; Youn, Keehong; Bang, Hyunwoo

    2014-04-01

    A prototype system that replaces the conventional time-lapse imaging in microscopic inspection for use with smartphones is presented. Existing time-lapse imaging requires a video data feed between a microscope and a computer that varies depending on the type of image grabber. Even with proper hardware setups, a series of tedious and repetitive tasks is still required to relocate to the region-of-interest (ROI) of the specimens. In order to simplify the system and improve the efficiency of time-lapse imaging tasks, a smartphone-based platform utilizing microscopic augmented reality (μ-AR) markers is proposed. To evaluate the feasibility and efficiency of the proposed system, a user test was designed and performed, measuring the elapsed time for a trial of the task starting from the execution of the application software to the completion of restoring and imaging an ROI saved in advance. The results of the user test showed that the average elapsed time was 65.3 ± 15.2 s with 6.86 ± 3.61 μm of position error and 0.08 ± 0.40 degrees of angle error. This indicates that the time-lapse imaging task was accomplished rapidly with a high level of accuracy. Thus, simplification of both the system and the task was achieved via our proposed system.

  3. MetaTree: augmented reality narrative explorations of urban forests

    NASA Astrophysics Data System (ADS)

    West, Ruth; Margolis, Todd; O'Neil-Dunne, Jarlath; Mendelowitz, Eitan

    2012-03-01

    As cities world-wide adopt and implement reforestation initiatives to plant millions of trees in urban areas, they are engaging in what is essentially a massive ecological and social experiment. Existing air-borne, space-borne, and field-based imaging and inventory mechanisms fail to provide key information on urban tree ecology that is crucial to informing management, policy, and supporting citizen initiatives for the planting and stewardship of trees. The shortcomings of the current approaches include: spatial and temporal resolution, poor vantage point, cost constraints and biological metric limitations. Collectively, this limits their effectiveness as real-time inventory and monitoring tools. Novel methods for imaging and monitoring the status of these emerging urban forests and encouraging their ongoing stewardship by the public are required to ensure their success. This art-science collaboration proposes to re-envision citizens' relationship with urban spaces by foregrounding urban trees in relation to local architectural features and simultaneously creating new methods for urban forest monitoring. We explore creating a shift from overhead imaging or field-based tree survey data acquisition methods to continuous, ongoing monitoring by citizen scientists as part of a mobile augmented reality experience. We consider the possibilities of this experience as a medium for interacting with and visualizing urban forestry data and for creating cultural engagement with urban ecology.

  4. Explore and experience: mobile augmented reality for medical training.

    PubMed

    Albrecht, Urs-Vito; Noll, Christoph; von Jan, Ute

    2013-01-01

    In medicine, especially in basic education, it may sometimes be inappropriate to integrate real patients into classes due to ethical issues that must be avoided. Nevertheless, the quality of medical education may suffer without the use of real cases. This is especially true of medical specialties such as legal medicine: survivors of a crime are already subjected to procedures that constitute a severe emotional burden and may cause additional distress even without the added presence of students. Using augmented reality based applications may alleviate this ethical dilemma by giving students the possibility to practice the necessary skills based on virtual but nevertheless almost realistic cases. The app "mARble®" that is presented in this paper follows this approach. The currently available learning module for legal medicine gives users an opportunity to learn about various wound patterns by virtually overlaying them on their own skin and is applicable in different learning settings. Preliminary evaluation results covering learning efficiency and emotional components of the learning process are promising. Content modules for other medical specialties are currently under construction.

  5. Integrating a Mobile Augmented Reality Activity to Contextualize Student Learning of a Socioscientific Issue

    ERIC Educational Resources Information Center

    Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao

    2013-01-01

    virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…

  6. An indoor augmented reality mobile application for simulation of building evacuation

    NASA Astrophysics Data System (ADS)

    Sharma, Sharad; Jerripothula, Shanmukha

    2015-03-01

    Augmented Reality enables people to remain connected with the physical environment they are in, and invites them to look at the world from new and alternative perspectives. There has been an increasing interest in emergency evacuation applications for mobile devices. Nearly all smartphones these days are Wi-Fi and GPS enabled. In this paper, we propose a novel emergency evacuation system that will help people to safely evacuate a building in case of an emergency situation. It will further enhance knowledge and understanding of where the exits are in the building and of safety evacuation procedures. We have applied mobile augmented reality (mobile AR) to create an application with the Unity 3D gaming engine. We show how the mobile AR application is able to display a 3D model of the building and an animation of people evacuating, using markers and a web camera. The system gives a visual representation of a building in 3D space, allowing people to see where exits are in the building through the use of a smartphone or tablet. Pilot studies conducted with the system showed its partial success and demonstrated the effectiveness of the application in emergency evacuation. Our computer vision methods give good results when the markers are close to the camera, but accuracy decreases when the markers are far away from the camera.

  7. Transforming Experience: The Potential of Augmented Reality and Virtual Reality for Enhancing Personal and Clinical Change

    PubMed Central

    Riva, Giuseppe; Baños, Rosa M.; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea

    2016-01-01

    During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual’s worldview. PMID:27746747

  8. Transforming Experience: The Potential of Augmented Reality and Virtual Reality for Enhancing Personal and Clinical Change.

    PubMed

    Riva, Giuseppe; Baños, Rosa M; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea

    2016-01-01

    During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies - augmented reality (AR) and virtual reality (VR) - exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual's worldview.

  9. Augmented Reality Cues and Elderly Driver Hazard Perception

    PubMed Central

    Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew

    2013-01-01

    Objective: Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background: Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods: Twenty elderly (mean = 73 years, SD = 5 years), licensed drivers with a range of cognitive abilities measured by a speed of processing (SOP) composite participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six, straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios, and the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining safe distance behind a lead vehicle. Results: AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair ability to maintain safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion: AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037

  10. Augmented reality guidance system for peripheral nerve blocks

    NASA Astrophysics Data System (ADS)

    Wedlake, Chris; Moore, John; Rachinsky, Maxim; Bainbridge, Daniel; Wiles, Andrew D.; Peters, Terry M.

    2010-02-01

    Peripheral nerve block treatments are ubiquitous in hospitals and pain clinics worldwide. State-of-the-art techniques use ultrasound (US) guidance and/or electrical stimulation to verify needle tip location. However, problems such as needle-US beam alignment, poor echogenicity of block needles and US beam thickness can make it difficult for the anesthetist to know the exact needle tip location. Inaccurate therapy delivery raises obvious safety and efficacy issues. We have developed and evaluated a needle guidance system that makes use of a magnetic tracking system (MTS) to provide an augmented reality (AR) guidance platform to accurately localize the needle tip as well as its projected trajectory. Five anesthetists and five novices performed simulated nerve block deliveries in a polyvinyl alcohol phantom to compare needle guidance under US alone to US placed in our AR environment. Our phantom study demonstrated a decrease in targeting attempts, a decrease in contact with critical structures, and an improvement in accuracy to 0.68 mm RMS, compared to 1.34 mm RMS under US guidance alone. Currently, the MTS uses 18- and 21-gauge hypodermic needles with a 5-degree-of-freedom sensor located at the needle tip. These needles can only be sterilized using an ethylene oxide process. In the interest of providing clinicians with a simple and efficient guidance system, we also evaluated attaching the sensor at the needle hub as a simple clip-on device. To do this, we simultaneously performed a needle bending study to assess the reliability of a hub-based sensor.
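
    The clip-on hub sensor idea amounts to extrapolating the tip position along the sensor's axis over the needle length, which is exactly the rigid-needle assumption the bending study stresses. A trivial sketch of that extrapolation (hypothetical names, illustrative only, not the authors' system):

        import numpy as np

        def predicted_tip(hub_position, hub_direction, needle_length_mm):
            """Tip position implied by a hub-mounted sensor, assuming a perfectly rigid (unbent) needle."""
            d = np.asarray(hub_direction, dtype=float)
            d /= np.linalg.norm(d)              # unit vector along the needle shaft
            return np.asarray(hub_position, dtype=float) + needle_length_mm * d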

  11. Augmented reality in healthcare education: an integrative review.

    PubMed

    Zhu, Egui; Hadadgar, Arash; Masiello, Italo; Zary, Nabil

    2014-01-01

    Background. The effective development of healthcare competencies poses great educational challenges. A possible approach to provide learning opportunities is the use of augmented reality (AR) where virtual learning experiences can be embedded in a real physical context. The aim of this study was to provide a comprehensive overview of the current state of the art in terms of user acceptance, the AR applications developed and the effect of AR on the development of competencies in healthcare. Methods. We conducted an integrative review. Integrative reviews are the broadest type of research review methods allowing for the inclusion of various research designs to more fully understand a phenomenon of concern. Our review included multi-disciplinary research publications in English reported until 2012. Results. 2529 research papers were found from ERIC, CINAHL, Medline, PubMed, Web of Science and Springer-link. Three qualitative, 20 quantitative and 2 mixed studies were included. Using a thematic analysis, we describe three aspects related to research, technology and education. This study showed that AR was applied in a wide range of topics in healthcare education. Furthermore, acceptance of AR as a learning technology was reported among the learners, as was its potential for improving different types of competencies. Discussion. AR is still considered a novelty in the literature. Most of the studies reported early prototypes. Also, the designed AR applications lacked an explicit pedagogical theoretical framework. Finally, the learning strategies adopted were of the traditional 'see one, do one and teach one' style and did not integrate clinical competencies to ensure patients' safety.

  12. Augmented reality in healthcare education: an integrative review

    PubMed Central

    Zhu, Egui; Hadadgar, Arash; Masiello, Italo

    2014-01-01

    Background. The effective development of healthcare competencies poses great educational challenges. A possible approach to provide learning opportunities is the use of augmented reality (AR) where virtual learning experiences can be embedded in a real physical context. The aim of this study was to provide a comprehensive overview of the current state of the art in terms of user acceptance, the AR applications developed and the effect of AR on the development of competencies in healthcare. Methods. We conducted an integrative review. Integrative reviews are the broadest type of research review methods allowing for the inclusion of various research designs to more fully understand a phenomenon of concern. Our review included multi-disciplinary research publications in English reported until 2012. Results. 2529 research papers were found from ERIC, CINAHL, Medline, PubMed, Web of Science and Springer-link. Three qualitative, 20 quantitative and 2 mixed studies were included. Using a thematic analysis, we describe three aspects related to research, technology and education. This study showed that AR was applied in a wide range of topics in healthcare education. Furthermore, acceptance of AR as a learning technology was reported among the learners, as was its potential for improving different types of competencies. Discussion. AR is still considered a novelty in the literature. Most of the studies reported early prototypes. Also, the designed AR applications lacked an explicit pedagogical theoretical framework. Finally, the learning strategies adopted were of the traditional ‘see one, do one and teach one’ style and did not integrate clinical competencies to ensure patients’ safety. PMID:25071992

  13. Augmented reality application utility for aviation maintenance work instruction

    NASA Astrophysics Data System (ADS)

    Pourcho, John Bryan

    Current aviation maintenance work instructions do not display information effectively enough to prevent costly errors and safety concerns. Aircraft are complex assemblies of highly interrelated components that confound troubleshooting and can make the maintenance procedure difficult (Drury & Gramopadhye, 2001). The sophisticated nature of aircraft maintenance necessitates a revolutionized training intervention for aviation maintenance technicians (United States General Accounting Office, 2003). Quite simply, the paper-based job task cards fall short of offering rapid access to technical data and the system or component visualization necessary for working on complex integrated aircraft systems. Possible solutions to this problem include upgraded standards for paper-based task cards and the use of integrated 3D product definition on various mobile platforms (Ropp, Thomas, Lee, Broyles, Lewin, Andreychek, & Nicol, 2013). Previous studies have shown that incorporation of 3D graphics in work instructions allows the user to more efficiently and accurately interpret maintenance information (Jackson & Batstone, 2008). For aircraft maintenance workers, the use of mobile 3D model-based task cards could make current paper task card standards obsolete with their ability to deliver relevant, synchronized information to and from the hangar. Unlike previous versions of 3D model-based definition task cards and paper task cards, which are currently used in the maintenance industry, 3D model-based definition task cards have the potential to be more mobile and accessible. Utilizing augmented reality applications to seamlessly deliver 3D product definition on mobile devices could increase efficiency and accuracy, and reduce the mental workload for technicians when performing maintenance tasks (Macchiarella, 2004). This proposal will serve as a literature review of the aviation maintenance industry, the spatial ability of maintenance technicians, and benefits of

  14. Integrating Augmented Reality in Higher Education: A Multidisciplinary Study of Student Perceptions

    ERIC Educational Resources Information Center

    Delello, Julie A.; McWhorter, Rochell R.; Camp, Kerri M.

    2015-01-01

    Augmented reality (AR) is an emerging technology that blends physical objects with virtual reality. Through the integration of digital and print media, a gap between the "on and offline" worlds is bridged, radically shifting student-computer interaction in the classroom. This research examined the results of a multiple case study on the…

  15. Game-Based Evacuation Drill Using Augmented Reality and Head-Mounted Display

    ERIC Educational Resources Information Center

    Kawai, Junya; Mitsuhara, Hiroyuki; Shishibori, Masami

    2016-01-01

    Purpose: Evacuation drills should be more realistic and interactive. Focusing on situational and audio-visual realities and scenario-based interactivity, the authors have developed a game-based evacuation drill (GBED) system that presents augmented reality (AR) materials on tablet computers. The paper's current research purpose is to improve…

  16. Constructing Liminal Blends in a Collaborative Augmented-Reality Learning Environment

    ERIC Educational Resources Information Center

    Enyedy, Noel; Danish, Joshua A.; DeLiema, David

    2015-01-01

    In vision-based augmented-reality (AR) environments, users view the physical world through a video feed or device that "augments" the display with a graphical or informational overlay. Our goal in this manuscript is to ask "how" and "why" these new technologies create opportunities for learning. We suggest that AR is…

  17. Evaluating the Effect on User Perception and Performance of Static and Dynamic Contents Deployed in Augmented Reality Based Learning Application

    ERIC Educational Resources Information Center

    Montoya, Mauricio Hincapié; Díaz, Christian Andrés; Moreno, Gustavo Adolfo

    2017-01-01

    Nowadays, the use of technology to improve teaching and learning experiences in the classroom has been promoted. One of these technologies is augmented reality, which allows overlaying layers of virtual information on a real scene with the aim of increasing the user's perception of reality. Augmented reality has proved to offer several…

  18. Sensorized Garment Augmented 3D Pervasive Virtual Reality System

    NASA Astrophysics Data System (ADS)

    Gulrez, Tauseef; Tognetti, Alessandro; de Rossi, Danilo

    Virtual reality (VR) technology has matured to a point where humans can navigate in virtual scenes; however, providing them with a comfortable fully immersive role in VR remains a challenge. Currently available sensing solutions do not provide ease of deployment, particularly in the seated position due to sensor placement restrictions over the body, and optic-sensing requires a restricted indoor environment to track body movements. Here we present a 52-sensor laden garment interfaced with VR, which offers both portability and unencumbered user movement in a VR environment. This chapter addresses the systems engineering aspects of our pervasive computing solution of the interactive sensorized 3D VR and presents the initial results and future research directions. Participants navigated in a virtual art gallery using natural body movements that were detected by their wearable sensor shirt and mapped to electrical control signals responsible for VR scene navigation. The initial results are positive, and offer many opportunities for use in computationally intelligent man-machine multimedia control.

  19. Feasibility of Augmented Reality in Clinical Simulations: Using Google Glass With Manikins

    PubMed Central

    2016-01-01

    Background: Studies show that students who use fidelity-based simulation technology perform better and have higher retention rates than peers who learn in traditional paper-based training. Augmented reality is increasingly being used as a teaching and learning tool in a continual effort to make simulations more realistic for students. Objective: The aim of this project was to assess the feasibility and acceptability of using augmented reality via Google Glass during clinical simulation scenarios for training health science students. Methods: Students performed a clinical simulation while watching a video through Google Glass of a patient actor simulating respiratory distress. Following participation in the scenarios, students completed two surveys and were asked whether they would recommend continued use of this technology in clinical simulation experiences. Results: We were able to have students watch a video in their field of vision of a patient who mimicked the simulated manikin. Students were overall positive about the implications for being able to view a patient during the simulations, and most students recommended using the technology in the future. Overall, students reported perceived realism with augmented reality using Google Glass. However, there were technical and usability challenges with the device. Conclusions: As newer portable and consumer-focused technologies become available, augmented reality is increasingly being used as a teaching and learning tool to make clinical simulations more realistic for health science students. We found Google Glass feasible and acceptable as a tool for augmented reality in clinical simulations. PMID:27731862

  20. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery

    PubMed Central

    Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J.; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng

    2017-01-01

    Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the “integrated image” on semi-transparent glass. Via the registration process, the integral image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications. PMID:28198442

  1. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery.

    PubMed

    Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng

    2017-02-15

    Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the "integrated image" on semi-transparent glass. Via the registration process, the integral image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications.

  2. Towards Determination of Visual Requirements for Augmented Reality Displays and Virtual Environments for the Airport Tower

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2006-01-01

    The visual requirements for augmented reality or virtual environments displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14deg, 28deg, and 47deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.

  3. Towards Determination of Visual Requirements for Augmented Reality Displays and Virtual Environments for the Airport Tower

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2006-01-01

    The visual requirements for augmented reality or virtual environments displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14 deg, 28 deg, and 47 deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.

  4. Emerging technology in surgical education: combining real-time augmented reality and wearable computing devices.

    PubMed

    Ponce, Brent A; Menendez, Mariano E; Oladeji, Lasun O; Fryberger, Charles T; Dantuluri, Phani K

    2014-11-01

    The authors describe the first surgical case adopting the combination of real-time augmented reality and wearable computing devices such as Google Glass (Google Inc, Mountain View, California). A 66-year-old man presented to their institution for a total shoulder replacement after 5 years of progressive right shoulder pain and decreased range of motion. Throughout the surgical procedure, Google Glass was integrated with the Virtual Interactive Presence and Augmented Reality system (University of Alabama at Birmingham, Birmingham, Alabama), enabling the local surgeon to interact with the remote surgeon within the local surgical field. Surgery was well tolerated by the patient and early surgical results were encouraging, with an improvement of shoulder pain and greater range of motion. The combination of real-time augmented reality and wearable computing devices such as Google Glass holds much promise in the field of surgery.

  5. A Multimedia, Augmented Reality Interactive System for the Application of a Guided School Tour

    NASA Astrophysics Data System (ADS)

    Lin, Ko-Chun; Huang, Sheng-Wen; Chu, Sheng-Kai; Su, Ming-Wei; Chen, Chia-Yen; Chen, Chi-Fa

    The paper describes an implementation of a multimedia, augmented reality system used for a guided school tour. The aim of this work is to improve the level of interaction between a viewer and the system by means of augmented reality. In the implemented system, hand motions are captured via computer vision-based approaches and analyzed to extract representative actions which are used to interact with the system. In this manner, tactile peripheral hardware such as keyboards and mice can be eliminated. In addition, the proposed system also aims to reduce hardware-related costs and avoid health risks associated with contamination by contact in public areas.

  6. Embedded Augmented Reality Training System for Dynamic Human-Robot Cooperation

    DTIC Science & Technology

    2009-10-01

    Kagami, S., Kanade, T.: Overlay what Humanoid Robot Perceives and Thinks to the Real-world by Mixed Reality System. In: ISMAR 2007: 6th IEEE and ACM...Embedded Augmented Reality Training System for Dynamic Human-Robot Cooperation. Jan A. Neuhoefer (Scientist), Bernhard...effectiveness and flexibility is still essential in many scenarios. Implementing the idea of mutual completion, direct human-robot cooperation appears suitable

  7. Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery.

    PubMed

    Pelargos, Panayiotis E; Nagasawa, Daniel T; Lagman, Carlito; Tenn, Stephen; Demos, Joanna V; Lee, Seung J; Bui, Timothy T; Barnette, Natalie E; Bhatt, Nikhilesh S; Ung, Nolan; Bari, Ausaf; Martin, Neil A; Yang, Isaac

    2017-01-01

    Neurosurgery has undergone a technological revolution over the past several decades, from trephination to image-guided navigation. Advancements in virtual reality (VR) and augmented reality (AR) represent some of the newest modalities being integrated into neurosurgical practice and resident education. In this review, we present a historical perspective of the development of VR and AR technologies, analyze its current uses, and discuss its emerging applications in the field of neurosurgery.

  8. Moving from Virtual Reality Exposure-Based Therapy to Augmented Reality Exposure-Based Therapy: A Review

    PubMed Central

    Baus, Oliver; Bouchard, Stéphane

    2014-01-01

    This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and then merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed “safely” to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user’s experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat non-small-animal phobias, such as social phobia. PMID:24624073

  9. Moving from virtual reality exposure-based therapy to augmented reality exposure-based therapy: a review.

    PubMed

    Baus, Oliver; Bouchard, Stéphane

    2014-01-01

    This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and then merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed "safely" to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user's experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat non-small-animal phobias, such as social phobia.

  10. The Use of Augmented Reality in Formal Education: A Scoping Review

    ERIC Educational Resources Information Center

    Saltan, Fatih; Arslan, Ömer

    2017-01-01

    Augmented Reality (AR) is recognized as one of the most important developments in educational technology for both higher and K-12 education as emphasized in Horizon report (Johnson et al., 2016, 2015). Furthermore, AR is expected to achieve widespread adoption that will take two to three years in higher education and four to five years in K-12…

  11. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research

    ERIC Educational Resources Information Center

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2013-01-01

    Augmented reality (AR) is currently considered as having potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper firstly has identified two major approaches of utilizing AR technology in science education,…

  12. Examining Young Children's Perception toward Augmented Reality-Infused Dramatic Play

    ERIC Educational Resources Information Center

    Han, Jeonghye; Jo, Miheon; Hyun, Eunja; So, Hyo-jeong

    2015-01-01

    Amid the increasing interest in applying augmented reality (AR) in educational settings, this study explores the design and enactment of an AR-infused robot system to enhance children's satisfaction and sensory engagement with dramatic play activities. In particular, we conducted an exploratory study to empirically examine children's perceptions…

  13. PRISMA-MAR: An Architecture Model for Data Visualization in Augmented Reality Mobile Devices

    ERIC Educational Resources Information Center

    Gomes Costa, Mauro Alexandre Folha; Serique Meiguins, Bianchi; Carneiro, Nikolas S.; Gonçalves Meiguins, Aruanda Simões

    2013-01-01

    This paper proposes an extension to mobile augmented reality (MAR) environments--the addition of data charts to the more usual text, image and video components. To this purpose, we have designed a client-server architecture including the main necessary modules and services to provide an Information Visualization MAR experience. The server side…

  14. Research on gesture recognition of augmented reality maintenance guiding system based on improved SVM

    NASA Astrophysics Data System (ADS)

    Zhao, Shouwei; Zhang, Yong; Zhou, Bin; Ma, Dongxi

    2014-09-01

    Interaction is one of the key techniques of augmented reality (AR) maintenance guiding systems. Because of the complexity of the maintenance guiding system's image background and the high dimensionality of gesture characteristics, the whole process of gesture recognition is divided into three stages: gesture segmentation, gesture feature modeling, and gesture recognition. In the segmentation stage, to resolve the misrecognition of skin-like regions, a segmentation algorithm combining a background model and skin color is adopted to exclude those regions. In the feature modeling stage, numerous characteristic features are analyzed and extracted, such as structural characteristics, Hu invariant moments and Fourier descriptors. In the recognition stage, a classifier based on the Support Vector Machine (SVM) is introduced into the augmented reality maintenance guiding process. The SVM is a learning method based on statistical learning theory; it has a solid theoretical foundation and excellent learning ability, is widely used in machine learning, and has particular advantages in dealing with small samples and non-linear pattern recognition in high dimensions. Gesture recognition for the augmented reality maintenance guiding system is realized by the SVM after the granulation of all the characteristic features. The experimental results of the simulation of number gesture recognition and its application in the augmented reality maintenance guiding system show that the real-time performance and robustness of gesture recognition in the AR maintenance guiding system can be greatly enhanced by the improved SVM.
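
    A rough sketch of the feature-extraction and classifier-training stages, assuming OpenCV 4 and scikit-learn rather than the authors' improved SVM. The feature mix (Hu moments plus truncated Fourier descriptors) follows the description above, but the normalisation and classifier parameters are illustrative guesses.

        import cv2
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def gesture_features(binary_mask, n_fourier=16):
            """Hu invariant moments plus truncated Fourier descriptors of the largest hand contour."""
            contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
            contour = max(contours, key=cv2.contourArea)
            hu = cv2.HuMoments(cv2.moments(contour)).ravel()
            pts = contour.squeeze().astype(np.float64)
            z = pts[:, 0] + 1j * pts[:, 1]                      # contour as a complex signal
            spectrum = np.abs(np.fft.fft(z))[1:n_fourier + 1]
            spectrum /= (spectrum[0] + 1e-9)                    # normalise for scale invariance
            return np.concatenate([hu, spectrum])

        def train_gesture_classifier(feature_matrix, labels):
            """Fit an RBF-kernel SVM on standardised gesture features (one row per segmented gesture)."""
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
            return clf.fit(feature_matrix, labels)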

  15. Quantification of Contrast Sensitivity and Color Perception using Head-worn Augmented Reality Displays

    DTIC Science & Technology

    2009-03-01

    the case the Sony Glasstron PLM-50, enables a filter that reduces transmittance of light from the environment. Two binocular displays showed differ...Edition). Thomson Wadsworth, 2007. [5] S. E. Kirkley, Jr. Augmented Reality Performance Assessment Battery (ARPAB): Object Recognition, Distance

  16. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-01-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…

  17. Integrating Augmented Reality Technology to Enhance Children's Learning in Marine Education

    ERIC Educational Resources Information Center

    Lu, Su-Ju; Liu, Ying-Chieh

    2015-01-01

    Marine education comprises rich and multifaceted issues. Raising general awareness of marine environments and issues demands the development of new learning materials. This study adapts concepts from digital game-based learning to design an innovative marine learning program integrating augmented reality (AR) technology for lower grade primary…

  18. The AIDLET Model: A Framework for Selecting Games, Simulations and Augmented Reality Environments in Mobile Learning

    ERIC Educational Resources Information Center

    Bidarra, José; Rothschild, Meagan; Squire, Kurt; Figueiredo, Mauro

    2013-01-01

    Smartphones and other mobile devices like the iPhone, Android, Kindle Fire, and iPad have boosted educators' interest in using mobile media for education. Applications from games to augmented reality are thriving in research settings, and in some cases schools and universities, but relatively little is known about how such devices may be used for…

  19. Usability Evaluation of an Augmented Reality System for Teaching Euclidean Vectors

    ERIC Educational Resources Information Center

    Martin-Gonzalez, Anabel; Chi-Poot, Angel; Uc-Cetina, Victor

    2016-01-01

    Augmented reality (AR) is one of the emerging technologies that has demonstrated to be an efficient technological tool to enhance learning techniques. In this paper, we describe the development and evaluation of an AR system for teaching Euclidean vectors in physics and mathematics. The goal of this pedagogical tool is to facilitate user's…

  20. Apply an Augmented Reality in a Mobile Guidance to Increase Sense of Place for Heritage Places

    ERIC Educational Resources Information Center

    Chang, Yu-Lien; Hou, Huei-Tse; Pan, Chao-Yang; Sung, Yao-Ting; Chang, Kuo-En

    2015-01-01

    Based on the sense of place theory and the design principles of guidance and interpretation, this study developed an augmented reality mobile guidance system that used a historical geo-context-embedded visiting strategy. This tool for heritage guidance and educational activities enhanced visitor sense of place. This study consisted of 3 visitor…

  1. Integrated Authoring Tool for Mobile Augmented Reality-Based E-Learning Applications

    ERIC Educational Resources Information Center

    Lobo, Marcos Fermin; Álvarez García, Víctor Manuel; del Puerto Paule Ruiz, María

    2013-01-01

    Learning management systems are increasingly being used to complement classroom teaching and learning and in some instances even replace traditional classroom settings with online educational tools. Mobile augmented reality is an innovative trend in e-learning that is creating new opportunities for teaching and learning. This article proposes a…

  2. Exploring the Effect of Materials Designed with Augmented Reality on Language Learners' Vocabulary Learning

    ERIC Educational Resources Information Center

    Solak, Ekrem; Cakir, Recep

    2015-01-01

    The purpose of this study was to determine the motivational level of the participants in a language classroom towards course materials designed in accordance with augmented reality technology and to identify the correlation between academic achievement and motivational level. 130 undergraduate students from a state-run university in Turkey…

  3. Examining Augmented Reality to Improve Navigation Skills in Postsecondary Students with Intellectual Disability

    ERIC Educational Resources Information Center

    Smith, Cate C.; Cihak, David F.; Kim, Byungkeon; McMahon, Don D.; Wright, Rachel

    2017-01-01

    The purpose of this study was to examine the effects of using mobile technology to improve navigation skills in three students with intellectual disability (ID) in a postsecondary education program. Navigation skills included using an augmented reality iPhone app to make correct "waypoint" decisions when traveling by foot on a university…

  4. Augmented Reality for Teaching Science Vocabulary to Postsecondary Education Students with Intellectual Disabilities and Autism

    ERIC Educational Resources Information Center

    McMahon, Don D.; Cihak, David F.; Wright, Rachel E.; Bell, Sherry Mee

    2016-01-01

    The purpose of this study was to examine the use of an emerging technology called augmented reality to teach science vocabulary words to college students with intellectual disability and autism spectrum disorders. One student with autism and three students with an intellectual disability participated in a multiple probe across behaviors (i.e.,…

  5. Exploring the Potential of a Location Based Augmented Reality Game for Language Learning

    ERIC Educational Resources Information Center

    Richardson, Donald

    2016-01-01

    This paper adds to the small but growing body of research into the potential of augmented reality games for teaching and learning English as a foreign language (EFL). It explores the extent to which such games enhance the language learning experience of advanced level EFL learners. The author draws on his work developing "Mission not really…

  6. Comparing Virtual and Location-Based Augmented Reality Mobile Learning: Emotions and Learning Outcomes

    ERIC Educational Resources Information Center

    Harley, Jason M.; Poitras, Eric G.; Jarrell, Amanda; Duffy, Melissa C.; Lajoie, Susanne P.

    2016-01-01

    Research on the effectiveness of augmented reality (AR) on learning exists, but there is a paucity of empirical work that explores the role that positive emotions play in supporting learning in such settings. To address this gap, this study compared undergraduate students' emotions and learning outcomes during a guided historical tour using mobile…

  7. What Teachers Need to Know about Augmented Reality Enhanced Learning Environments

    ERIC Educational Resources Information Center

    Wasko, Christopher

    2013-01-01

    Augmented reality (AR) enhanced learning environments have been designed to teach a variety of subjects by having learners act like professionals in the field as opposed to students in a classroom. The environments, grounded in constructivist and situated learning theories, place students in a meaningful, non-classroom environment and force them…

  8. Assessing the Effectiveness of Learning Solid Geometry by Using an Augmented Reality-Assisted Learning System

    ERIC Educational Resources Information Center

    Lin, Hao-Chiang Koong; Chen, Mei-Chi; Chang, Chih-Kai

    2015-01-01

    This study integrates augmented reality (AR) technology into teaching activities to design a learning system that assists junior high-school students in learning solid geometry. The following issues are addressed: (1) the relationship between achievements in mathematics and performance in spatial perception; (2) whether system-assisted learning…

  9. Making the Invisible Visible in Science Museums through Augmented Reality Devices

    ERIC Educational Resources Information Center

    Yoon, Susan A.; Wang, Joyce

    2014-01-01

    Despite the potential of augmented reality (AR) in enabling students to construct new understanding, little is known about how the processes and interactions with the multimedia lead to increased learning. This study seeks to explore the affordances of an AR tool on learning that is focused on the science concept of magnets and magnetic fields.…

  10. Using Augmented Reality in Early Art Education: A Case Study in Hong Kong Kindergarten

    ERIC Educational Resources Information Center

    Huang, Yujia; Li, Hui; Fong, Ricci

    2016-01-01

    Innovation in pedagogy by technology integration in kindergarten classroom has always been a challenge for most teachers. This design-based research aimed to explore the feasibility of using Augmented Reality (AR) technology in early art education with a focus on the gains and pains of this innovation. A case study was conducted in a typical…

  11. Landscape Interpretation with Augmented Reality and Maps to Improve Spatial Orientation Skill

    ERIC Educational Resources Information Center

    Carbonell Carrera, Carlos; Bermejo Asensio, Luis A.

    2017-01-01

    Landscape interpretation is needed for navigating and determining an orientation: with traditional cartography, interpreting 3D topographic information from 2D landform representations to get self-location requires spatial orientation skill. Augmented reality technology allows a new way to interact with 3D landscape representation and thereby…

  12. Augmented Reality M-Learning to Enhance Nursing Skills Acquisition in the Clinical Skills Laboratory

    ERIC Educational Resources Information Center

    Garrett, Bernard M.; Jackson, Cathryn; Wilson, Brian

    2015-01-01

    Purpose: This paper aims to report on a pilot research project designed to explore if new mobile augmented reality (AR) technologies have the potential to enhance the learning of clinical skills in the lab. Design/methodology/approach: An exploratory action-research-based pilot study was undertaken to explore an initial proof-of-concept design in…

  13. A Mobile Augmented Reality System for the Learning of Dental Morphology

    ERIC Educational Resources Information Center

    Juan, M.-Carmen; Alexandrescu, Lucian; Folguera, Fernando; García-García, Inmaculada

    2016-01-01

    Three-dimensional models are important when the learning content is difficult to acquire from 2D images or other traditional methods. This is the case for learning dental morphology. In this paper, we present a mobile augmented reality (AR) system for learning dental morphology. A study with students was carried out to determine whether learning…

  14. Alien Contact!: Exploring Teacher Implementation of an Augmented Reality Curricular Unit

    ERIC Educational Resources Information Center

    Mitchell, Rebecca

    2011-01-01

    This paper reports on findings from a five-teacher, exploratory case study, critically observing their implementation of a technology-intensive, augmented reality (AR) mathematics curriculum unit, along with its paper-based control. The unit itself was intended to promote multiple proportional-reasoning strategies with urban, public, middle school…

  15. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    ERIC Educational Resources Information Center

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-01-01

    The evolution of technologies and the development of new tools with educational purposes are growing up. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on TC and MRN images, dissections and drawings. For ARBOOK evaluation, a…

  16. [Identification of perforating vessels by augmented reality: Application for the deep inferior epigastric perforator flap].

    PubMed

    Bosc, R; Fitoussi, A; Pigneur, F; Tacher, V; Hersant, B; Meningaud, J-P

    2017-03-07

    Augmented reality on smart glasses allows the surgeon to visualize three-dimensional virtual objects during surgery, superimposed in real time on the patient's anatomy. This preserves the view of the surgical field while giving access to additional computerized information, without the need for a physical surgical guide or a remote screen.

  17. Modeling Augmented Reality Games with Preservice Elementary and Secondary Science Teachers

    ERIC Educational Resources Information Center

    Burton, Erin Peters; Frazier, Wendy; Annetta, Leonard; Lamb, Richard; Cheng, Rebecca; Chmiel, Margaret

    2011-01-01

    Cell phones are ever-present in daily life, yet vastly underused in the formal science classroom. The purpose of this study was to implement a novel learning tool on cell phones, Augmented Reality Games, and determine how the interaction influenced preservice teachers' content knowledge and self-efficacy of cell phone use in schools. Results show…

  18. The Use of Augmented Reality Games in Education: A Review of the Literature

    ERIC Educational Resources Information Center

    Koutromanos, George; Sofos, Alivisos; Avraamidou, Lucy

    2015-01-01

    This paper provides a review of the literature about the use of augmented reality in education and specifically in the context of formal and informal environments. It examines the research that has been conducted up to date on the use of those games through mobile technology devices such as mobile phones and tablets, both in primary and secondary…

  19. Augmented-reality visualization in iMRI operating room: system description and preclinical testing

    NASA Astrophysics Data System (ADS)

    Sauer, Frank; Khamene, Ali; Bascle, Benedicte; Vogt, Sebastian; Rubino, Gregory

    2002-05-01

    We developed an augmented reality system targeting image guidance for surgical procedures. The surgeon wears a video- see-through head mounted display that provides him with a stereo video view of the patient. The live video images are augmented with graphical representations of anatomical structures that are segmented from medical image data. The surgeon can see, e.g., a tumor in its actual location inside the patient. This in-situ visualization, where the computer maps the image information onto the patient, promises the most direct, intuitive guidance for surgical procedures. In this paper, we describe technical details of the system and its installation in UCLA's iMRI operating room. We added instrument tracking to the capabilities of our system to prepare it for minimally invasive procedures. We discuss several pre-clinical phantom experiments that support the potential clinical usefulness of augmented reality guidance.

  20. Registration based on projective reconstruction technique for augmented reality systems.

    PubMed

    Yuan, M L; Ong, S K; Nee, A Y C

    2005-01-01

    In AR systems, registration is one of the most difficult problems currently limiting their application. In this paper, we propose a simple registration method using projective reconstruction. This method consists of two steps: embedding and tracking. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In tracking, a projective reconstruction technique is used to track these four specified points to compute the model view transformation for augmentation. This method is simple, as only four points need to be specified at the embedding stage and the virtual object can then be easily augmented onto a real scene from a video sequence. In addition, it can be extended to a scenario using the projective matrix that has been obtained from previous registration results using the same AR system. The proposed method has three advantages: 1) It is fast because the linear least square method can be used to estimate the related matrix in the algorithm and it is not necessary to calculate the fundamental matrix in the extended case. 2) A virtual object can still be superimposed on a related area even if some parts of the specified area are occluded during the whole process. 3) This method is robust because it remains effective even when not all the reference points are detected during the whole process, as long as at least six pairs of related reference points correspondences can be found. Some experiments have been conducted to validate the performance of the proposed method.
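
    The "embedding" step above amounts to fixing a world plane from four user-specified points; the sketch below shows that idea with a linear least-squares homography estimated by OpenCV and used to place a virtual outline in the image. The point coordinates are invented, and the paper's projective-reconstruction tracking of those points is not reproduced here.

```python
# Minimal sketch: estimate a plane-to-image homography from the four specified
# reference points and map a virtual object's footprint into the camera frame.
import numpy as np
import cv2  # assumes OpenCV is installed

# Four points defining the world plane (a unit square in world units).
world_pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=np.float32)

# The same four points as located in the current video frame (pixels).
image_pts = np.array([[320, 240], [480, 250], [470, 380], [310, 370]],
                     dtype=np.float32)

# Linear least-squares estimate of the homography.
H, _ = cv2.findHomography(world_pts, image_pts)

# Footprint of a virtual object lying on the world plane.
virtual_outline = np.array([[[0.25, 0.25]], [[0.75, 0.25]],
                            [[0.75, 0.75]], [[0.25, 0.75]]], dtype=np.float32)

# Project the outline into the image; these pixels are where the overlay goes.
overlay_px = cv2.perspectiveTransform(virtual_outline, H).reshape(-1, 2)
print(overlay_px)
```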

  1. Augmented reality image guidance for minimally invasive coronary artery bypass

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Rueckert, Daniel; Hawkes, David; Casula, Roberto; Hu, Mingxing; Pedro, Ose; Zhang, Dong Ping; Penney, Graeme; Bello, Fernando; Edwards, Philip

    2008-03-01

    We propose a novel system for image guidance in totally endoscopic coronary artery bypass (TECAB). A key requirement is the availability of 2D-3D registration techniques that can deal with non-rigid motion and deformation. Image guidance for TECAB is mainly required before the mechanical stabilization of the heart, thus the most dominant source of non-rigid deformation is the motion of the beating heart. To augment the images in the endoscope of the da Vinci robot, we have to find the transformation from the coordinate system of the preoperative imaging modality to the system of the endoscopic cameras. In a first step we build a 4D motion model of the beating heart. Intraoperatively we can use the ECG or video processing to determine the phase of the cardiac cycle. We can then take the heart surface from the motion model and register it to the stereo-endoscopic images of the da Vinci robot using 2D-3D registration methods. We are investigating robust feature tracking and intensity-based methods for this purpose. Images of the vessels available in the preoperative coordinate system can then be transformed to the camera system and projected into the calibrated endoscope view using two video mixers with chroma keying. It is hoped that the augmented view can improve the efficiency of TECAB surgery and reduce the conversion rate to more conventional procedures.

  2. Smart maintenance of riverbanks using a standard data layer and Augmented Reality

    NASA Astrophysics Data System (ADS)

    Pierdicca, Roberto; Frontoni, Emanuele; Zingaretti, Primo; Mancini, Adriano; Malinverni, Eva Savina; Tassetti, Anna Nora; Marcheggiani, Ernesto; Galli, Andrea

    2016-10-01

    Linear buffer strips (BS) along watercourses are commonly adopted to reduce run-off, accumulation of bank-top sediments and the leaking of pesticides into fresh-waters, which strongly increase water pollution. However, the monitoring of their conditions is a difficult task because they are scattered over wide rural areas. This work demonstrates the benefits of using a standard data layer and Augmented Reality (AR) in watershed control and outlines the guidelines of a novel approach for the health-check of linear BS. We designed a mobile environmental monitoring system for smart maintenance of riverbanks by embedding the AR technology within a Geographical Information System (GIS). From the technological point of view, the system's architecture consists of a cloud-based service for data sharing, using a standard data layer, and of a mobile device provided with a GPS-based AR engine for augmented data visualization. The proposed solution aims to ease the overall inspection process by reducing the time required to run a survey. Indeed, ordinary operational surveys are usually performed in the field with only classical digitized maps. Our application proposes to enrich inspections by superimposing information on the device screen from the same point of view as the camera, providing an intuitive visualization of buffer strip location. This way, the inspection officer can quickly and dynamically access relevant information overlaying geographic features, comments and other contents in real time. The solution has been tested in fieldwork to prove to what extent this cutting-edge technology contributes to effective monitoring over large territorial settings. The aim is to encourage officers, land managers and practitioners toward more effective monitoring and management practices.
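
    The core of a GPS-based AR engine of this kind is the geometry that decides where a georeferenced feature should appear on screen; a rough sketch of that calculation follows. The coordinates, heading and field of view are made up, and nothing here is taken from the authors' system.

```python
# Rough sketch: given the device's GPS fix and compass heading, compute the
# horizontal screen position at which a georeferenced buffer-strip waypoint
# should be drawn. All numbers below are fabricated.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def screen_x(heading_deg, target_bearing_deg, h_fov_deg, width_px):
    """Horizontal pixel position of the target, or None if outside the view."""
    offset = (target_bearing_deg - heading_deg + 540.0) % 360.0 - 180.0
    if abs(offset) > h_fov_deg / 2.0:
        return None
    return int(width_px * (0.5 + offset / h_fov_deg))

device_lat, device_lon, heading = 43.6158, 13.5189, 80.0   # hypothetical fix
strip_lat, strip_lon = 43.6162, 13.5230                    # hypothetical BS point

b = bearing_deg(device_lat, device_lon, strip_lat, strip_lon)
print("bearing:", round(b, 1), "deg; screen x:", screen_x(heading, b, 60.0, 1080))
```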

  3. Mobile augmented reality in support of building damage and safety assessment

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kerle, N.; Gerke, M.

    2016-02-01

    Rapid and accurate assessment of the state of buildings in the aftermath of a disaster event is critical for an effective and timely response. For rapid damage assessment of buildings, the utility of remote sensing (RS) technology has been widely researched, with focus on a range of platforms and sensors. However, RS-based approaches still have limitations to assess structural integrity and the specific damage status of individual buildings. Structural integrity refers to the ability of a building to hold the entire structure. Consequently, ground-based assessment conducted by structural engineers and first responders is still required. This paper demonstrates the concept of mobile augmented reality (mAR) to improve performance of building damage and safety assessment in situ. Mobile AR provides a means to superimpose various types of reference or pre-disaster information (virtual data) on actual post-disaster building data (real buildings). To adopt mobile AR, this study defines a conceptual framework based on the level of complexity (LOC). The framework consists of four LOCs, and for each of these, the data types, required processing steps, AR implementation and use for damage assessment are described. Based on this conceptualization we demonstrate prototypes of mAR for both indoor and outdoor purposes. Finally, we conduct a user evaluation of the prototypes to validate the mAR approach for building damage and safety assessment.

  4. Mobile Augmented Reality in support of building damage and safety assessment

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kerle, N.; Gerke, M.

    2015-04-01

    Rapid and accurate assessment of the state of buildings in the aftermath of a disaster event is critical for an effective and timely response. For rapid damage assessment of buildings, the utility of remote sensing (RS) technology has been widely researched, with focus on a range of platforms and sensors. However, RS-based approaches still have limitations in assessing structural integrity and the specific damage status of individual buildings. Consequently, ground-based assessment conducted by structural engineers and first responders is still required. This paper demonstrates the concept of mobile Augmented Reality (mAR) to improve performance of building damage and safety assessment in situ. Mobile AR provides a means to superimpose various types of reference or pre-disaster information (virtual data) on actual post-disaster building data (real buildings). To adopt mobile AR, this study defines a conceptual framework based on Level of Complexity (LOC). The framework consists of four LOCs, and for each of these, the data types, required processing steps, AR implementation and use for damage assessment are described. Based on this conceptualization we demonstrate prototypes of mAR for both indoor and outdoor purposes. Finally, we conduct a user evaluation of the prototypes to validate the mAR approach for building damage and safety assessment.

  5. A method for real-time generation of augmented reality work instructions via expert movements

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Bhaskar; Winer, Eliot

    2015-03-01

    Augmented Reality (AR) offers tremendous potential for a wide range of fields including entertainment, medicine, and engineering. AR allows digital models to be integrated with a real scene (typically viewed through a video camera) to provide useful information in a variety of contexts. The difficulty in authoring and modifying scenes is one of the biggest obstacles to widespread adoption of AR. 3D models must be created, textured, oriented and positioned to create the complex overlays viewed by a user. This often requires using multiple software packages in addition to performing model format conversions. In this paper, a new authoring tool is presented which uses a novel method to capture product assembly steps performed by a user with a depth+RGB camera. Through a combination of computer vision and image processing techniques, each individual step is decomposed into objects and actions. The objects are matched to those in a predetermined geometry library and the actions turned into animated assembly steps. The subsequent instruction set is then generated with minimal user input. A proof of concept is presented to establish the method's viability.
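
    The step in which decomposed objects are matched to a predetermined geometry library can be approximated as a nearest-neighbour search over shape descriptors; the toy sketch below shows that pattern. The part names, descriptors and threshold are invented, not taken from the paper.

```python
# Toy sketch: match a segmented object against a geometry library by nearest-
# neighbour search over fixed-length shape descriptors. In practice the
# descriptors would be computed from the depth+RGB segmentation of each step.
import numpy as np

geometry_library = {                      # hypothetical library entries
    "bolt_m6":      np.array([0.12, 0.80, 0.05, 0.33]),
    "bracket_left": np.array([0.55, 0.21, 0.64, 0.10]),
    "cover_plate":  np.array([0.90, 0.15, 0.22, 0.70]),
}

def match_object(descriptor, library, max_distance=0.5):
    """Return the closest library part, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, ref in library.items():
        d = float(np.linalg.norm(descriptor - ref))
        if d < best_dist:
            best_name, best_dist = name, d
    return (best_name, best_dist) if best_dist <= max_distance else (None, best_dist)

observed = np.array([0.52, 0.25, 0.60, 0.12])   # descriptor from one video step
print(match_object(observed, geometry_library))
```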

  6. Augmented reality: don't we all wish we lived in one?

    SciTech Connect

    Hayes, Birchard P; Michel, Kelly D; Few, Douglas A; Gertman, David; Le Blanc, Katya

    2010-01-01

    From stereophonic, positional sound to high-definition imagery that is crisp and clean, high-fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments, creating a mixed reality in which relevant data and information augment the real experience in real time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers nuclear facilities a new and innovative capability for design information verification inspections, with improved evaluation accuracy and information gathering. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.

  7. Augmented and virtual reality in surgery-the digital surgical environment: applications, limitations and legal pitfalls.

    PubMed

    Khor, Wee Sim; Baker, Benjamin; Amin, Kavit; Chan, Adrian; Patel, Ketan; Wong, Jason

    2016-12-01

    The continuing enhancement of the surgical environment in the digital age has led to a number of innovations being highlighted as potential disruptive technologies in the surgical workplace. Augmented reality (AR) and virtual reality (VR) are rapidly becoming increasingly available, accessible and, importantly, affordable, hence their application in healthcare to enhance the medical use of data is certain. Whether it relates to anatomy, intraoperative surgery, or post-operative rehabilitation, applications are already being investigated for their role in the surgeon's armamentarium. Here we provide an introduction to the technology and the potential areas of development in the surgical arena.

  8. Augmented and virtual reality in surgery—the digital surgical environment: applications, limitations and legal pitfalls

    PubMed Central

    Baker, Benjamin; Amin, Kavit; Chan, Adrian; Patel, Ketan; Wong, Jason

    2016-01-01

    The continuing enhancement of the surgical environment in the digital age has led to a number of innovations being highlighted as potential disruptive technologies in the surgical workplace. Augmented reality (AR) and virtual reality (VR) are rapidly becoming increasingly available, accessible and, importantly, affordable, hence their application in healthcare to enhance the medical use of data is certain. Whether it relates to anatomy, intraoperative surgery, or post-operative rehabilitation, applications are already being investigated for their role in the surgeon's armamentarium. Here we provide an introduction to the technology and the potential areas of development in the surgical arena. PMID:28090510

  9. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, of which virtual reality and robotics are among the most revolutionary. Our work aims at setting up new techniques for the detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems that make tumor resection or treatment easier through the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so that they can share the same 3D reconstructed patient and interact with that patient, virtually before the intervention and in reality during the surgical procedure, thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to better detection and representation of the patient before performing the surgical procedure. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check an optimal, error-free procedure on the virtual patient clone, which will then be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  10. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality system can be applied to provide precise guidance for various kinds of manual works. The adaptability and guiding accuracy of such systems are decided by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding designing scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. Meanwhile, the proposed system is realized with a digital projector and the general back projection model is derived with geometry relationship between digitized 3D model and the projector in free space. The corresponding calibration method is also designed for the proposed system to obtain the parameters of projector. To validate the proposed back projection model, the coordinate data collected by a 3D positioning equipment is used to calculate and optimize the extrinsic parameters. The final projecting indication accuracy of the system is verified with subpixel pattern projecting technique.
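
    In general form, the extrinsic calibration described above is a reprojection-error minimization over the projector pose; the sketch below shows the shape of such a fit using a plain pinhole model, synthetic correspondences and SciPy's least-squares solver. It is an assumption-laden stand-in, not the paper's back-projection model.

```python
# Sketch: refine a projector's extrinsic parameters (rotation vector +
# translation) by minimizing 2D reprojection error against 3D points measured
# by external positioning equipment. Pinhole model and synthetic data only.
import numpy as np
import cv2
from scipy.optimize import least_squares

K = np.array([[1400.0, 0.0, 960.0],     # assumed projector intrinsics
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])

# Synthetic 3D calibration points and the projector pixels that hit them.
rng = np.random.default_rng(1)
pts3d = rng.uniform(-0.5, 0.5, size=(20, 3)) + np.array([0.0, 0.0, 2.0])
rvec_true = np.array([0.05, -0.02, 0.01])
tvec_true = np.array([0.10, -0.05, 0.20])
pts2d, _ = cv2.projectPoints(pts3d, rvec_true, tvec_true, K, None)
pts2d = pts2d.reshape(-1, 2)

def residuals(x):
    """Reprojection residuals for pose x = [rvec, tvec]."""
    proj, _ = cv2.projectPoints(pts3d, x[:3], x[3:], K, None)
    return (proj.reshape(-1, 2) - pts2d).ravel()

fit = least_squares(residuals, np.zeros(6))     # start from the identity pose
print("recovered rvec/tvec:", np.round(fit.x, 3))
```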

  11. Sensor-Aware Recognition and Tracking for Wide-Area Augmented Reality on Mobile Phones

    PubMed Central

    Chen, Jing; Cao, Ruochen; Wang, Yongtian

    2015-01-01

    Wide-area registration in outdoor environments on mobile phones is a challenging task in mobile augmented reality fields. We present a sensor-aware large-scale outdoor augmented reality system for recognition and tracking on mobile phones. GPS and gravity information is used to improve the VLAD performance for recognition. A kind of sensor-aware VLAD algorithm, which is self-adaptive to different scale scenes, is utilized to recognize complex scenes. Considering vision-based registration algorithms are too fragile and tend to drift, data coming from inertial sensors and vision are fused together by an extended Kalman filter (EKF) to achieve considerable improvements in tracking stability and robustness. Experimental results show that our method greatly enhances the recognition rate and eliminates the tracking jitters. PMID:26690439
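
    The gyro/vision fusion mentioned above can be illustrated with a drastically simplified one-state Kalman filter: gyroscope rates drive the prediction and occasional vision-based absolute yaw estimates drive the correction. The noise values and rates below are assumed, and the paper's full EKF over the complete camera pose is far richer than this sketch.

```python
# Minimal one-state Kalman filter: predict camera yaw from gyroscope rate,
# correct it with occasional vision-based absolute yaw measurements.
import numpy as np

dt = 0.02                      # 50 Hz inertial updates (assumed)
q, r = 0.01, 4.0               # process / measurement noise variances (assumed)

def predict(yaw, p, gyro_rate):
    """Propagate the yaw estimate with the gyroscope rate (deg/s)."""
    return yaw + gyro_rate * dt, p + q

def update(yaw, p, vision_yaw):
    """Correct the estimate with a vision-based absolute yaw (deg)."""
    k = p / (p + r)                      # Kalman gain
    return yaw + k * (vision_yaw - yaw), (1.0 - k) * p

rng = np.random.default_rng(2)
true_yaw, yaw_est, p = 0.0, 0.0, 1.0
for step in range(500):
    rate = 10.0                          # constant turn of 10 deg/s
    true_yaw += rate * dt
    yaw_est, p = predict(yaw_est, p, rate + rng.normal(0.0, 0.5))
    if step % 25 == 0:                   # a vision fix every half second
        yaw_est, p = update(yaw_est, p, true_yaw + rng.normal(0.0, 2.0))

print("true yaw:", round(true_yaw, 1), "deg; estimated:", round(yaw_est, 1), "deg")
```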

  12. Augmented Reality in a Simulated Tower Environment: Effect of Field of View on Aircraft Detection

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Adelstein, Bernard D.; Reisman, Ronald J.; Schmidt-Ott, Joelle R.; Gips, Jonathan; Krozel, Jimmy; Cohen, Malcolm (Technical Monitor)

    2002-01-01

    An optical see-through, augmented reality display was used to study subjects' ability to detect aircraft maneuvering and landing at the Dallas Ft. Worth International airport in an ATC Tower simulation. Subjects monitored the traffic patterns as if from the airport's western control tower. Three binocular fields of view (14 deg, 28 deg and 47 deg) were studied in an independent-groups design to measure the degradation in detection performance associated with the visual field restrictions. In a second experiment the 14 deg and 28 deg fields were presented either with 46% binocular overlap or 100% overlap for separate groups. The near-asymptotic results of the first experiment suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve performance; those of the second experiment suggest that partial binocular overlap is feasible for augmented reality displays such as may be used for ATC tower applications.

  13. Sensor-Aware Recognition and Tracking for Wide-Area Augmented Reality on Mobile Phones.

    PubMed

    Chen, Jing; Cao, Ruochen; Wang, Yongtian

    2015-12-10

    Wide-area registration in outdoor environments on mobile phones is a challenging task in mobile augmented reality fields. We present a sensor-aware large-scale outdoor augmented reality system for recognition and tracking on mobile phones. GPS and gravity information is used to improve the VLAD performance for recognition. A kind of sensor-aware VLAD algorithm, which is self-adaptive to different scale scenes, is utilized to recognize complex scenes. Considering vision-based registration algorithms are too fragile and tend to drift, data coming from inertial sensors and vision are fused together by an extended Kalman filter (EKF) to achieve considerable improvements in tracking stability and robustness. Experimental results show that our method greatly enhances the recognition rate and eliminates the tracking jitters.

  14. Comparative evaluation of monocular augmented-reality display for surgical microscopes.

    PubMed

    Rodriguez Palma, Santiago; Becker, Brian C; Lobes, Louis A; Riviere, Cameron N

    2012-01-01

    Medical augmented reality has undergone much development recently. However, there is a lack of studies quantitatively comparing the different display options available. This paper compares the effects of different graphical overlay systems in a simple micromanipulation task with "soft" visual servoing. We compared positioning accuracy in a real-time visually-guided task using Micron, an active handheld tremor-canceling microsurgical instrument, using three different displays: 2D screen, 3D screen, and microscope with monocular image injection. Tested with novices and an experienced vitreoretinal surgeon, display of virtual cues in the microscope via an augmented reality injection system significantly decreased 3D error (p < 0.05) compared to the 2D and 3D monitors when confounding factors such as magnification level were normalized.

  15. Tracking and registration method based on vector operation for augmented reality system

    NASA Astrophysics Data System (ADS)

    Gao, Yanfei; Wang, Hengyou; Bian, Xiaoning

    2015-08-01

    Tracking and registration is one of the key issues for an augmented reality (AR) system. For a marker-based AR system, the research focuses on detecting the real-time position and orientation of the camera. In this paper, we describe a method of tracking and registration using vector operations. Our method proves to be stable and accurate and has good real-time performance.

  16. Building a Mobile Augmented Reality System for Embedded Training: Lessons Learned

    DTIC Science & Technology

    2004-12-01

    tracked see-through Head Mounted Display (HMD) (Sony Glasstron, Microvision Nomad, or Trivisio). Three-dimensional (3D) data about the environment is...graphics, 3D displays, tracking, vision, mobile augmented reality and wearable computers. Baillot is a member of the IEEE Computer Society. Simon...which he is looking. Figure 1 shows the BARS wearable system. Based on this data, the desired 3D data is rendered to appear as if it were in the real

  17. How to reduce workload--augmented reality to ease the work of air traffic controllers.

    PubMed

    Hofmann, Thomas; König, Christina; Bruder, Ralph; Bergner, Jörg

    2012-01-01

    In the future, air traffic will rise, and the workload of the controllers will rise with it. In the BMWi research project, one of the tasks is to determine how to ensure safe air traffic and a reasonable workload for air traffic controllers. The goal of this project was to find ways to reduce the workload (and stress) of controllers and thus allow safe air traffic, especially at large hub airports, by implementing augmented reality visualization and interaction.

  18. Does Augmented Reality Affect High School Students' Learning Outcomes in Chemistry?

    NASA Astrophysics Data System (ADS)

    Renner, Jonathan Christopher

    Some teens may prefer using a self-directed, constructivist, and technologic approach to learning rather than traditional classroom instruction. If it can be demonstrated, educators may adjust their teaching methodology. The guiding research question for this study focused on how augmented reality affects high school students' learning outcomes in chemistry, as measured by a pretest and posttest methodology when ensuring that the individual outcomes were not the result of group collaboration. This study employed a quantitative, quasi-experimental study design that used a comparison and experimental group. Inferential statistical analysis was employed. The study was conducted at a high school in southwest Colorado. Eighty-nine respondents returned completed and signed consent forms, and 78 participants completed the study. Results demonstrated that augmented reality instruction caused posttest scores to significantly increase, as compared to pretest scores, but it was not as effective as traditional classroom instruction. Scores did improve under both types of instruction; therefore, more research is needed in this area. The present study was the first quantitative experiment controlling for individual learning to validate augmented reality using mobile handheld digital devices that affected individual students' learning outcomes without group collaboration. This topic was important to the field of education as it may help educators understand how students learn and it may also change the way students are taught.

  19. 3D interactive augmented reality-enhanced digital learning systems for mobile devices

    NASA Astrophysics Data System (ADS)

    Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie

    2013-03-01

    With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology. Enhancement of UX can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems. However, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, the 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components, the markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX of digital learning can be greatly improved with the adoption of the proposed IARL systems.
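
    The velocity-based object tracking (VOT) component named above can be illustrated by the simplest constant-velocity predictor: extrapolate the next marker position from the last two observations and restrict the search to a window around that prediction. The positions and window size below are fabricated.

```python
# Toy constant-velocity tracker: predict the next 2D position of a tracked
# pattern and limit the matching search to a window around the prediction.
import numpy as np

def predict_next(track):
    """Constant-velocity prediction from the last two tracked positions."""
    if len(track) < 2:
        return np.asarray(track[-1], dtype=float)
    last = np.asarray(track[-1], dtype=float)
    prev = np.asarray(track[-2], dtype=float)
    return 2.0 * last - prev

track = [(100, 120), (108, 118), (117, 117)]   # fabricated pixel positions
pred = predict_next(track)
half_window = 20                                # search box half-size in pixels
print("predicted:", pred, "search box:", (pred - half_window, pred + half_window))
```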

  20. Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment

    PubMed Central

    López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.

    2013-01-01

    Purpose. This work presents the protocol carried out in the development and validation of an augmented reality system which was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that this system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used, the other one was the control experiment, and the system was not used. The type of operation selected for all cases was a cholecystectomy due to the low degree of complexity and complications before, during, and after the surgery. The technique used in the placement of trocars was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained of twenty-four patients (randomly assigned in each experiment) were involved in these experiments. The final results show an improvement in accuracy and variability of 33% and 63%, respectively, in comparison to traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293

  1. Augmented reality three-dimensional object visualization and recognition with axially distributed sensing.

    PubMed

    Markman, Adam; Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-01-15

    An augmented reality (AR) smartglass display combines real-world scenes with digital information enabling the rapid growth of AR-based applications. We present an augmented reality-based approach for three-dimensional (3D) optical visualization and object recognition using axially distributed sensing (ADS). For object recognition, the 3D scene is reconstructed, and feature extraction is performed by calculating the histogram of oriented gradients (HOG) of a sliding window. A support vector machine (SVM) is then used for classification. Once an object has been identified, the 3D reconstructed scene with the detected object is optically displayed in the smartglasses allowing the user to see the object, remove partial occlusions of the object, and provide critical information about the object such as 3D coordinates, which are not possible with conventional AR devices. To the best of our knowledge, this is the first report on combining axially distributed sensing with 3D object visualization and recognition for applications to augmented reality. The proposed approach can have benefits for many applications, including medical, military, transportation, and manufacturing.
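
    The recognition pipeline sketched in this abstract (HOG features over a sliding window, scored by an SVM) can be outlined as follows; scikit-image and scikit-learn are assumed, and the training patches are random noise purely to keep the example self-contained, so it will not detect anything real.

```python
# Sketch of HOG + linear-SVM sliding-window detection on a (here synthetic)
# reconstructed scene image.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(3)
win = 64                                          # window size in pixels

def hog_vec(patch):
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

# Stand-in training set: "object" vs "background" patches (random noise).
X = np.array([hog_vec(rng.random((win, win))) for _ in range(40)])
y = np.array([1] * 20 + [0] * 20)
clf = LinearSVC(C=1.0).fit(X, y)

# Slide a window over the scene and keep high-scoring candidates.
scene = rng.random((256, 256))
detections = []
for r in range(0, scene.shape[0] - win + 1, 16):
    for c in range(0, scene.shape[1] - win + 1, 16):
        score = clf.decision_function([hog_vec(scene[r:r + win, c:c + win])])[0]
        if score > 0.5:
            detections.append((r, c, float(score)))
print(len(detections), "candidate windows above threshold")
```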

  2. Microscopic augmented-reality indicators for long-term live cell time-lapsed imaging.

    PubMed

    Yun, Kyungwon; Chung, Jungman; Park, Yong; Lee, Byungjoo; Lee, Won Gu; Bang, Hyunwoo

    2013-06-07

    Microscopic observations of cultured cells in many lab-on-a-chip applications mostly utilize digital image acquisition using CCD sensors connected to a personal computer. The functionalities of this digital imaging can be enhanced by implementing computer-vision based augmented reality technologies. In this study, we present a new method for precisely relocating biological specimens under microscopic inspections by using augmented reality patterns, called microscopic augmented reality indicators (μ-ARIs). Since the method only requires sticky films attached under sample containers of any shape, long-term live cell observations can be conducted at much less extra cost than with conventional methods. On these sticky films, multiple arrays of position-indicating patterns were imprinted to provide a reference coordinate system for recording and relocating the accurate position and rotation of the specimen under inspection. This approach can be useful for obtaining the exact locations of individual cells inside biological samples using μ-ARI imprinted transparent films in a rapid and controlled manner.

  3. Augmented reality and cone beam CT guidance for transoral robotic surgery

    PubMed Central

    Richmon, Jeremy D.; Sorger, Jonathan M.; Azizian, Mahdi; Taylor, Russell H.

    2015-01-01

    In transoral robotic surgery preoperative image data do not reflect large deformations of the operative workspace from perioperative setup. To address this challenge, in this study we explore image guidance with cone beam computed tomographic angiography to guide the dissection of critical vascular landmarks and resection of base-of-tongue neoplasms with adequate margins for transoral robotic surgery. We identify critical vascular landmarks from perioperative c-arm imaging to augment the stereoscopic view of a da Vinci si robot in addition to incorporating visual feedback from relative tool positions. Experiments resecting base-of-tongue mock tumors were conducted on a series of ex vivo and in vivo animal models comparing the proposed workflow for video augmentation to standard non-augmented practice and alternative, fluoroscopy-based image guidance. Accurate identification of registered augmented critical anatomy during controlled arterial dissection and en bloc mock tumor resection was possible with the augmented reality system. The proposed image-guided robotic system also achieved improved resection ratios of mock tumor margins (1.00) when compared to control scenarios (0.0) and alternative methods of image guidance (0.58). The experimental results show the feasibility of the proposed workflow and advantages of cone beam computed tomography image guidance through video augmentation of the primary stereo endoscopy as compared to control and alternative navigation methods. PMID:26531203

  4. Augmented reality and cone beam CT guidance for transoral robotic surgery.

    PubMed

    Liu, Wen P; Richmon, Jeremy D; Sorger, Jonathan M; Azizian, Mahdi; Taylor, Russell H

    2015-09-01

    In transoral robotic surgery preoperative image data do not reflect large deformations of the operative workspace from perioperative setup. To address this challenge, in this study we explore image guidance with cone beam computed tomographic angiography to guide the dissection of critical vascular landmarks and resection of base-of-tongue neoplasms with adequate margins for transoral robotic surgery. We identify critical vascular landmarks from perioperative c-arm imaging to augment the stereoscopic view of a da Vinci si robot in addition to incorporating visual feedback from relative tool positions. Experiments resecting base-of-tongue mock tumors were conducted on a series of ex vivo and in vivo animal models comparing the proposed workflow for video augmentation to standard non-augmented practice and alternative, fluoroscopy-based image guidance. Accurate identification of registered augmented critical anatomy during controlled arterial dissection and en bloc mock tumor resection was possible with the augmented reality system. The proposed image-guided robotic system also achieved improved resection ratios of mock tumor margins (1.00) when compared to control scenarios (0.0) and alternative methods of image guidance (0.58). The experimental results show the feasibility of the proposed workflow and advantages of cone beam computed tomography image guidance through video augmentation of the primary stereo endoscopy as compared to control and alternative navigation methods.

  5. Augmented reality system for MR-guided interventions: phantom studies and first animal test

    NASA Astrophysics Data System (ADS)

    Vogt, Sebastian; Wacker, Frank; Khamene, Ali; Elgort, Daniel R.; Sielhorst, Tobias; Niemann, Heinrich; Duerk, Jeff; Lewin, Jonathan S.; Sauer, Frank

    2004-05-01

    We developed an augmented reality navigation system for MR-guided interventions. A head-mounted display provides in real-time a stereoscopic video-view of the patient, which is augmented with three-dimensional medical information to perform MR-guided needle placement procedures. Besides with the MR image information, we augment the scene with 3D graphics representing a forward extension of the needle and the needle itself. During insertion, the needle can be observed virtually at its actual location in real-time, supporting the interventional procedure in an efficient and intuitive way. In this paper we report on quantitative results of AR guided needle placement procedures on gel phantoms with embedded targets of 12mm and 6mm diameter; we furthermore evaluate our first animal experiment involving needle insertion into deep lying anatomical structures of a pig.

  6. Planning, simulation, and augmented reality for robotic cardiac procedures: The STARS system of the ChIR team.

    PubMed

    Coste-Manière, Eve; Adhami, Louaï; Mourgues, Fabien; Carpentier, Alain

    2003-04-01

    This paper presents STARS (Simulation and Transfer Architecture for Robotic Surgery), a versatile system that aims at enhancing minimally invasive robotic surgery through patient-dependent optimized planning, realistic simulation, safe supervision, and augmented reality. The underlying architecture of the proposed approach is presented, then each component is detailed. An experimental validation is conducted on a dog for a coronary bypass intervention using the Da Vinci(TM) surgical system, focusing on planning, registration, and augmented reality trials.

  7. Development of an Augmented Reality Rare Book and Manuscript for Special Library Collection (AR Rare-BM)

    NASA Astrophysics Data System (ADS)

    Parhizkar, Behrang; Badioze Zaman, Halimah

    This research studies the development of augmented reality versions of rare books and manuscripts from special collections in libraries. Augmented reality has the ability to enhance users' perception of and interaction with the real world. Libraries have to ensure that these special collections are well handled, as rare books and manuscripts are priceless and represent the heritage of each nation. Augmented reality can be used to model these valuable manuscripts and rare books as virtual objects so that the physical collection can be better preserved. Users are able to open the augmented rare book, flip its pages, and read the contents of the rare books and manuscripts using peripheral equipment such as an HMD or a marker. The developed AR Rare-BM is modeled as an augmented book that users can place on their palm or on a table and manipulate while reading. Users can also leave a bookmark in the AR Rare-BM after reading, so that they can return to their favourite sections at a later date.

  8. In Vivo versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial

    PubMed Central

    Botella, Cristina; Pérez-Ara, M. Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María

    2016-01-01

    Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. “One-session treatment” guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants’ expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants from in vivo exposure condition considered the treatment more useful for their problem whereas participants from Augmented Reality exposure considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and well accepted by the participants. PMID:26886423

  9. In Vivo versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial.

    PubMed

    Botella, Cristina; Pérez-Ara, M Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María

    2016-01-01

    Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. "One-session treatment" guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants' expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants from in vivo exposure condition considered the treatment more useful for their problem whereas participants from Augmented Reality exposure considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and well accepted by the participants.

  10. Real-time self-calibration of a tracked augmented reality display

    NASA Astrophysics Data System (ADS)

    Baum, Zachary; Lasso, Andras; Ungi, Tamas; Fichtinger, Gabor

    2016-03-01

    PURPOSE: Augmented reality systems have been proposed for image-guided needle interventions but they have not become widely used in clinical practice due to restrictions such as limited portability, low display refresh rates, and tedious calibration procedures. We propose a handheld tablet-based self-calibrating image overlay system. METHODS: A modular handheld augmented reality viewbox was constructed from a tablet computer and a semi-transparent mirror. A consistent and precise self-calibration method, without the use of any temporary markers, was designed to achieve an accurate calibration of the system. Markers attached to the viewbox and patient are simultaneously tracked using an optical pose tracker to report the position of the patient with respect to a displayed image plane that is visualized in real-time. The software was built using the open-source 3D Slicer application platform's SlicerIGT extension and the PLUS toolkit. RESULTS: The accuracy of the image overlay with image-guided needle interventions yielded a mean absolute position error of 0.99 mm (95th percentile 1.93 mm) in-plane of the overlay and a mean absolute position error of 0.61 mm (95th percentile 1.19 mm) out-of-plane. This accuracy is clinically acceptable for tool guidance during various procedures, such as musculoskeletal injections. CONCLUSION: A self-calibration method was developed and evaluated for a tracked augmented reality display. The results show potential for the use of handheld image overlays in clinical studies with image-guided needle interventions.

  11. Augmented reality for the assessment of children's spatial memory in real settings.

    PubMed

    Juan, M-Carmen; Mendez-Lopez, Magdalena; Perez-Hernandez, Elena; Albiol-Perez, Sergio

    2014-01-01

    Short-term memory can be defined as the capacity for holding a small amount of information in mind in an active state for a short period of time. Although some instruments have been developed to study spatial short-term memory in real environments, there are no instruments that are specifically designed to assess visuospatial short-term memory in an attractive way to children. In this paper, we present the ARSM (Augmented Reality Spatial Memory) task, the first Augmented Reality task that involves a user's movement to assess spatial short-term memory in healthy children. The experimental procedure of the ARSM task was designed to assess the children's skill to retain visuospatial information. They were individually asked to remember the real place where augmented reality objects were located. The children (N = 76) were divided into two groups: preschool (5-6 year olds) and primary school (7-8 year olds). We found a significant improvement in ARSM task performance in the older group. The correlations between scores for the ARSM task and traditional procedures were significant. These traditional procedures were the Dot Matrix subtest for the assessment of visuospatial short-term memory of the computerized AWMA-2 battery and a parent's questionnaire about a child's everyday spatial memory. Hence, we suggest that the ARSM task has high verisimilitude with spatial short-term memory skills in real life. In addition, we evaluated the ARSM task's usability and perceived satisfaction. The study revealed that the younger children were more satisfied with the ARSM task. This novel instrument could be useful in detecting visuospatial short-term difficulties that affect specific developmental navigational disorders and/or school academic achievement.

  12. Augmented Reality for the Assessment of Children's Spatial Memory in Real Settings

    PubMed Central

    Juan, M.-Carmen; Mendez-Lopez, Magdalena; Perez-Hernandez, Elena; Albiol-Perez, Sergio

    2014-01-01

    Short-term memory can be defined as the capacity for holding a small amount of information in mind in an active state for a short period of time. Although some instruments have been developed to study spatial short-term memory in real environments, there are no instruments that are specifically designed to assess visuospatial short-term memory in an attractive way to children. In this paper, we present the ARSM (Augmented Reality Spatial Memory) task, the first Augmented Reality task that involves a user's movement to assess spatial short-term memory in healthy children. The experimental procedure of the ARSM task was designed to assess the children's skill to retain visuospatial information. They were individually asked to remember the real place where augmented reality objects were located. The children (N = 76) were divided into two groups: preschool (5–6 year olds) and primary school (7–8 year olds). We found a significant improvement in ARSM task performance in the older group. The correlations between scores for the ARSM task and traditional procedures were significant. These traditional procedures were the Dot Matrix subtest for the assessment of visuospatial short-term memory of the computerized AWMA-2 battery and a parent's questionnaire about a child's everyday spatial memory. Hence, we suggest that the ARSM task has high verisimilitude with spatial short-term memory skills in real life. In addition, we evaluated the ARSM task's usability and perceived satisfaction. The study revealed that the younger children were more satisfied with the ARSM task. This novel instrument could be useful in detecting visuospatial short-term difficulties that affect specific developmental navigational disorders and/or school academic achievement. PMID:25438146

  13. A Novel Augmented Reality Navigation System for Endoscopic Sinus and Skull Base Surgery: A Feasibility Study

    PubMed Central

    Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei

    2016-01-01

    Objective To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. Materials and Methods In this study we performed a head phantom and cadaver experiment to determine the display effect and accuracy of our navigational system. We compared cadaver head-based simulated operations, the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system to conventional navigation systems. Results The navigation system developed in this study has a novel display mode capable of fusing endoscopic images to three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which met the accepted standards of a navigation system used for nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. Conclusion The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigational system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigational system has potential to increase the quality of surgeries. In addition, the augmented reality navigational system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon’s skills and knowledge, not as a substitute. PMID:26757365

  14. Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.

    PubMed

    Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson

    2016-04-04

    We proposed a switchable beam steering device with a cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts the incident circularly polarized light into the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading 2-3 CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we proposed two approaches to obtain the same diffraction angle for full-color displays based on red, green, and blue LEDs. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for emerging VR/AR displays.
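
    As a rough illustration of how cascading CDWs yields multiple steering angles, the Python sketch below applies the standard grating equation stage by stage; it is not taken from the paper, and the grating periods and wavelength are hypothetical example values.

    # Illustrative sketch (not from the paper): first-order deflection by a
    # polarization grating / cycloidal diffractive waveplate follows the grating
    # equation  sin(theta_out) = sin(theta_in) + m * lambda / period.
    import math

    def diffract(theta_in_deg, wavelength_nm, period_um, order=+1):
        """Return the output angle (degrees) after one grating stage."""
        s = math.sin(math.radians(theta_in_deg)) + order * (wavelength_nm * 1e-9) / (period_um * 1e-6)
        if abs(s) > 1.0:
            raise ValueError("evanescent: no propagating diffracted order")
        return math.degrees(math.asin(s))

    wavelength = 532.0              # nm, green
    periods = [5.0, 2.5]            # um, two cascaded CDW stages (hypothetical values)

    theta = 0.0                     # normal incidence
    for p in periods:
        theta = diffract(theta, wavelength, p)
        print(f"after stage with period {p} um: {theta:.2f} deg")

    # Switching each stage between the +1 and -1 orders (via the polarization
    # switch) yields multiple discrete steering directions from a few stages.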

  15. Low cost augmented reality for training of MRI-guided needle biopsy of the spine.

    PubMed

    George, Sandeep; Kesavadas, Thenkurussi

    2008-01-01

    In needle biopsy of the spine, an Augmented Reality (AR) image guidance system can be very effective in ensuring that, while targeting the lesion with the biopsy needle, vital organs near the spine are not damaged and the approach path is accurate. This procedure requires skill that is hard to master on patients. In this paper, we present a low-cost AR-based training set-up consisting of software that uses a static single-camera tracking mechanism to locate the biopsy needle in the patient and then augments the camera feed of the patient with virtual data, providing real-time guidance to the surgeon for insertion of the biopsy needle. The setup is implemented using a phantom model consisting of a set of carefully modeled holes to simulate the needle insertion task. Because it does not require elaborate infrared tracking systems or high computing power, this system is very effective for educational and training purposes.
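
    A minimal sketch of the kind of single-camera pose estimation such a training setup could use, assuming fiducial points on the needle handle and a standard perspective-n-point solve with OpenCV; the marker geometry, detected pixel coordinates, and camera intrinsics below are placeholders, not values from the paper.

    # Hedged sketch: estimate the needle pose from one camera with cv2.solvePnP.
    import numpy as np
    import cv2

    # Known 3D positions (mm) of fiducial points on the needle handle, in the needle frame (hypothetical).
    object_points = np.array([
        [0.0,  0.0, 0.0],
        [30.0, 0.0, 0.0],
        [0.0, 30.0, 0.0],
        [30.0, 30.0, 0.0],
    ], dtype=np.float64)

    # Corresponding pixel coordinates detected in the current camera frame (placeholder values).
    image_points = np.array([
        [320.0, 240.0],
        [380.0, 242.0],
        [322.0, 300.0],
        [381.0, 303.0],
    ], dtype=np.float64)

    # Pinhole camera intrinsics from a one-time calibration (placeholder values).
    camera_matrix = np.array([[800.0, 0.0, 320.0],
                              [0.0, 800.0, 240.0],
                              [0.0,   0.0,   1.0]])
    dist_coeffs = np.zeros(5)  # assume negligible lens distortion for the sketch

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
    if ok:
        R, _ = cv2.Rodrigues(rvec)   # rotation of the needle frame in camera coordinates
        print("needle position in camera frame (mm):", tvec.ravel())
        # R and tvec are what the training software would use to draw the virtual
        # guidance overlay at the needle's current pose.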

  16. GyroWand: An Approach to IMU-Based Raycasting for Augmented Reality.

    PubMed

    Hincapié-Ramos, Juan David; Özacar, Kasim; Irani, Pourang P; Kitamura, Yoshifumi

    2016-01-01

    Optical see-through head-mounted displays enable augmented reality (AR) applications that display virtual objects overlaid on the real world. At the core of this new generation of devices are low-cost tracking technologies that allow us to interpret users' motion in the real world in relation to the virtual content for the purposes of navigation and interaction. The advantages of pervasive tracking come at the cost of limiting interaction possibilities, however. To address these challenges the authors introduce GyroWand, a raycasting technique for AR HMDs using inertial measurement unit (IMU) rotational data from a handheld controller.
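
    The core of IMU-based raycasting is rotating a fixed forward vector by the controller's orientation and intersecting the resulting ray with selectable content. The following Python sketch illustrates that idea only; it is not the GyroWand implementation (which additionally handles drift and disambiguation states), and the quaternion and target positions are hypothetical.

    # Minimal sketch of IMU-based raycasting (not the authors' code).
    import numpy as np

    def quat_rotate(q, v):
        """Rotate vector v by unit quaternion q = (w, x, y, z)."""
        w, x, y, z = q
        u = np.array([x, y, z])
        return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

    def cast_ray(origin, q_imu, targets, radius=0.1):
        """Return the index of the closest target sphere hit by the ray, or None."""
        direction = quat_rotate(q_imu, np.array([0.0, 0.0, -1.0]))  # forward in controller frame
        direction /= np.linalg.norm(direction)
        best, best_t = None, np.inf
        for i, c in enumerate(targets):
            oc = c - origin
            t = oc @ direction                   # distance along the ray to the closest approach
            if t > 0 and np.linalg.norm(oc - t * direction) < radius and t < best_t:
                best, best_t = i, t
        return best

    # Hypothetical IMU orientation (identity = pointing straight ahead) and selectable targets.
    q = np.array([1.0, 0.0, 0.0, 0.0])
    targets = [np.array([0.0, 0.0, -2.0]), np.array([0.5, 0.0, -2.0])]
    print("selected target:", cast_ray(np.zeros(3), q, targets))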

  17. A Spatial Augmented Reality rehab system for post-stroke hand rehabilitation.

    PubMed

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; Cramer, Steven C; Lopes, Cristina Videira

    2013-01-01

    This paper features a Spatial Augmented Reality system for rehabilitation of hand and arm movement. The table-top home-based system tracks a subject's hand and creates a virtual audio-visual interface for performing rehabilitation-related tasks that involve wrist, elbow, and shoulder movements. It measures range, speed, and smoothness of movements locally and can send the real-time photos and data to the clinic for further assessment. To evaluate the system, it was tested on two normal subjects and proved functional.
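
    To make the kinematic measures mentioned above concrete, the sketch below computes movement range, mean speed, and a simple jerk-based smoothness score from a sampled hand trajectory. It is only an illustration under assumed units and sampling rate, not the authors' code, and the trajectory is synthetic.

    # Illustrative sketch of range, speed, and smoothness measures from tracked hand positions.
    import numpy as np

    def kinematic_summary(positions, dt):
        """positions: (N, 2) array of tracked hand coordinates (m); dt: sample period (s)."""
        velocity = np.gradient(positions, dt, axis=0)
        speed = np.linalg.norm(velocity, axis=1)
        accel = np.gradient(velocity, dt, axis=0)
        jerk = np.gradient(accel, dt, axis=0)

        duration = dt * (len(positions) - 1)
        path_length = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
        # Simplified dimensionless squared-jerk measure: lower values mean smoother movement.
        mean_sq_jerk = np.mean(np.sum(jerk**2, axis=1))
        smoothness = mean_sq_jerk * duration**4 / max(path_length, 1e-9)**2

        return {
            "range_x": np.ptp(positions[:, 0]),
            "range_y": np.ptp(positions[:, 1]),
            "mean_speed": speed.mean(),
            "jerk_metric": smoothness,
        }

    # Hypothetical trajectory: a slightly noisy reach from left to right, sampled at 60 Hz.
    t = np.linspace(0.0, 1.0, 60)
    traj = np.column_stack([0.3 * t + 0.002 * np.random.randn(60), 0.05 * np.sin(np.pi * t)])
    print(kinematic_summary(traj, dt=1.0 / 60.0))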

  18. Acceptable distortion and magnification of images on reflective surfaces in an augmented reality system

    NASA Astrophysics Data System (ADS)

    Yamamoto, Shoji; Hosokawa, Natsumi; Yokoya, Mayu; Tsumura, Norimichi

    2016-12-01

    In this paper, we investigated the consistency of visual perception for the change of reflection images in an augmented reality setting. Reflection images with distortion and magnification were generated by changing the capture position of the environment map. Observers evaluated the distortion and magnification in reflection images where the reflected objects were arranged symmetrically or asymmetrically. Our results confirmed that the observers' visual perception was more sensitive to changes in distortion than in magnification in the reflection images. Moreover, the asymmetrical arrangement of reflected objects effectively expands the acceptable range of distortion compared with the symmetrical arrangement.

  19. AUVA - Augmented Reality Empowers Visual Analytics to explore Medical Curriculum Data.

    PubMed

    Nifakos, Sokratis; Vaitsis, Christos; Zary, Nabil

    2015-01-01

    Medical curriculum data play a key role in the structure and organization of medical programs in universities around the world. Effective processing and use of these data may improve the educational environment for medical students and, as a consequence, give the next generation of health professionals better skills than previous ones. This study introduces the process of enhancing curriculum data through augmented reality technology used as a management and presentation tool. The final goal is to enrich the information presented by a visual analytics approach applied to medical curriculum data while keeping these data easy to understand.

  20. The Effects of Augmented Reality-based Otago Exercise on Balance, Gait, and Falls Efficacy of Elderly Women.

    PubMed

    Yoo, Ha-Na; Chung, Eunjung; Lee, Byoung-Hee

    2013-07-01

    [Purpose] The purpose of this study was to determine the effects of augmented reality-based Otago exercise on balance, gait, and falls efficacy of elderly women. [Subjects] The subjects were 21 elderly women, who were randomly divided into two groups: an augmented reality-based Otago exercise group of 10 subjects and an Otago exercise group of 11 subjects. [Methods] All subjects were evaluated for balance (Berg Balance Scale, BBS), gait parameters (velocity, cadence, step length, and stride length), and falls efficacy. Within 12 weeks, Otago exercise for muscle strengthening and balance training was conducted three times, for a period of 60 minutes each, and subjects in the experimental group performed augmented reality-based Otago exercise. [Results] Following intervention, the augmented reality-based Otago exercise group showed significant increases in BBS, velocity, cadence, step length (right side), stride length (right side and left side) and falls efficacy. [Conclusion] The results of this study suggest the feasibility and suitability of this augmented reality-based Otago exercise for elderly women.

  1. Using augmented reality as a clinical support tool to assist combat medics in the treatment of tension pneumothoraces.

    PubMed

    Wilson, Kenneth L; Doswell, Jayfus T; Fashola, Olatokunbo S; Debeatham, Wayne; Darko, Nii; Walker, Travelyan M; Danner, Omar K; Matthews, Leslie R; Weaver, William L

    2013-09-01

    This study aimed to extrapolate potential roles of augmented reality goggles as a clinical support tool assisting in the reduction of preventable causes of death on the battlefield. Our pilot study was designed to improve medic performance in accurately placing a large-bore catheter to release a tension pneumothorax (prehospital setting) while using augmented reality goggles. Thirty-four preclinical medical students recruited from Morehouse School of Medicine performed needle decompressions on human cadaver models after hearing a brief training lecture on tension pneumothorax management. Clinical vignettes identifying cadavers as having life-threatening tension pneumothoraces as a consequence of improvised explosive device attacks were used. The study group (n = 13) performed needle decompression using augmented reality goggles, whereas the control group (n = 21) relied solely on memory from the lecture. The two groups were compared according to their ability to accurately complete the steps required to decompress a tension pneumothorax. The medical students using augmented reality goggle support were able to treat the tension pneumothorax on the human cadaver models more accurately than the students relying on their memory (p < 0.008). Although the augmented reality group required more time to complete the needle decompression intervention (p = 0.0684), this difference did not reach statistical significance.

  2. Measuring the Usability of Augmented Reality e-Learning Systems: A User-Centered Evaluation Approach

    NASA Astrophysics Data System (ADS)

    Pribeanu, Costin; Balog, Alexandru; Iordache, Dragoş Daniel

    The development of Augmented Reality (AR) systems is creating new challenges and opportunities for the designers of e-learning systems. The mix of real and virtual requires appropriate interaction techniques that have to be evaluated with users in order to avoid usability problems. Formative usability evaluation aims at finding usability problems as early as possible in the development life cycle and is well suited to support the development of such novel interactive systems. This work presents an approach to the user-centered usability evaluation of an e-learning scenario for Biology developed on an Augmented Reality educational platform. The evaluation was carried out during and after a summer school held within the ARiSE research project. The basic idea was to perform the usability evaluation twice. In this respect, we conducted user testing with a small number of students during the summer school in order to get fast feedback from users with good knowledge of Biology. Then, we repeated the user testing under different conditions and with a larger number of representative users. In this paper we describe both experiments and compare the usability evaluation results.

  3. Optical augmented reality assisted navigation system for neurosurgery teaching and planning

    NASA Astrophysics Data System (ADS)

    Wu, Hui-Qun; Geng, Xing-Yun; Wang, Li; Zhang, Yuan-Peng; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng

    2013-07-01

    This paper proposes a convenient navigation system for neurosurgeons' pre-operative planning and teaching based on an augmented reality (AR) technique, which maps three-dimensional reconstructed virtual anatomical structures onto a skull model. The system comprises two parts: a virtual reality system and a skull model scene. In our experiment, a 73-year-old right-handed man initially diagnosed with astrocytoma was selected as an example to verify the system. His imaging data from different modalities were registered, and the skull soft tissue, brain, internal vessels and tumor were reconstructed. The reconstructed models were then overlaid on the real scene. Our findings showed that the reconstructed tissues were augmented into the real scene and that the registration results were in good alignment. The reconstructed brain tissue was well distributed in the skull cavity. A probe was used by a neurosurgeon to explore a surgical pathway that could be directed into the tumor without injuring important vessels. In this way, the cost of training students and of educating patients about surgical risks is reduced. Therefore, this system could serve as an alternative protocol for image-guided surgery (IGS), and it is promising for neurosurgeons' pre-operative planning and teaching.

  4. Endoscopic feature tracking for augmented-reality assisted prosthesis selection in mitral valve repair

    NASA Astrophysics Data System (ADS)

    Engelhardt, Sandy; Kolb, Silvio; De Simone, Raffaele; Karck, Matthias; Meinzer, Hans-Peter; Wolf, Ivo

    2016-03-01

    Mitral valve annuloplasty describes a surgical procedure where an artificial prosthesis is sutured onto the anatomical structure of the mitral annulus to re-establish the valve's functionality. Choosing an appropriate commercially available ring size and shape is a difficult decision the surgeon has to make intraoperatively according to experience. In our augmented-reality framework, digitized ring models are superimposed onto endoscopic image streams without using any additional hardware. To place the ring model at the proper position within the endoscopic image plane, a pose estimation is performed that depends on the localization of sutures placed by the surgeon around the leaflet origins and punctured through the stiffer structure of the annulus. In this work, the tissue penetration points are tracked by the real-time-capable Lucas-Kanade optical flow algorithm. The accuracy and robustness of this tracking algorithm are investigated with respect to whether outliers influence the subsequent pose estimation. Our results suggest that optical flow is very stable for a variety of endoscopic scenes and that tracking errors do not affect the position of the superimposed virtual objects in the scene, making this approach a viable candidate for augmented reality-enhanced decision support in annuloplasty.
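
    A minimal sketch of the pyramidal Lucas-Kanade step used to follow such tissue penetration points between frames, using OpenCV's calcOpticalFlowPyrLK; the video source, parameter values, and initial suture coordinates are placeholders rather than the authors' settings.

    # Hedged sketch: track suture points between consecutive endoscopic frames.
    import numpy as np
    import cv2

    lk_params = dict(winSize=(21, 21), maxLevel=3,
                     criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

    def track_sutures(prev_gray, next_gray, prev_pts):
        """Track (N, 1, 2) float32 points from the previous frame to the next one."""
        next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, prev_pts, None, **lk_params)
        good = status.ravel() == 1
        return next_pts, good

    cap = cv2.VideoCapture("endoscope.mp4")           # placeholder video source
    ok1, frame1 = cap.read()
    ok2, frame2 = cap.read()
    if ok1 and ok2:
        g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
        g2 = cv2.cvtColor(frame2, cv2.COLOR_BGR2GRAY)
        # Initial suture locations would come from the surgeon's annotations; placeholders here.
        sutures = np.array([[[320.0, 200.0]], [[360.0, 220.0]], [[400.0, 210.0]]], dtype=np.float32)
        tracked, good = track_sutures(g1, g2, sutures)
        # Only successfully tracked points would be passed on to the ring pose estimation.
        print("tracked points:", tracked[good].reshape(-1, 2))
    cap.release()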

  5. Hybrid diffractive-refractive optical system design of head-mounted display for augmented reality

    NASA Astrophysics Data System (ADS)

    Zhang, Huijuan

    2005-02-01

    An optical see-through head-mounted display for augmented reality is designed in this paper. Considering factors such as optical performance, the energy utilization ratios of the real and virtual worlds, and the comfort of the user wearing the device, an optical see-through structure is adopted. Because of its particular negative dispersion and its ability to realize arbitrary phase modulation, a diffractive surface helps reduce the weight and simplify the structure of the optical system, so a diffractive surface is introduced into our design. The optical system has a 25 mm eye relief, a 12 mm exit pupil, and a 20° (H) x 15.4° (V) field of view. The energy utilization ratios of the real and virtual worlds are 1/4 and 1/2, respectively. The angular resolution of the display is 0.27 mrad, which is finer than the resolution limit of the human eye. The diameter of the system is less than 46 mm, and the design is binocular. This diffractive-refractive optical system for a see-through head-mounted display not only satisfies the structural and ergonomic demands of the user, but also provides high resolution with very small chromatic aberration and distortion, satisfying the needs of augmented reality. Finally, the parameters of the diffractive surface are discussed.

  6. Augmented reality warnings in vehicles: Effects of modality and specificity on effectiveness.

    PubMed

    Schwarz, Felix; Fastenmeier, Wolfgang

    2017-04-01

    In the future, vehicles will be able to warn drivers of hidden dangers before they are visible. Specific warning information about these hazards could improve drivers' reactions and the warning effectiveness, but could also impair them, for example, by additional cognitive-processing costs. In a driving simulator study with 88 participants, we investigated the effects of modality (auditory vs. visual) and specificity (low vs. high) on warning effectiveness. For the specific warnings, we used augmented reality as an advanced technology to display the additional auditory or visual warning information. Part one of the study concentrates on the effectiveness of necessary warnings and part two on the drivers' compliance despite false alarms. For the first warning scenario, we found several positive main effects of specificity. However, subsequent effects of specificity were moderated by the modality of the warnings. The specific visual warnings were observed to have advantages over the three other warning designs concerning gaze and braking reaction times, passing speeds and collision rates. Besides the true alarms, braking reaction times as well as subjective evaluation after these warnings were still improved despite false alarms. The specific auditory warnings were revealed to have only a few advantages, but also several disadvantages. The results further indicate that the exact coding of additional information, beyond its mere amount and modality, plays an important role. Moreover, the observed advantages of the specific visual warnings highlight the potential benefit of augmented reality coding to improve future collision warnings.

  7. A study of students' motivation using the augmented reality science textbook

    NASA Astrophysics Data System (ADS)

    Gopalan, Valarmathie; Zulkifli, Abdul Nasir; Bakar, Juliana Aida Abu

    2016-08-01

    Science plays a major role in assisting Malaysia to achieve developed-nation status by 2020. However, over the past few decades, Malaysia has been facing a downward trend in the number of students pursuing careers and higher education in science-related fields. Since school is the first platform where students learn science, a new learning approach needs to be introduced to motivate them towards science learning. The aim of this study is to determine whether the intervention of an enhanced science textbook using augmented reality contributes to the learning process of lower secondary school students in science. The study was carried out among a sample of 70 lower secondary school students. Pearson correlation and regression analyses were used to determine the effects of ease of use, engaging, enjoyment and fun on students' motivation in using the augmented reality science textbook for science learning. The results provide empirical support for a positive and statistically significant relationship between engaging, enjoyment and fun and students' motivation for science learning. However, ease of use was positively but not significantly correlated with motivation.

  8. Real-time markerless tracking for augmented reality: the virtual visual servoing framework.

    PubMed

    Comport, Andrew I; Marchand, Eric; Pressigout, Muriel; Chaumette, François

    2006-01-01

    Tracking is a very important research subject in a real-time augmented reality context. The main requirements for trackers are high accuracy and low latency at a reasonable cost. In order to address these issues, a real-time, robust, and efficient 3D model-based tracking algorithm is proposed for a "video see through" monocular vision system. The tracking of objects in the scene amounts to calculating the pose between the camera and the objects. Virtual objects can then be projected into the scene using the pose. Here, nonlinear pose estimation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different 3D geometrical primitives including straight lines, circles, cylinders, and spheres. A local moving-edges tracker is used to provide real-time tracking of points normal to the object contours. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively reweighted least squares implementation. This approach is then extended to address the 3D model-free augmented reality problem. The method presented in this paper has been validated on several complex image sequences including outdoor environments. Results show the method to be robust to occlusion, changes in illumination, and mistracking.
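
    The robustness mechanism named above, an M-estimator solved by iteratively reweighted least squares, can be illustrated on a toy problem. The sketch below fits a line to data with outliers using Huber weights; it stands in for, but is much simpler than, the pose estimation in the paper, where the unknowns would be the 6-DoF camera velocity and the system rows would come from the interaction matrices.

    # Toy sketch of an M-estimator solved by iteratively reweighted least squares (IRLS).
    import numpy as np

    def huber_weights(residuals, k=1.345):
        """Huber M-estimator weights: 1 inside the threshold, k/|r| outside."""
        r = np.abs(residuals)
        w = np.ones_like(r)
        mask = r > k
        w[mask] = k / r[mask]
        return w

    def irls_fit(A, b, iterations=10):
        """Solve A x ~= b robustly by iteratively reweighted least squares."""
        x = np.linalg.lstsq(A, b, rcond=None)[0]                        # ordinary least-squares start
        for _ in range(iterations):
            r = b - A @ x
            scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # robust scale (MAD)
            w = huber_weights(r / scale)
            sw = np.sqrt(w)
            x = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
        return x

    # Synthetic data: y = 2x + 1 with a few gross outliers injected.
    rng = np.random.default_rng(0)
    xs = np.linspace(0, 10, 50)
    ys = 2 * xs + 1 + 0.1 * rng.standard_normal(50)
    ys[::10] += 15.0
    A = np.column_stack([xs, np.ones_like(xs)])
    print("robust fit (slope, intercept):", irls_fit(A, ys))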

  9. Using augmented reality in AIRBUS A400M shop floor assembly work instructions

    NASA Astrophysics Data System (ADS)

    Serván, J.; Mas, F.; Menéndez, J. L.; Ríos, J.

    2012-04-01

    The assembly of components in the aerospace industry is currently supported through procedures based on the generation of work instructions. This documentation describes both the sequence of operations to be performed by operators, and fundamental and critical parameters of operation (drawings of components, torques to be applied, sealing system characteristics or paste, etc.). Currently, workers use this information to ensure that the tasks are performed correctly. However, sometimes the documentation is difficult to manage, either by the difficulty of interpreting the drawings or because the process is too complex, for example in the installation of electrical harnesses. This document shows the results of the Project MOON (asseMbly Oriented authOring augmeNted reality) developed by AIRBUS Military in 2010. MOON uses 3D information from the iDMU (industrial Digital Mock-Up) to generate assembly instructions by applying Augmented Reality technology. A demonstrator was developed for the electrical harness routing in the frame 36 of the AIRBUS A400M. The techniques and methods here described are 'patent pending'.

  10. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments

    NASA Astrophysics Data System (ADS)

    Portalés, Cristina; Lerma, José Luis; Navarro, Santiago

    2010-01-01

    Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In very recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interactions and real-life navigations. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated in real (physical) urban worlds. The augmented environment that is presented herein requires for visualization a see-through video head mounted display (HMD), whereas user's movement navigation is achieved in the real world with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper will deal with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There are, however, some software and complex issues, which are discussed in the paper.

  11. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large datasets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.

  12. Toward long-term and accurate augmented-reality for monocular endoscopic videos.

    PubMed

    Puerto-Souza, Gustavo A; Cadeddu, Jeffrey A; Mariottini, Gian-Luca

    2014-10-01

    By overlaying preoperative radiological 3-D models onto the intraoperative laparoscopic video, augmented-reality (AR) displays promise to increase surgeons' visual awareness of high-risk surgical targets (e.g., the location of a tumor). Existing AR surgical systems lack robustness and accuracy because of the many challenges in endoscopic imagery, such as frequent changes in illumination, rapid camera motions, prolonged organ occlusions, and tissue deformations. The frequent occurrence of these events can cause the loss of image (anchor) points, and thus, the loss of the AR display after a few frames. In this paper, we present the design of a new AR system that represents a first step toward a long-term and accurate augmented surgical display for monocular (calibrated and uncalibrated) endoscopic videos. Our system uses correspondence-search methods, and a new weighted sliding-window registration approach, to automatically and accurately recover the overlay by predicting the image locations of a high number of anchor points that were lost after a sudden image change. The effectiveness of the proposed system in maintaining a long-term (over 2 min) and accurate (less than 1 mm) augmentation has been documented over a set of real partial-nephrectomy laparoscopic videos.
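
    One simple way to recover lost anchor points after a sudden image change is to match features against a stored keyframe, estimate a planar homography, and re-project the anchors. The sketch below illustrates only that general idea with OpenCV (a homography is exact only for near-planar, rigid scenes); the paper's own correspondence search and weighted sliding-window registration are more sophisticated, and the file names and anchor coordinates here are placeholders.

    # Hedged sketch: recover anchor points via feature matching and a homography.
    import numpy as np
    import cv2

    keyframe = cv2.imread("keyframe.png", cv2.IMREAD_GRAYSCALE)   # frame where anchors were defined
    current = cv2.imread("current.png", cv2.IMREAD_GRAYSCALE)     # frame after the disruption
    if keyframe is None or current is None:
        raise SystemExit("placeholder images not found")
    anchors = np.array([[[100.0, 120.0]], [[250.0, 140.0]], [[180.0, 300.0]]], dtype=np.float32)

    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(keyframe, None)
    kp2, des2 = orb.detectAndCompute(current, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:300]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects the remaining bad correspondences.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    if H is not None:
        recovered = cv2.perspectiveTransform(anchors, H)
        print("recovered anchor locations:\n", recovered.reshape(-1, 2))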

  13. Possible applications of the LEAP motion controller for more interactive simulated experiments in augmented or virtual reality

    NASA Astrophysics Data System (ADS)

    Wozniak, Peter; Vauderwange, Oliver; Mandal, Avikarsha; Javahiraly, Nicolas; Curticapean, Dan

    2016-09-01

    Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware. To carry out, for example, lens experiments, the students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques in order to let the students prepare with a simulated experimental setup. Within the context of our intended blended learning concept we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility to allow interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that allows for the tracking of the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through hand position, movement and gestures. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and want to point out useful and less useful application scenarios.

  14. Injected Water Augments Cooling In Turboshaft Engine

    NASA Technical Reports Server (NTRS)

    Biesiadny, Thomas J.; Berger, Brett; Klann, Gary A.; Clark, David A.

    1989-01-01

    Report describes experiments in which water injected into compressor-bleed cooling air of aircraft turboshaft engine. Injection of water previously suggested as way to provide additional cooling needed to sustain operation at power levels higher than usual. Involves turbine-inlet temperatures high enough to shorten lives of first-stage high-pressure turbine blades. Latent heat of vaporization of injected water serves as additional heat sink to maintain blades at design operating temperatures during high-power operation.

  15. Effect of augmented visual feedback from a virtual reality simulation system on manual dexterity training.

    PubMed

    Wierinck, E; Puttemans, V; Swinnen, S; van Steenberghe, D

    2005-02-01

    Little research has been published about the impact of simulation technology on the learning process of novel motor skills. Especially the role of augmented feedback (FB) on the quality of performance, and the transfer of the acquired behaviour to a condition without augmented FB, require further investigation. Therefore, novice dental students were randomly assigned to one of three groups and given the task of drilling a geometrical class 1 cavity. The FB group trained under augmented visual FB conditions, provided by the virtual reality (VR) system (DentSim). The no-FB group practised under normal vision conditions, in the absence of augmented FB. A control group performed the test sessions without participating in any training programme. All preparations were evaluated by the VR grading system according to four traditional criteria (outline shape, floor depth, floor smoothness and wall inclination) and two critical criteria (pulp exposure and damage to adjacent teeth). Performance analyses revealed an overall trend towards significant improvement with training for the experimental groups. The FB group obtained the highest scores. It scored better for floor depth (P < 0.001), whilst the no-FB group was best for floor smoothness (P < 0.005). However, at the retention tests, the FB group demonstrated inferior performance in comparison with the no-FB group. The transfer test on a traditional unit revealed no significant differences between the training groups. Consequently, drilling experience on a VR system with frequently provided FB and without any tutorial input was considered not to be beneficial to learning. The present data are discussed in view of the guidance hypothesis of FB, which refers to the apprentice's dependence on FB.

  16. Thrust augmentation nozzle (TAN) concept for rocket engine booster applications

    NASA Astrophysics Data System (ADS)

    Forde, Scott; Bulman, Mel; Neill, Todd

    2006-07-01

    Aerojet used the patented thrust augmented nozzle (TAN) concept to validate a unique means of increasing sea-level thrust in a liquid rocket booster engine. We have used knowledge gained from hypersonic Scramjet research to inject propellants into the supersonic region of the rocket engine nozzle to significantly increase sea-level thrust without significantly impacting specific impulse. The TAN concept overcomes conventional engine limitations by injecting propellants and combusting in an annular region in the divergent section of the nozzle. This injection of propellants at moderate pressures allows for obtaining high thrust at takeoff without overexpansion thrust losses. The main chamber is operated at a constant pressure while maintaining a constant head rise and flow rate of the main propellant pumps. Recent hot-fire tests have validated the design approach and thrust augmentation ratios. Calculations of nozzle performance and wall pressures were made using computational fluid dynamics analyses with and without thrust augmentation flow, resulting in good agreement between calculated and measured quantities including augmentation thrust. This paper describes the TAN concept, the test setup, test results, and calculation results.

  17. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note.

    PubMed

    Abe, Yuichiro; Sato, Shigenobu; Kato, Koji; Hyakumachi, Takahiko; Yanagibashi, Yasushi; Ito, Manabu; Abumi, Kuniyoshi

    2013-10-01

    Augmented reality (AR) is an imaging technology by which virtual objects are overlaid onto images of real objects captured in real time by a tracking camera. This study aimed to introduce a novel AR guidance system called virtual protractor with augmented reality (VIPAR) to visualize a needle trajectory in 3D space during percutaneous vertebroplasty (PVP). The AR system used for this study comprised a head-mount display (HMD) with a tracking camera and a marker sheet. An augmented scene was created by overlaying the preoperatively generated needle trajectory path onto a marker detected on the patient using AR software, thereby providing the surgeon with augmented views in real time through the HMD. The accuracy of the system was evaluated by using a computer-generated simulation model in a spine phantom and also evaluated clinically in 5 patients. In the 40 spine phantom trials, the error of the insertion angle (EIA), defined as the difference between the attempted angle and the insertion angle, was evaluated using 3D CT scanning. Computed tomography analysis of the 40 spine phantom trials showed that the EIA in the axial plane significantly improved when VIPAR was used compared with when it was not used (0.96° ± 0.61° vs 4.34° ± 2.36°, respectively). The same held true for EIA in the sagittal plane (0.61° ± 0.70° vs 2.55° ± 1.93°, respectively). In the clinical evaluation of the AR system, 5 patients with osteoporotic vertebral fractures underwent VIPAR-guided PVP from October 2011 to May 2012. The postoperative EIA was evaluated using CT. The clinical results of the 5 patients showed that the EIA in all 10 needle insertions was 2.09° ± 1.3° in the axial plane and 1.98° ± 1.8° in the sagittal plane. There was no pedicle breach or leakage of polymethylmethacrylate. VIPAR was successfully used to assist in needle insertion during PVP by providing the surgeon with an ideal insertion point and needle trajectory through the HMD. The findings indicate

  18. Obstacle Marking and Vehicle Guidance Science and Technology Objective (OMVG-STO) Augmented Reality for Enhanced Command and Control and Mobility

    DTIC Science & Technology

    2006-01-01

  19. Studienlandschaft Schwingbachtal: an out-door full-scale learning tool newly equipped with augmented reality

    NASA Astrophysics Data System (ADS)

    Aubert, A. H.; Schnepel, O.; Kraft, P.; Houska, T.; Plesca, I.; Orlowski, N.; Breuer, L.

    2015-11-01

    This paper addresses education and communication in hydrology and geosciences. Many approaches can be used, such as well-known seminars, modelling exercises and practical field work, but outdoor learning in our discipline is a must; this paper focuses on the recent development of a new outdoor learning tool at the landscape scale. To facilitate improved teaching and hands-on experience, we designed the Studienlandschaft Schwingbachtal. Already equipped with field instrumentation, education trails, and geocaches, it has now been extended with an augmented reality App that adds virtual teaching objects to the real landscape. The App development is detailed so that it can serve as a methodology for those wishing to implement such a tool. The resulting application, namely the Schwingbachtal App, is described as an example. We conclude that such an App is useful for communication and education purposes, making learning pleasant and offering personalized options.

  20. New augmented reality and robotic based methods for head-surgery.

    PubMed

    Wörn, H; Aschke, M; Kahrs, L A

    2005-09-01

    Within the framework of the collaborative research centre "Information Technology in Medicine--Computer and Sensor-Aided Surgery" (SFB414), new methods for intraoperative computer assistance of surgical procedures are being developed. The developed tools will be controlled by an intraoperative host which provides interfaces to the electronic health record (EHR) and to intraoperative computer-assisted instruments. The interaction is based on standardised communication protocols. Plug & work functions will allow easy integration and configuration of new components. Intraoperative systems currently under development include intraoperative augmented reality (AR) delivered via a projector and via a microscope, a planning system for the definition of complex trajectories, and a surgical robot system. The developed systems are under clinical evaluation and are showing promising results in their application.

  1. Augmented Reality to Preserve Hidden Vestiges in Historical Cities. a Case Study

    NASA Astrophysics Data System (ADS)

    Martínez, J. L.; Álvareza, S.; Finat, J.; Delgado, F. J.; Finat, J.

    2015-02-01

    Mobile devices provide increasingly sophisticated support for enhanced experiences and for understanding the remote past in an interactive way. The use of augmented reality technologies makes it possible to develop mobile applications for indoor exploration of virtually reconstructed archaeological places. In our work we have built a virtual reconstruction of a Roman Villa using data from an urgent partial excavation that was carried out before a car park was built in the historical city of Valladolid (Spain). In its current state, the archaeological site is covered by an urban garden. Localization and tracking are performed using a combination of GPS and the inertial sensors of the mobile device. In this work we show how to perform interactive navigation around the 3D virtual model, presenting an interpretation of the way the site once was. The user experience is enhanced by answering simple questions and performing minor tasks and puzzles, which are presented with multimedia content linked to key features of the archaeological site.

  2. A Real-Time Augmented Reality System to See-Through Cars.

    PubMed

    Rameau, Francois; Ha, Hyowon; Joo, Kyungdon; Choi, Jinsoo; Park, Kibaek; Kweon, In So

    2016-11-01

    One of the most hazardous driving scenarios is overtaking a slower vehicle: the front vehicle (being overtaken) can occlude an important part of the field of view of the rear vehicle's driver. This lack of visibility is the most probable cause of accidents in this context. Recent research suggests that augmented reality applied to assisted driving can significantly reduce the risk of accidents. In this paper, we present a real-time marker-less system to see through cars. For this purpose, two cars are equipped with cameras and an appropriate wireless communication system. The stereo vision system mounted on the front car is used to create a sparse 3D map of the environment in which the rear car can be localized. Using this inter-car pose estimation, a synthetic image is generated to overcome the occlusion and to create a seamless see-through effect that preserves the structure of the scene.

  3. An augmented reality system in lymphatico-venous anastomosis surgery†

    PubMed Central

    Nishimoto, Soh; Tonooka, Maki; Fujita, Kazutoshi; Sotsuka, Yohei; Fujiwara, Toshihiro; Kawai, Kenichiro; Kakibuchi, Masao

    2016-01-01

    Indocyanine green lymphography, displayed as an infrared image, is very useful for identifying lymphatic vessels during surgery. Surgeons refer to the infrared image on displays as they proceed with the operation. Those displays are usually placed on the walls or beside the operating tables, so the surgeons cannot watch the infrared image and the operative field simultaneously; they have to move their heads and lines of sight. An augmented reality system was developed that overlays the infrared image on the real operative field view so that both can be referred to simultaneously. A surgeon wore a see-through eyeglass-type display during lymphatico-venous anastomosis surgery. The infrared image was transferred wirelessly to the display. The surgeon was able to recognize the fluorescently shining lymphatic vessels projected on the glasses and dissect them out. PMID:27154749

  4. A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.

    PubMed

    Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis

    2017-03-01

    Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
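
    Computing the Sun's direction from the user's location and the time of day is the geometric core of such daylight estimation. The Python sketch below uses a simple low-accuracy solar-position approximation (ignoring, for example, the equation of time) and is not the authors' implementation; the test coordinates and date are hypothetical.

    # Rough sketch: Sun elevation and azimuth from latitude, longitude and UTC time.
    import math
    from datetime import datetime, timezone

    def sun_position(lat_deg, lon_deg, when_utc):
        """Return (elevation, azimuth) in degrees; azimuth measured clockwise from north."""
        day = when_utc.timetuple().tm_yday
        frac_hour = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0

        # Solar declination (degrees), simple sinusoidal approximation.
        decl = 23.44 * math.sin(math.radians(360.0 / 365.0 * (284 + day)))
        # Approximate solar time from longitude (ignores the equation of time).
        solar_time = frac_hour + lon_deg / 15.0
        hour_angle = 15.0 * (solar_time - 12.0)

        lat, dec, ha = map(math.radians, (lat_deg, decl, hour_angle))
        sin_elev = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
        elev = math.asin(sin_elev)

        cos_az = (math.sin(dec) - math.sin(elev) * math.sin(lat)) / (math.cos(elev) * math.cos(lat))
        az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
        if hour_angle > 0:                      # afternoon: the Sun is west of the meridian
            az = 360.0 - az
        return math.degrees(elev), az

    # Example: midsummer noon (UTC) at hypothetical test coordinates near Porto, Portugal.
    print(sun_position(41.15, -8.61, datetime(2017, 6, 21, 12, 0, tzinfo=timezone.utc)))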

  5. Design and Implementation of a GPS Guidance System for Agricultural Tractors Using Augmented Reality Technology

    PubMed Central

    Santana-Fernández, Javier; Gómez-Gil, Jaime; del-Pozo-San-Cirilo, Laura

    2010-01-01

    Current commercial tractor guidance systems present information to the driver so that agricultural tasks can be performed in the best way. This information generally includes a treated-zones map referenced to the tractor’s position. Unlike current guidance systems, where the tractor driver must mentally associate treated-zone maps with the plot layout, this paper presents a guidance system that, using Augmented Reality (AR) technology, allows the tractor driver to see the real plot through eye-monitor glasses with the treated zones shown in a different color. The paper includes a description of the system hardware and software, a real test with image captures as seen by the tractor driver, and a discussion predicting that the evolution of guidance systems could involve the use of AR technology in agricultural guidance and monitoring systems. PMID:22163479

  6. Jedi training: playful evaluation of head-mounted augmented reality display systems

    NASA Astrophysics Data System (ADS)

    Ozbek, Christopher S.; Giesler, Bjorn; Dillmann, Ruediger

    2004-05-01

    A fundamental decision in building augmented reality (AR) systems is how to accomplish the combining of the real and virtual worlds. Nowadays this key question boils down to two alternatives: video see-through (VST) vs. optical see-through (OST). Both systems have advantages and disadvantages in areas such as production simplicity, resolution, flexibility in composition strategies, field of view, etc. To provide additional decision criteria for high-dexterity, high-accuracy tasks and for subjective user acceptance, a gaming environment inspired by the Star Wars movies was programmed that allowed a good evaluation of hand-eye coordination. During an experimental session with more than thirty participants, a preference for optical see-through glasses in conjunction with infrared tracking was found. In particular, the high computational demand of video capture and processing, and the resulting drop in frame rate, emerged as a key weakness of the VST system.

  7. A Protein in the palm of your hand through augmented reality.

    PubMed

    Berry, Colin; Board, Jason

    2014-01-01

    Understanding of proteins and other biological macromolecules must be based on an appreciation of their 3-dimensional shape and the fine details of their structure. Conveying these details in a clear and stimulating fashion can present challenges using conventional approaches and 2-dimensional monitors and projectors. Here we describe a method for the production of 3-D interactive images of protein structures that can be manipulated in real time through the use of augmented reality software. Users first see a real-time image of themselves using the computer's camera, then, when they hold up a trigger image, a model of a molecule appears automatically in the video. This model rotates and translates in space in response to movements of the trigger card. The system described has been optimized to allow customization for the display of user-selected structures to create engaging, educational visualizations to explore 3-D structures.

  8. Designing a mobile augmented reality tool for the locative visualisation of biomedical knowledge.

    PubMed

    Kilby, Jess; Gray, Kathleen; Elliott, Kristine; Waycott, Jenny; Sanchez, Fernando Martin; Dave, Bharat

    2013-01-01

    Mobile augmented reality (MAR) may offer new and engaging ways to support consumer participation in health. We report on design-based research into a MAR application for smartphones and tablets, intended to improve public engagement with biomedical research in a specific urban precinct. Following a review of technical capabilities and organizational and locative design considerations, we worked with staff of four research institutes to elicit their ideas about information and interaction functionalities of a shared MAR app. The results were promising, supporting the development of a prototype and initial field testing with these staff. Evidence from this project may point the way toward user-centred design of MAR services that will enable more widespread adoption of the technology in other healthcare and biomedical research contexts.

  9. Design and implementation of a GPS guidance system for agricultural tractors using augmented reality technology.

    PubMed

    Santana-Fernández, Javier; Gómez-Gil, Jaime; del-Pozo-San-Cirilo, Laura

    2010-01-01

    Current commercial tractor guidance systems present information to the driver so that agricultural tasks can be performed in the best way. This information generally includes a treated-zones map referenced to the tractor's position. Unlike current guidance systems, where the tractor driver must mentally associate treated-zone maps with the plot layout, this paper presents a guidance system that, using Augmented Reality (AR) technology, allows the tractor driver to see the real plot through eye-monitor glasses with the treated zones shown in a different color. The paper includes a description of the system hardware and software, a real test with image captures as seen by the tractor driver, and a discussion predicting that the evolution of guidance systems could involve the use of AR technology in agricultural guidance and monitoring systems.

  10. New education system for construction of optical holography setup - Tangible learning with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Takeshi; Yoshikawa, Hiroshi

    2013-02-01

    When teaching optical system construction, it is difficult to provide the optical components for every attending student. However, tangible, hands-on learning is very important for mastering optical system construction, and an inexpensive learning system that provides optical experiment experiences helps learners understand easily. Therefore, we propose a new education system for the construction of optical setups using augmented reality. Using augmented reality, the proposed system can simulate optical system construction through direct hand control. The system only requires an inexpensive web camera, printed markers and a personal computer. Since it does not require a darkroom or expensive optical equipment, learners can study anytime and anywhere they want. In this paper, we developed a system that can teach the construction of optical setups for the Denisyuk hologram and the 2-step transmission-type hologram. For tangible learning and easy understanding, the proposed system displays CG objects of the optical components on markers that are controlled by the learner's hands. The system displays not only the CG objects but also the light beam as modified by the optical components. Because the light beam, which is hard to see directly, is displayed, learners can confirm what is happening through their own manipulation. For the construction of the optical holography setup, we arrange a laser, mirrors, a PBS (polarizing beam splitter), lenses, a polarizer, half-wave plates, spatial filters, an optical power meter and a recording plate. After construction, the proposed system can check whether the optical setup is correct. In comparison with learners who only read a book, learners who use the system can construct the optical holography setup more quickly and correctly.
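
    A hedged sketch of the marker-based display idea described above: detecting a printed fiducial with a webcam and estimating its pose so a CG optical component can be rendered on it. It assumes OpenCV's ArUco module with the pre-4.7 function-style API (newer versions use cv2.aruco.ArucoDetector), and the camera intrinsics and marker size are placeholders; the paper's own marker system may differ.

    # Hedged sketch: detect a printed marker and estimate its pose for CG overlay.
    import numpy as np
    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    camera_matrix = np.array([[900.0, 0.0, 640.0],
                              [0.0, 900.0, 360.0],
                              [0.0,   0.0,   1.0]])
    dist_coeffs = np.zeros(5)
    marker_side_m = 0.04          # printed marker is 4 cm wide (hypothetical)

    cap = cv2.VideoCapture(0)     # inexpensive webcam, as in the paper
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
        if ids is not None:
            rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
                corners, marker_side_m, camera_matrix, dist_coeffs)
            for marker_id, rvec, tvec in zip(ids.ravel(), rvecs, tvecs):
                # rvec/tvec give the marker pose in camera coordinates; the renderer
                # would place the corresponding CG lens or mirror at this pose.
                print(f"marker {marker_id}: position (m) = {tvec.ravel()}")
    cap.release()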

  11. Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load.

    PubMed

    Küçük, Sevda; Kapakin, Samet; Göktaş, Yüksel

    2016-10-01

    Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the environment using mobile devices. The purpose of this study was to determine the effects of learning anatomy via mAR on medical students' academic achievement and cognitive load. A mixed method was applied in the study. The random sample consisted of 70 second-year undergraduate medical students: 34 in an experimental group and 36 in a control group. An academic achievement test and a cognitive load scale were used as data collection tools. A one-way MANOVA test was used for analysis. The experimental group, which used mAR applications, reported higher achievement and lower cognitive load. The use of mAR applications in anatomy education contributed to the formation of an effective and productive learning environment. Student cognitive load decreased as abstract information in printed books became concrete via multimedia materials in mAR applications. Additionally, students were able to access the materials in the MagicBook anytime and anywhere they wanted. The mobile learning approach helped students learn better by exerting less cognitive effort. Moreover, the sensory experience and real-time interaction with the environment may provide learning satisfaction and enable students to structure their knowledge to complete the learning tasks. Anat Sci Educ 9: 411-421. © 2016 American Association of Anatomists.

  12. Surgical Navigation Technology Based on Augmented Reality and Integrated 3D Intraoperative Imaging

    PubMed Central

    Elmi-Terander, Adrian; Skulason, Halldor; Söderman, Michael; Racadio, John; Homan, Robert; Babic, Drazenko; van der Vaart, Nijs; Nachabe, Rami

    2016-01-01

    Study Design. A cadaveric laboratory study. Objective. The aim of this study was to assess the feasibility and accuracy of thoracic pedicle screw placement using augmented reality surgical navigation (ARSN). Summary of Background Data. Recent advances in spinal navigation have shown improved accuracy in lumbosacral pedicle screw placement but limited benefits in the thoracic spine. 3D intraoperative imaging and instrument navigation may allow improved accuracy in pedicle screw placement, without the use of x-ray fluoroscopy, and thus open the route to image-guided minimally invasive therapy in the thoracic spine. Methods. ARSN encompasses a surgical table, a motorized flat detector C-arm with intraoperative 2D/3D capabilities, integrated optical cameras for augmented reality navigation, and noninvasive patient motion tracking. Two neurosurgeons placed 94 pedicle screws in the thoracic spine of four cadavers using ARSN on one side of the spine (47 screws) and the free-hand technique on the contralateral side. X-ray fluoroscopy was not used for either technique. Four independent reviewers assessed the postoperative scans, using the Gertzbein grading. Morphometric measurements of the pedicles' axial and sagittal widths and angles, as well as the vertebrae's axial and sagittal rotations, were performed to identify risk factors for breaches. Results. ARSN was feasible and superior to the free-hand technique with respect to overall accuracy (85% vs. 64%, P < 0.05), with significant increases in perfectly placed screws (51% vs. 30%, P < 0.05) and reductions in breaches beyond 4 mm (2% vs. 25%, P < 0.05). All morphometric dimensions, except for vertebral body axial rotation, were risk factors for larger breaches when placement was performed with the free-hand method. Conclusion. ARSN without fluoroscopy was feasible and demonstrated higher accuracy than the free-hand technique for thoracic pedicle screw placement. Level of Evidence: N/A PMID:27513166

  13. Weather Observers: A Manipulative Augmented Reality System for Weather Simulations at Home, in the Classroom, and at a Museum

    ERIC Educational Resources Information Center

    Hsiao, Hsien-Sheng; Chang, Cheng-Sian; Lin, Chien-Yu; Wang, Yau-Zng

    2016-01-01

    This study focused on how to enhance the interactivity and usefulness of augmented reality (AR) by integrating manipulative interactive tools with a real-world environment. A manipulative AR (MAR) system, which included 3D interactive models and manipulative aids, was designed and developed to teach the unit "Understanding Weather" in a…

  14. Delivering Educational Multimedia Contents through an Augmented Reality Application: A Case Study on Its Impact on Knowledge Acquisition and Retention

    ERIC Educational Resources Information Center

    Perez-Lopez, David; Contero, Manuel

    2013-01-01

    This paper presents a study to analyze the use of augmented reality (AR) for delivering multimedia content to support the teaching and learning process of the digestive and circulatory systems at the primary school level, and its impact on knowledge retention. Our AR application combines oral explanations and 3D models and animations of anatomical…

  15. Effects of an Augmented Reality-Based Educational Game on Students' Learning Achievements and Attitudes in Real-World Observations

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Wu, Po-Han; Chen, Chi-Chang; Tu, Nien-Ting

    2016-01-01

    Augmented reality (AR) has been recognized as a potential technology to help students link what they are observing in the real world to their prior knowledge. One of the most challenging issues of AR-based learning is the provision of an effective strategy to help students focus on what they need to observe in the field. In this study, a competitive…

  16. Development of an IOS App Using Situated Learning, Communities of Practice, and Augmented Reality for Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Clarkson, Jessica

    2014-01-01

    This paper presents the development process and framework used to construct a transportation app that uses situated learning, augmented reality, and communities of practice. Autism spectrum disorder (ASD) is a neurodevelopmental disorder that can cause social impairments as well as limit the individual's potential to achieve independence…

  17. An Augmented Reality-Based Mobile Learning System to Improve Students' Learning Achievements and Motivations in Natural Science Inquiry Activities

    ERIC Educational Resources Information Center

    Chiang, Tosti H. C.; Yang, Stephen J. H.; Hwang, Gwo-Jen

    2014-01-01

    In this study, an augmented reality-based mobile learning system is proposed for conducting inquiry-based learning activities. An experiment has been conducted to examine the effectiveness of the proposed approach in terms of learning achievements and motivations. The subjects were 57 fourth graders from two classes taught by the same teacher in…

  18. Augmented Reality as a Navigation Tool to Employment Opportunities for Postsecondary Education Students with Intellectual Disabilities and Autism

    ERIC Educational Resources Information Center

    McMahon, Don; Cihak, David F.; Wright, Rachel

    2015-01-01

    The purpose of this study was to examine the effects of location-based augmented reality navigation compared to Google Maps and paper maps as navigation aids for students with disabilities. The participants in this single subject study were three college students with intellectual disability and one college student with autism spectrum disorder.…

  19. The Use of Augmented Reality-Enhanced Reading Books for Vocabulary Acquisition with Students Who Are Diagnosed with Special Needs

    ERIC Educational Resources Information Center

    Fecich, Samantha J.

    2014-01-01

    During this collective case study, I explored the use of augmented reality books on an iPad 2 with students diagnosed with disabilities. Students in this study attended a high school life skills class in a rural school district during the fall 2013 semester. Four students participated in this study, two males and two females. Specifically, the…

  20. An Investigation of University Students' Collaborative Inquiry Learning Behaviors in an Augmented Reality Simulation and a Traditional Simulation

    ERIC Educational Resources Information Center

    Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung

    2014-01-01

    The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty…

  1. Pilot Study Using the Augmented Reality Sandbox to Teach Topographic Maps and Surficial Processes in Introductory Geology Labs

    ERIC Educational Resources Information Center

    Woods, Terri L.; Reed, Sarah; Hsi, Sherry; Woods, John A.; Woods, Michael R.

    2016-01-01

    Spatial thinking is often challenging for introductory geology students. A pilot study using the Augmented Reality sandbox (AR sandbox) suggests it can be a powerful tool for bridging the gap between two-dimensional (2D) representations and real landscapes, as well as enhancing the spatial thinking and modeling abilities of students. The AR…

  2. Employing Augmented-Reality-Embedded Instruction to Disperse the Imparities of Individual Differences in Earth Science Learning

    ERIC Educational Resources Information Center

    Chen, Cheng-ping; Wang, Chang-Hwa

    2015-01-01

    Studies have proven that merging hands-on and online learning can result in an enhanced experience in learning science. In contrast to traditional online learning, multiple in-classroom activities may be involved in an augmented-reality (AR)-embedded e-learning process and thus could reduce the effects of individual differences. Using a…

  3. Applying Augmented Reality to a Mobile-Assisted Learning System for Martial Arts Using Kinect Motion Capture

    ERIC Educational Resources Information Center

    Hsu, Wen-Chun; Shih, Ju-Ling

    2016-01-01

    In this study, learning the routine of Tantui, a branch of martial arts, was taken as the object of research. Fitts' stages of motor learning and augmented reality (AR) were applied to a 3D mobile-assisted learning system for martial arts, which was characterized by free viewing angles. With the new system, learners could rotate the viewing angle of…

  4. Augmented reality visualization in head and neck surgery: an overview of recent findings in sentinel node biopsy and future perspectives.

    PubMed

    Profeta, Andrea Corrado; Schilling, Clare; McGurk, Mark

    2016-07-01

    "Augmented reality visualisation", in which the site of an operation is merged with computer-generated graphics, provides a way to view the relevant part of the patient's body in better detail. We describe its role in relation to sentinel lymph node biopsy (SLNB), current advancements, and future directions in the excision of tumours in early-stage cancers of the head and neck.

  5. Using Tele-Coaching to Increase Behavior-Specific Praise Delivered by Secondary Teachers in an Augmented Reality Learning Environment

    ERIC Educational Resources Information Center

    Elford, Martha Denton

    2013-01-01

    This study analyzes the effects of real-time feedback on teacher behavior in an augmented reality simulation environment. Real-time feedback prompts teachers to deliver behavior-specific praise to students in the TeachLivE KU Lab as an evidence-based practice known to decrease disruptive behavior in inclusive classrooms. All educators face the…

  6. Developing an Interactive Augmented Reality System as a Complement to Plant Education and Comparing Its Effectiveness with Video Learning

    ERIC Educational Resources Information Center

    Chang, Rong-Chi; Chung, Liang-Yi; Huang, Yong-Ming

    2016-01-01

    The learning of plants has garnered considerable attention in recent years, but students often lack the motivation to learn about the process of plant growth. Also, students are not able to apply what they have learned in class in the form of observation, since plant growth takes a long time. In this study, we use augmented reality (AR) technology…

  7. Exploring the Impact of Varying Levels of Augmented Reality to Teach Probability and Sampling with a Mobile Device

    ERIC Educational Resources Information Center

    Conley, Quincy

    2013-01-01

    Statistics is taught at every level of education, yet teachers often have to assume their students have no knowledge of statistics and start from scratch each time they set out to teach statistics. The motivation for this experimental study comes from interest in exploring educational applications of augmented reality (AR) delivered via mobile…

  8. A Comparison Study of Augmented Reality versus Interactive Simulation Technology to Support Student Learning of a Socio-Scientific Issue

    ERIC Educational Resources Information Center

    Chang, Hsin-Yi; Hsu, Ying-Shao; Wu, Hsin-Kai

    2016-01-01

    We investigated the impact of an augmented reality (AR) versus interactive simulation (IS) activity incorporated in a computer learning environment to facilitate students' learning of a socio-scientific issue (SSI) on nuclear power plants and radiation pollution. We employed a quasi-experimental research design. Two classes (a total of 45…

  9. A Mixed Methods Assessment of Students' Flow Experiences during a Mobile Augmented Reality Science Game

    ERIC Educational Resources Information Center

    Bressler, D. M.; Bodzin, A. M.

    2013-01-01

    Current studies have reported that secondary students are highly engaged while playing mobile augmented reality (AR) learning games. Some researchers have posited that players' engagement may indicate a flow experience, but no research results have confirmed this hypothesis with vision-based AR learning games. This study investigated factors…

  10. An augmented reality system validation for the treatment of cockroach phobia.

    PubMed

    Bretón-López, Juani; Quero, Soledad; Botella, Cristina; García-Palacios, Azucena; Baños, Rosa Maria; Alcañiz, Mariano

    2010-12-01

    Augmented reality (AR) is a new technology in which various virtual elements are incorporated into the user's perception of the real world. The most significant aspect of AR is that the virtual elements add relevant and helpful information to the real scene. AR shares some important characteristics with virtual reality as applied in clinical psychology. However, AR offers additional features that might be crucial for treating certain problems. An AR system designed to treat insect phobia has been used for treating phobia of small animals, and positive preliminary data about the global efficacy of the system have been obtained. However, it is necessary to determine the capacity of similar AR systems and their elements that are designed to evoke anxiety in participants; this is achieved by testing the correspondence between the inclusion of feared stimuli and the induction of anxiety. The objective of the present work is to validate whether the stimuli included in the AR-Insect Phobia system are capable of inducing anxiety in six participants diagnosed with cockroach phobia. Results support the adequacy of each element of the system in inducing anxiety in all participants.

  11. Working alliance inventory applied to virtual and augmented reality (WAI-VAR): psychometrics and therapeutic outcomes

    PubMed Central

    Miragall, Marta; Baños, Rosa M.; Cebolla, Ausiàs; Botella, Cristina

    2015-01-01

    This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adaptation to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, M age = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman’s correlations and multiple regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency and adequate convergent validity. “Not changed” patients scored lower on the WAI-VAR than “improved” and “recovered” patients. The correlation between the WAI-VAR and CSC was moderate. The best fitting model for predicting CSC was a linear combination of the TA with the therapist (WAI-S) and the TA with VR and AR (WAI-VAR), because the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in therapy. PMID:26500589
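    A minimal sketch of the kind of linear model described above, predicting a change score from the therapist alliance (WAI-S) and the technology alliance (WAI-VAR). The data, coefficient values, and use of scikit-learn are illustrative assumptions, not the study's dataset or analysis code.

```python
# Illustrative sketch only: a linear combination of therapist alliance (WAI-S)
# and technology alliance (WAI-VAR) scores predicting a change score, in the
# spirit of the regression model described above. Data are random placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
wai_s = rng.uniform(3, 7, size=75)     # alliance with the therapist (placeholder)
wai_var = rng.uniform(3, 7, size=75)   # alliance with the VR/AR system (placeholder)
change = 0.6 * wai_s + 0.2 * wai_var + rng.normal(0, 0.5, size=75)  # synthetic outcome

X = np.column_stack([wai_s, wai_var])
model = LinearRegression().fit(X, change)
print("coefficients:", model.coef_, "R^2:", round(model.score(X, change), 3))
```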

  12. Evaluation of wearable haptic systems for the fingers in Augmented Reality applications.

    PubMed

    Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico

    2017-04-05

    Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" and the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real piece of chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real piece of cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable device significantly improved the performance of all the considered tasks. Moreover, subjects significantly preferred conditions providing wearable haptic feedback.

  13. Working alliance inventory applied to virtual and augmented reality (WAI-VAR): psychometrics and therapeutic outcomes.

    PubMed

    Miragall, Marta; Baños, Rosa M; Cebolla, Ausiàs; Botella, Cristina

    2015-01-01

    This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adaptation to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, M age = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman's correlations and multiple regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency and adequate convergent validity. "Not changed" patients scored lower on the WAI-VAR than "improved" and "recovered" patients. The correlation between the WAI-VAR and CSC was moderate. The best fitting model for predicting CSC was a linear combination of the TA with the therapist (WAI-S) and the TA with VR and AR (WAI-VAR), because the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in therapy.

  14. Testing and evaluation of a wearable augmented reality system for natural outdoor environments

    NASA Astrophysics Data System (ADS)

    Roberts, David; Menozzi, Alberico; Cook, James; Sherrill, Todd; Snarski, Stephen; Russler, Pat; Clipp, Brian; Karl, Robert; Wenger, Eric; Bennett, Matthew; Mauger, Jennifer; Church, William; Towles, Herman; MacCabe, Stephen; Webb, Jeffrey; Lupo, Jasper; Frahm, Jan-Michael; Dunn, Enrique; Leslie, Christopher; Welch, Greg

    2013-05-01

    This paper describes performance evaluation of a wearable augmented reality system for natural outdoor environments. Applied Research Associates (ARA), as prime integrator on the DARPA ULTRA-Vis (Urban Leader Tactical, Response, Awareness, and Visualization) program, is developing a soldier-worn system to provide intuitive 'heads-up' visualization of tactically-relevant geo-registered icons. Our system combines a novel pose estimation capability, a helmet-mounted see-through display, and a wearable processing unit to accurately overlay geo-registered iconography (e.g., navigation waypoints, sensor points of interest, blue forces, aircraft) on the soldier's view of reality. We achieve accurate pose estimation through fusion of inertial, magnetic, GPS, terrain data, and computer-vision inputs. We leverage a helmet-mounted camera and custom computer vision algorithms to provide terrain-based measurements of absolute orientation (i.e., orientation of the helmet with respect to the earth). These orientation measurements, which leverage mountainous terrain horizon geometry and mission planning landmarks, enable our system to operate robustly in the presence of external and body-worn magnetic disturbances. Current field testing activities across a variety of mountainous environments indicate that we can achieve high icon geo-registration accuracy (<10 mrad) using these vision-based methods.
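    The fusion of drifting inertial measurements with absolute, vision-based orientation fixes described above can be illustrated with a generic one-dimensional complementary filter. This is a common fusion pattern presented as a sketch only, not the ULTRA-Vis implementation, and the rates, correction gain, and sample values are made up for the example.

```python
# Minimal 1-D complementary-filter sketch: fuse a fast, drifting gyro heading
# with occasional absolute, vision-based heading fixes. Generic fusion pattern
# for illustration only; not the system described in the abstract above.
def fuse_heading(gyro_rates, vision_headings, dt=0.01, alpha=0.95):
    """gyro_rates: angular-rate samples (rad/s); vision_headings: absolute
    heading per sample (rad), or None when no vision fix is available."""
    heading = vision_headings[0] if vision_headings[0] is not None else 0.0
    fused = []
    for rate, vis in zip(gyro_rates, vision_headings):
        heading += rate * dt                      # integrate gyro (drifts)
        if vis is not None:                       # correct with absolute fix
            heading = alpha * heading + (1 - alpha) * vis
        fused.append(heading)
    return fused

# Example: a constant 1 mrad/s gyro bias corrected by periodic absolute fixes.
rates = [0.001] * 1000
vision = [0.0 if i % 25 == 0 else None for i in range(1000)]
print(f"final heading error: {fuse_heading(rates, vision)[-1] * 1000:.2f} mrad")
```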

  15. Analysis of Mental Workload in Online Shopping: Are Augmented and Virtual Reality Consistent?

    PubMed Central

    Zhao, Xiaojun; Shi, Changxiu; You, Xuqun; Zong, Chenming

    2017-01-01

    A market research company (Nielsen) reported that consumers in the Asia-Pacific region have become the most active group in online shopping. Focusing on augmented reality (AR), one of the three major techniques expected to change the way of shopping in the future, this study used a mixed design to examine the influences of the method of online shopping, user gender, cognitive style, product value, and sensory channel on mental workload in virtual reality (VR) and AR situations. The results showed that males’ mental workloads were significantly higher than females’. For males, high-value products’ mental workload was significantly higher than that of low-value products. In the VR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference was reduced under audio–visual conditions. In the AR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference increased under audio–visual conditions. This study provides a psychological investigation of online shopping with AR and VR technology, with applications in the future. Based on the perspective of embodied cognition, AR online shopping may be a potential focus of research and market application. For the future design of online shopping platforms and the updating of user experience, this study provides a reference. PMID:28184207

  16. Analysis of Mental Workload in Online Shopping: Are Augmented and Virtual Reality Consistent?

    PubMed

    Zhao, Xiaojun; Shi, Changxiu; You, Xuqun; Zong, Chenming

    2017-01-01

    A market research company (Nielsen) reported that consumers in the Asia-Pacific region have become the most active group in online shopping. Focusing on augmented reality (AR), one of the three major techniques expected to change the way of shopping in the future, this study used a mixed design to examine the influences of the method of online shopping, user gender, cognitive style, product value, and sensory channel on mental workload in virtual reality (VR) and AR situations. The results showed that males' mental workloads were significantly higher than females'. For males, high-value products' mental workload was significantly higher than that of low-value products. In the VR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference was reduced under audio-visual conditions. In the AR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference increased under audio-visual conditions. This study provides a psychological investigation of online shopping with AR and VR technology, with applications in the future. Based on the perspective of embodied cognition, AR online shopping may be a potential focus of research and market application. For the future design of online shopping platforms and the updating of user experience, this study provides a reference.

  17. Engineering and Language Discourse Collaboration: Practice Realities

    ERIC Educational Resources Information Center

    Harran, Marcelle

    2011-01-01

    This article describes a situated engineering project at a South African HE institution which is underpinned by collaboration between Applied Language Studies (DALS) and Mechanical Engineering. The collaboration requires language practitioners and engineering experts to negotiate and collaborate on academic literacies practices, discourse…

  18. Way-Finding Assistance System for Underground Facilities Using Augmented Reality

    NASA Astrophysics Data System (ADS)

    Yokoi, K.; Yabuki, N.; Fukuda, T.; Michikawa, T.; Motamedi, A.

    2015-05-01

    Way-finding is one of the main challenges for pedestrians in large subterranean spaces with complex networks of connected labyrinths. The problem is caused by the loss of the sense of direction and orientation due to the lack of landmarks, which are occluded by ceilings, walls, and skyscrapers. This paper introduces an assistance system for the way-finding problem in large subterranean spaces using Augmented Reality (AR). It suggests displaying known landmarks that are invisible in indoor environments on tablet/handheld devices to assist users with relative positioning and indoor way-finding. The location and orientation of the users can be estimated by the indoor positioning systems and sensors available in common tablet or smartphone devices. The constructed 3D model of a chosen landmark that is in the field of view of the handheld's camera is augmented on the camera's video feed. A prototype system has been implemented to demonstrate the efficiency of the proposed system for way-finding.
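    The core geometric step implied above, deciding whether a known landmark falls within the device camera's view given the user's estimated position and heading, can be sketched in simplified 2-D form. The coordinates, field of view, and function name below are illustrative assumptions, not the authors' implementation.

```python
# Simplified 2-D sketch: given the user's estimated map position and heading
# and a landmark's known map coordinates, compute the landmark's bearing
# relative to the camera axis and whether it lies inside the horizontal field
# of view. All values are illustrative assumptions, not from the paper.
import math

def landmark_in_view(user_xy, user_heading_deg, landmark_xy, h_fov_deg=60.0):
    dx = landmark_xy[0] - user_xy[0]
    dy = landmark_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))             # bearing from north
    relative = (bearing - user_heading_deg + 180) % 360 - 180
    return relative, abs(relative) <= h_fov_deg / 2

rel, visible = landmark_in_view((0.0, 0.0), 30.0, (250.0, 400.0))
print(f"landmark at {rel:+.1f} deg from camera axis, visible: {visible}")
```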

  19. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    PubMed

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate for the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher-fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.
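    One basic ingredient of any such co-location procedure is a rigid alignment between the haptic device's coordinate frame and the tracker or graphics frame. The sketch below shows a standard least-squares (Kabsch/Horn-style) alignment from point correspondences on synthetic data; it is a generic illustration of that single step, not the paper's full gimbal-error compensation and temporal-alignment pipeline.

```python
# Minimal rigid-registration sketch (Kabsch/Horn-style least squares): align
# stylus-tip positions reported by the haptic device with the same points
# measured by an external tracker. Generic alignment step only; not the
# paper's full calibration procedure.
import numpy as np

def rigid_align(haptic_pts, tracker_pts):
    """Return rotation R and translation t mapping haptic_pts onto tracker_pts."""
    ch, ct = haptic_pts.mean(axis=0), tracker_pts.mean(axis=0)
    H = (haptic_pts - ch).T @ (tracker_pts - ct)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, ct - R @ ch

# Synthetic check: recover a known transform from noisy correspondences.
rng = np.random.default_rng(1)
pts = rng.uniform(-0.1, 0.1, size=(20, 3))                  # meters, workspace-sized
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
t_true = np.array([0.05, 0.02, -0.01])
measured = pts @ R_true.T + t_true + rng.normal(0, 1e-4, size=pts.shape)
R, t = rigid_align(pts, measured)
print("residual RMS (mm):", 1e3 * np.sqrt(((pts @ R.T + t - measured) ** 2).mean()))
```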

  20. Augmented reality with image registration, vision correction and sunlight readability via liquid crystal devices.

    PubMed

    Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin

    2017-03-27

    Augmented reality (AR), which uses computer-aided projected information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges regarding the optical system in an AR system: registration, vision correction, and readability under strong ambient light. Here, we solve the three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated in an optical see-through AR system. One of the LC lenses is used to electrically adjust the position of the projected virtual image, the so-called registration. The other LC lens, with a larger aperture and polarization-independent characteristics, is in charge of vision correction, such as myopia and presbyopia. The linearity of the lens powers of the two LC lenses is also discussed. The readability of virtual images under strong ambient light is addressed by the electrically switchable transmittance of the LC attenuator, which originates from light scattering and light absorption. The concept demonstrated in this paper could be further extended to other electro-optical devices as long as the devices exhibit the capability of phase modulation and amplitude modulation.
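    The division of labour between the two lenses can be illustrated with simple thin-lens arithmetic: a small negative power shifts a collimated virtual image to a finite registration distance, and the viewer's spherical prescription is added on top. The registration distance and prescription below are illustrative assumptions, not the paper's lens parameters.

```python
# Thin-lens sketch (illustrative numbers, not the paper's lens parameters):
# the registration lens shifts a collimated virtual image to a finite distance,
# and the vision-correction lens adds the viewer's spherical prescription.
def lens_power_for_image_distance(d_meters):
    """Diopters needed to place the virtual image of a collimated beam at d."""
    return -1.0 / d_meters            # a diverging lens forms a virtual image at d

registration_power = lens_power_for_image_distance(2.0)    # image at 2 m: -0.5 D
myopia_correction = -2.0                                    # e.g., a -2 D viewer
print(f"registration lens: {registration_power:+.2f} D, "
      f"total with correction: {registration_power + myopia_correction:+.2f} D")
```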

  1. There's an app for that shirt! Evaluation of augmented reality tracking methods on deformable surfaces for fashion design

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Chang, Ben; Behar, Katherine

    2013-03-01

    In this paper we present appARel, a creative research project at the intersection of augmented reality, fashion, and performance art. appARel is a mobile augmented reality application that transforms otherwise ordinary garments with 3D animations and modifications. With appARel, entire fashion collections can be uploaded in a smartphone application, and "new looks" can be downloaded in a software update. The project will culminate in a performance art fashion show, scheduled for March 2013. appARel includes textile designs incorporating fiducial markers, garment designs that incorporate multiple markers with the human body, and iOS and Android apps that apply different augments, or "looks", to a garment. We discuss our philosophy for combining computer-generated and physical objects, and share the challenges we encountered in applying fiducial markers to the 3D curvatures of the human body.

  2. Magnetohydrodynamic Augmentation of Pulse Detonation Rocket Engines (Preprint)

    DTIC Science & Technology

    2010-09-28

    Pulse detonation engines (PDEs) are of interest due to their potentially superior performance over constant pressure cycle engines. Yet, due to its unsteady chamber pressure, the PDE system will either be over- or under-expanded for the majority of the cycle, with energy being used without maximum gain. Magnetohydrodynamic (MHD) augmentation offers…

  3. Systems Engineering: From Dream to Reality

    DTIC Science & Technology

    2011-04-01

    Systems Engineering (2) • A project is a veritable "Tower of Babel" • Potentially dozens of engineering specialists – SE provides the linkage to enable them to…the system and find the "right approach" • The "right approach" usually comes after multiple "wrong approaches" • The "right approach" is…

  4. Two Innovative Steps for Training on Maintenance: 'VIRMAN' Spanish Project based on Virtual Reality 'STARMATE' European Project based on Augmented Reality

    SciTech Connect

    Gonzalez Anez, Francisco

    2002-07-01

    This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training in maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. Its objective is to create a computer tool for the elaboration of maintenance training courses and for training delivery, based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all the complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention while trying to follow the procedure. Users can be evaluated on the level of knowledge achieved, and instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Societies Technologies'. A consortium of five companies and one research institute share their expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area in Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene, generated by a computer, augmenting the reality with additional information. The user interface consists of see-through goggles, headphones, a microphone, and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers. The user has his hands free for performing the maintenance intervention and he can navigate in the virtual

  5. Development of CFD model for augmented core tripropellant rocket engine

    NASA Technical Reports Server (NTRS)

    Jones, Kenneth M.

    1994-01-01

    The Space Shuttle era has made major advances in technology and vehicle design to the point that the concept of a single-stage-to-orbit (SSTO) vehicle appears more feasible. NASA presently is conducting studies into the feasibility of certain advanced concept rocket engines that could be utilized in a SSTO vehicle. One such concept is a tripropellant system which burns kerosene and hydrogen initially and at altitude switches to hydrogen. This system will attain a larger mass fraction because LOX-kerosene engines have a greater average propellant density and greater thrust-to-weight ratio. This report describes the investigation to model the tripropellant augmented core engine. The physical aspects of the engine, the CFD code employed, and results of the numerical model for a single modular thruster are discussed.

  6. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    PubMed Central

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-01-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively. PMID:27271840
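    The axis-aligned bounding box (AABB) collision detection mentioned above reduces to a per-axis interval overlap test. The sketch below shows that generic test with made-up tool and workpiece boxes; it illustrates the technique named in the abstract, not the authors' code.

```python
# Generic axis-aligned bounding box (AABB) overlap test, the kind of check the
# CNC/virtual-machining module described above relies on for collision detection.
# Box format is an assumption for illustration: (min_corner, max_corner) in 3D.
def aabb_overlap(box_a, box_b):
    (a_min, a_max), (b_min, b_max) = box_a, box_b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

tool = ((0.0, 0.0, 0.0), (10.0, 10.0, 50.0))          # mm, illustrative values
workpiece = ((5.0, 5.0, 40.0), (100.0, 100.0, 60.0))
print("collision:", aabb_overlap(tool, workpiece))    # True: boxes overlap on all axes
```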

  7. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment.

    PubMed

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S; Phoon, Sin Ye

    2016-06-07

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  8. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-06-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  9. Augmentation of engineered cartilage to bone integration using hydroxyapatite.

    PubMed

    Dua, Rupak; Centeno, Jerry; Ramaswamy, Sharan

    2014-07-01

    Articular cartilage injuries occur frequently in the knee joint. Photopolymerizable cartilage tissue engineering approaches appear promising; however, fundamentally, forming a stable interface between the subchondral bone and tissue engineered cartilage components remains a major challenge. We investigated the utility of hydroxyapatite (HA) nanoparticles to promote controlled bone-growth across the bone-cartilage interface in an in vitro engineered tissue model system using bone marrow derived stem cells. Samples incorporated with HA demonstrated significantly higher interfacial shear strength (at the junction between engineered cartilage and engineered bone) compared with the constructs without HA (p < 0.05), after 28 days of culture. Interestingly, this increased interfacial shear strength due to the presence of HA was observed as early as 7 days and appeared to have sustained itself for an additional three weeks without interacting with strength increases attributable to subsequent secretion of engineered tissue matrix. Histological evidence showed that there was ∼7.5% bone in-growth into the cartilage region from the bone side. The mechanism of enhanced engineered cartilage to bone integration with HA incorporation appeared to be facilitated by the deposition of calcium phosphate in the transition zone. These findings indicate that controlled bone in-growth using HA incorporation permits more stable anchorage of the injectable hydrogel-based engineered cartilage construct via augmented integration between bone and cartilage.

  10. Design of Mobile Augmented Reality in Health Care Education: A Theory-Driven Framework

    PubMed Central

    Lilienthal, Anneliese; Shluzas, Lauren Aquino; Masiello, Italo; Zary, Nabil

    2015-01-01

    Background Augmented reality (AR) is increasingly used across a range of subject areas in health care education as health care settings partner to bridge the gap between knowledge and practice. As the first contact with patients, general practitioners (GPs) are important in the battle against a global health threat, the spread of antibiotic resistance. AR has potential as a practical tool for GPs to combine learning and practice in the rational use of antibiotics. Objective This paper was driven by learning theory to develop a mobile augmented reality education (MARE) design framework. The primary goal of the framework is to guide the development of AR educational apps. This study focuses on (1) identifying suitable learning theories for guiding the design of AR education apps, (2) integrating learning outcomes and learning theories to support health care education through AR, and (3) applying the design framework in the context of improving GPs’ rational use of antibiotics. Methods The design framework was first constructed with the conceptual framework analysis method. Data were collected from multidisciplinary publications and reference materials and were analyzed with directed content analysis to identify key concepts and their relationships. Then the design framework was applied to a health care educational challenge. Results The proposed MARE framework consists of three hierarchical layers: the foundation, function, and outcome layers. Three learning theories—situated, experiential, and transformative learning—provide foundational support based on differing views of the relationships among learning, practice, and the environment. The function layer depends upon the learners’ personal paradigms and indicates how health care learning could be achieved with MARE. The outcome layer analyzes different learning abilities, from knowledge to the practice level, to clarify learning objectives and expectations and to avoid teaching pitched at the wrong level

  11. A virtual reality endoscopic simulator augments general surgery resident cancer education as measured by performance improvement.

    PubMed

    White, Ian; Buchberg, Brian; Tsikitis, V Liana; Herzig, Daniel O; Vetto, John T; Lu, Kim C

    2014-06-01

    Colorectal cancer is the second most common cause of cancer-related death in the USA. The need for screening colonoscopies, and thus adequately trained endoscopists, particularly in rural areas, is on the rise. Recent increases in the endoscopic cases required for surgical resident graduation by the Surgery Residency Review Committee (RRC) further emphasize the need for more effective endoscopic training during residency. The aim of this study was to determine whether a virtual reality colonoscopy simulator enhances surgical resident endoscopic education by detecting improvement in colonoscopy skills before and after 6 weeks of formal clinical endoscopic training. We conducted a retrospective review of prospectively collected surgery resident data on an endoscopy simulator. Residents performed four different clinical scenarios on the endoscopic simulator before and after a 6-week endoscopic training course. Data were collected over a 5-year period from 94 different residents performing a total of 795 colonoscopic simulation scenarios. Main outcome measures included time to cecal intubation, "red out" time, and severity of simulated patient discomfort (mild, moderate, severe, extreme) during colonoscopy scenarios. Average time to intubation of the cecum was 6.8 min for residents who had not undergone endoscopic training versus 4.4 min for those who had (p < 0.001). For residents who could be compared against themselves (pre- vs. post-training), cecal intubation times decreased from 7.1 to 4.3 min (p < 0.001). Post-endoscopy rotation residents caused less severe discomfort during simulated colonoscopy than pre-endoscopy rotation residents (4 vs. 10%; p = 0.004). Virtual reality endoscopic simulation is an effective tool both for augmenting surgical resident endoscopy cancer education and for measuring improvement in resident performance after formal clinical endoscopic training.

  12. Shaping Watersheds Exhibit: An Interactive, Augmented Reality Sandbox for Advancing Earth Science Education

    NASA Astrophysics Data System (ADS)

    Reed, S. E.; Kreylos, O.; Hsi, S.; Kellogg, L. H.; Schladow, G.; Yikilmaz, M. B.; Segale, H.; Silverman, J.; Yalowitz, S.; Sato, E.

    2014-12-01

    One of the challenges involved in learning earth science is the visualization of processes which occur over large spatial and temporal scales. Shaping Watersheds is an interactive 3D exhibit developed with support from the National Science Foundation by a team of scientists, science educators, exhibit designers, and evaluation professionals, in an effort to improve public understanding and stewardship of freshwater ecosystems. The hands-on augmented reality sandbox allows users to create topographic models by shaping real "kinetic" sand. The exhibit is augmented in real time by the projection of a color elevation map and contour lines which exactly match the sand topography, using a closed loop of a Microsoft Kinect 3D camera, simulation and visualization software, and a data projector. When an object (such as a hand) is sensed at a particular height above the sand surface, virtual rain appears as a blue visualization on the surface and a flow simulation (based on a depth-integrated version of the Navier-Stokes equations) moves the water across the landscape. The blueprints and software to build the sandbox are freely available online (http://3dh2o.org/71/) under the GNU General Public License, together with a facilitator's guide and a public forum (with how-to documents and FAQs). Using these resources, many institutions (20 and counting) have built their own exhibits to teach a wide variety of topics (ranging from watershed stewardship, hydrology, geology, topographic map reading, and planetary science) in a variety of venues (such as traveling science exhibits, K-12 schools, university earth science departments, and museums). Additional exhibit extensions and learning modules are planned such as tsunami modeling and prediction. Moreover, a study is underway at the Lawrence Hall of Science to assess how various aspects of the sandbox (such as visualization color scheme and level of interactivity) affect understanding of earth science concepts.
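    The closed loop described above, from a depth image to a projected elevation color map with contour lines, can be sketched with a synthetic surface standing in for the Kinect frame. The array sizes, colormap, and use of matplotlib are illustrative choices, and the shallow-water flow simulation layer of the real exhibit is omitted.

```python
# Sketch of the sandbox's core visualization step: turn a depth image into an
# elevation map with hypsometric colors and contour lines. A synthetic surface
# stands in for the Kinect depth frame; the real exhibit adds projector
# alignment and a water-flow simulation on top of this.
import numpy as np
import matplotlib.pyplot as plt

y, x = np.mgrid[0:480, 0:640]
depth_mm = 1000 - 80 * np.exp(-((x - 320) ** 2 + (y - 240) ** 2) / 20000.0)
elevation = depth_mm.max() - depth_mm              # closer to camera = higher sand

fig, ax = plt.subplots()
ax.imshow(elevation, cmap="terrain")               # color-by-elevation rendering
ax.contour(elevation, levels=10, colors="k", linewidths=0.5)   # contour lines
ax.set_title("Synthetic sandbox elevation with contours")
plt.savefig("sandbox_demo.png")
```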

  13. Scientist and Engineer Shortage: Myth or Reality?

    ERIC Educational Resources Information Center

    Post, Jan F.

    2006-01-01

    With clockwork regularity, the real or perceived shortage of scientists and engineers in the US pops up as a topic of debate in academic and industry circles. Discussions of an imminent shortage have deep impact for education, career prospects, immigration, and "The American Dream." The purpose of this article is twofold. First, it poses a…

  14. An augmented reality home-training system based on the mirror training and imagery approach.

    PubMed

    Trojan, Jörg; Diers, Martin; Fuchs, Xaver; Bach, Felix; Bekrater-Bodmann, Robin; Foell, Jens; Kamping, Sandra; Rance, Mariela; Maaß, Heiko; Flor, Herta

    2014-09-01

    Mirror training and movement imagery have been demonstrated to be effective in treating several clinical conditions, such as phantom limb pain, stroke-induced hemiparesis, and complex regional pain syndrome. This article presents an augmented reality home-training system based on the mirror and imagery treatment approaches for hand training. A head-mounted display equipped with cameras captures one hand held in front of the body, mirrors this hand, and displays it in real time in a set of four different training tasks: (1) flexing fingers in a predefined sequence, (2) moving the hand into a posture fitting into a silhouette template, (3) driving a "Snake" video game with the index finger, and (4) grasping and moving a virtual ball. The system records task performance and transfers these data to a central server via the Internet, allowing monitoring of training progress. We evaluated the system by having 7 healthy participants train with it over the course of ten sessions of 15-min duration. No technical problems emerged during this time. Performance indicators showed that the system achieves a good balance between relatively easy and more challenging tasks and that participants improved significantly over the training sessions. This suggests that the system is well suited to maintain motivation in patients, especially when it is used for a prolonged period of time.
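    The per-frame mirroring step at the heart of the system described above amounts to capturing a camera image and flipping it horizontally before display. The sketch below uses a default webcam and OpenCV as stand-ins for the head-mounted display's cameras; it shows only that transform, not the four training tasks or the data upload.

```python
# Minimal stand-in for the mirroring step of the home-training system: grab a
# camera frame and flip it horizontally so the captured hand appears as its
# mirror image. A default webcam replaces the head-mounted display's cameras.
import cv2

cap = cv2.VideoCapture(0)             # default webcam instead of HMD cameras
ok, frame = cap.read()
if ok:
    mirrored = cv2.flip(frame, 1)     # flipCode=1: flip around the vertical axis
    cv2.imwrite("mirrored_hand.png", mirrored)
cap.release()
```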

  15. Stereoscopic augmented reality using ultrasound volume rendering for laparoscopic surgery in children

    NASA Astrophysics Data System (ADS)

    Oh, Jihun; Kang, Xin; Wilson, Emmanuel; Peters, Craig A.; Kane, Timothy D.; Shekhar, Raj

    2014-03-01

    In laparoscopic surgery, live video provides visualization of the exposed organ surfaces in the surgical field, but is unable to show internal structures beneath those surfaces. The laparoscopic ultrasound is often used to visualize the internal structures, but its use is limited to intermittent confirmation because of the need for an extra hand to maneuver the ultrasound probe. Other limitations of using ultrasound are the difficulty of interpretation and the need for an extra port. The size of the ultrasound transducer may also be too large for its usage in small children. In this paper, we report on an augmented reality (AR) visualization system that features continuous hands-free volumetric ultrasound scanning of the surgical anatomy and video imaging from a stereoscopic laparoscope. The acquisition of volumetric ultrasound image is realized by precisely controlling a back-and-forth movement of an ultrasound transducer mounted on a linear slider. Furthermore, the ultrasound volume is refreshed several times per minute. This scanner will sit outside of the body in the envisioned use scenario and could be even integrated into the operating table. An overlay of the maximum intensity projection (MIP) of ultrasound volume on the laparoscopic stereo video through geometric transformations features an AR visualization system particularly suitable for children, because ultrasound is radiation-free and provides higher-quality images in small patients. The proposed AR representation promises to be better than the AR representation using ultrasound slice data.
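    The overlay described above rests on a maximum intensity projection (MIP) of the ultrasound volume blended onto the video frame. The sketch below computes a MIP of a synthetic volume and alpha-blends it onto a placeholder grayscale frame; the geometric registration between the ultrasound and laparoscope frames, which the system handles explicitly, is omitted here.

```python
# Sketch of the overlay described above: a maximum intensity projection (MIP)
# of an ultrasound volume along the viewing axis, alpha-blended onto a video
# frame. Synthetic arrays stand in for the real volume and laparoscopic image.
import numpy as np

volume = np.random.default_rng(2).random((64, 256, 256))   # depth, rows, cols
mip = volume.max(axis=0)                                    # project along depth

video_gray = np.full((256, 256), 0.3)                       # placeholder frame
alpha = 0.4
overlay = (1 - alpha) * video_gray + alpha * mip
print("overlay shape:", overlay.shape, "value range:",
      round(overlay.min(), 3), "-", round(overlay.max(), 3))
```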

  16. Robust and efficient fiducial tracking for augmented reality in HD-laparoscopic video streams

    NASA Astrophysics Data System (ADS)

    Mueller, M.; Groch, A.; Baumhauer, M.; Maier-Hein, L.; Teber, D.; Rassweiler, J.; Meinzer, H.-P.; Wegner, In.

    2012-02-01

    Augmented Reality (AR) is a convenient way of porting information from medical images into the surgical field of view and can deliver valuable assistance to the surgeon, especially in laparoscopic procedures. In addition, high definition (HD) laparoscopic video devices are a great improvement over the previously used low resolution equipment. However, in AR applications that rely on real-time detection of fiducials from video streams, the demand for efficient image processing has increased due to the introduction of HD devices. We present an algorithm based on the well-known Conditional Density Propagation (CONDENSATION) algorithm which can satisfy these new demands. By incorporating a prediction around an already existing and robust segmentation algorithm, we can speed up the whole procedure while leaving the robustness of the fiducial segmentation untouched. For evaluation purposes we tested the algorithm on recordings from real interventions, allowing for a meaningful interpretation of the results. Our results show that we can accelerate the segmentation by a factor of 3.5 on average. Moreover, the prediction information can be used to compensate for fiducials that are temporarily occluded or out of scope, providing greater stability.
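    The speed-up idea summarized above, using a motion prediction to confine the expensive fiducial segmentation to a small region of interest, can be sketched with a simple constant-velocity predictor (simpler than the CONDENSATION density propagation the paper builds on); the function names and ROI size are illustrative assumptions.

```python
# Sketch of the acceleration idea described above: predict each fiducial's next
# position (constant-velocity assumption, simpler than full CONDENSATION) and
# run the expensive segmentation only inside a small region of interest,
# falling back to a full-frame search when the prediction fails.
def track_fiducial(frames, segment, roi_half=40):
    """segment(image) -> (x, y) in image coordinates, or None; frames: 2D images."""
    pos, vel = None, (0.0, 0.0)
    for frame in frames:
        if pos is None:
            found = segment(frame)                        # full-frame search
            offset = (0, 0)
        else:
            px, py = pos[0] + vel[0], pos[1] + vel[1]     # predicted position
            x0, y0 = int(px - roi_half), int(py - roi_half)
            roi = frame[max(y0, 0):y0 + 2 * roi_half, max(x0, 0):x0 + 2 * roi_half]
            found = segment(roi)                          # cheap ROI search
            offset = (max(x0, 0), max(y0, 0))
        if found is not None:
            new = (found[0] + offset[0], found[1] + offset[1])
            vel = (new[0] - pos[0], new[1] - pos[1]) if pos is not None else (0.0, 0.0)
            pos = new
        else:
            pos = None                                    # lost: full search next frame
        yield pos
```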

  17. Application of the Augmented Reality in prototyping the educational simulator in sport - the example of judo

    NASA Astrophysics Data System (ADS)

    Cieśliński, Wojciech B.; Sobecki, Janusz; Piepiora, Paweł A.; Piepiora, Zbigniew N.; Witkowski, Kazimierz

    2016-04-01

    Mental training (Galloway, 2011) is one of the measures of psychological preparation in sport. A discipline such as judo particularly requires mental training, because judo is a combat sport based on the direct, physical confrontation of two opponents; mental preparation should therefore be an essential element of preparing for the fight. The article describes the basics of AR systems and presents selected elements of them: Vuzix glasses, the Kinect sensor, and the Multitap interactive floor. Next, scenarios are proposed for using AR in mental training, based on using both the Vuzix head-mounted glasses and the Multitap interactive floor. All variants, except for the last, provide for using the Kinect sensor. In addition, the variants differ as to the primary user of the system: it can be the competitor, the coach, or the competitor and the coach at the same time. At the end of the article, methods are presented for examining the effectiveness, usefulness, and/or User Experience of the proposed prototypes. Three educational training simulator prototype models in sport (judo) are presented, and their functionality is described on the basis of the theory of sports training (the cyclical nature of sports training) and the theory of subtle interactions, enabling an explanation of the effects of sports training using augmented reality technology.

  18. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    NASA Astrophysics Data System (ADS)

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-02-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative simulation, but has different strengths and limitations than MUVEs. Within a design-based research project, the researchers conducted multiple qualitative case studies across two middle schools (6th and 7th grade) and one high school (10th grade) in the northeastern United States to document the affordances and limitations of AR simulations from the student and teacher perspective. The researchers collected data through formal and informal interviews, direct observations, web site posts, and site documents. Teachers and students reported that the technology-mediated narrative and the interactive, situated, collaborative problem solving affordances of the AR simulation were highly engaging, especially among students who had previously presented behavioral and academic challenges for the teachers. However, while the AR simulation provided potentially transformative added value, it simultaneously presented unique technological, managerial, and cognitive challenges to teaching and learning.

  19. Hand gesture guided robot-assisted surgery based on a direct augmented reality interface.

    PubMed

    Wen, Rong; Tay, Wei-Liang; Nguyen, Binh P; Chng, Chin-Boon; Chui, Chee-Kong

    2014-09-01

    Radiofrequency (RF) ablation is a good alternative to hepatic resection for treatment of liver tumors. However, accurate needle insertion requires precise hand-eye coordination and is also affected by the difficulty of RF needle navigation. This paper proposes a cooperative surgical robot system, guided by hand gestures and supported by an augmented reality (AR)-based surgical field, for robot-assisted percutaneous treatment. It establishes a robot-assisted natural AR guidance mechanism that incorporates the advantages of the following three aspects: AR visual guidance information, surgeon's experiences and accuracy of robotic surgery. A projector-based AR environment is directly overlaid on a patient to display preoperative and intraoperative information, while a mobile surgical robot system implements specified RF needle insertion plans. Natural hand gestures are used as an intuitive and robust method to interact with both the AR system and surgical robot. The proposed system was evaluated on a mannequin model. Experimental results demonstrated that hand gesture guidance was able to effectively guide the surgical robot, and the robot-assisted implementation was found to improve the accuracy of needle insertion. This human-robot cooperative mechanism is a promising approach for precise transcutaneous ablation therapy.

  20. Composite lay-up process with application of elements of augmented reality

    NASA Astrophysics Data System (ADS)

    Novák-Marcinčin, Jozef; Barna, Jozef; Janák, Miroslav; Fečová, Veronika; Nováková-Marcinčinová, L'udmila

    2012-03-01

    This article deals with virtual tools, based on the principles of the open-source philosophy, for the implementation of composite lay-up technology. It describes the virtual software and hardware elements that are necessary for work in an augmented reality environment. It first focuses on general problems of applying virtual components in the composite lay-up process. It clarifies the fundamental philosophy of the newly created application and the visual-scripting process used for program development. It then provides a closer view of the particular logical sections, where the necessary data are harvested and compared with values from virtual arrays. A new device is also described, with an adjustable operating desk that enables detailed control of any realized manufacturing process. This positioning table can determine and set the position of the working plane using commands in the computer interface or manual changes. Information about the exact position of individual layers is obtained in real time thanks to built-in sensors. One of them tracks changes in the desk position (X, Y, Z); the other checks the rotation around the main axis situated in the center of the table. The new software consists of four main logical areas, with data packets coming from internal computer components as well as from external devices. Finally, the display section is able to realize the movement of a virtual item (composite layer) along its trajectory. The article presents a new approach to the realization of the composite lay-up process and concludes with possible future improvements and other application possibilities.

  1. Matching and reaching depth judgments with real and augmented reality targets.

    PubMed

    Swan, J Edward; Singh, Gurjot; Ellis, Stephen R

    2015-11-01

    Many compelling augmented reality (AR) applications require users to correctly perceive the location of virtual objects, some with accuracies as tight as 1 mm. However, measuring the perceived depth of AR objects at these accuracies has not yet been demonstrated. In this paper, we address this challenge by employing two different depth judgment methods, perceptual matching and blind reaching, in a series of three experiments, where observers judged the depth of real and AR target objects presented at reaching distances. Our experiments found that observers can accurately match the distance of a real target, but when viewing an AR target through collimating optics, their matches systematically overestimate the distance by 0.5 to 4.0 cm. However, these results can be explained by a model where the collimation causes the eyes' vergence angle to rotate outward by a constant angular amount. These findings give error bounds for using collimating AR displays at reaching distances, and suggest that for these applications, AR displays need to provide an adjustable focus. Our experiments further found that observers initially reach ∼4 cm too short, but reaching accuracy improves with both consistent proprioception and corrective visual feedback, and eventually becomes nearly as accurate as matching.
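
    The reported overestimation is consistent with simple vergence geometry. As an illustration (the symbols below are ours, not the paper's), a target at distance d viewed with interpupillary distance I subtends a vergence angle θ; if collimated optics rotate the eyes outward by a constant Δθ, the distance recovered from the reduced vergence angle exceeds the true distance:

```latex
% Illustrative vergence-depth relation; I, d, \theta, \Delta\theta are our notation.
\theta = 2\arctan\!\left(\frac{I}{2d}\right), \qquad
d' = \frac{I}{2\tan\!\left(\frac{\theta - \Delta\theta}{2}\right)} > d \quad \text{for } \Delta\theta > 0 .
```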

  2. An Augmented Reality-Based Approach for Surgical Telementoring in Austere Environments.

    PubMed

    Andersen, Dan; Popescu, Voicu; Cabrera, Maria Eugenia; Shanghavi, Aditya; Mullis, Brian; Marley, Sherri; Gomez, Gerardo; Wachs, Juan P

    2017-03-01

    Telementoring can improve treatment of combat trauma injuries by connecting remote experienced surgeons with local less-experienced surgeons in an austere environment. Current surgical telementoring systems force the local surgeon to regularly shift focus away from the operating field to receive expert guidance, which can lead to surgery delays or even errors. The System for Telementoring with Augmented Reality (STAR) integrates expert-created annotations directly into the local surgeon's field of view. The local surgeon views the operating field by looking at a tablet display suspended between the patient and the surgeon that captures video of the surgical field. The remote surgeon remotely adds graphical annotations to the video. The annotations are sent back and displayed to the local surgeon while being automatically anchored to the operating field elements they describe. A technical evaluation demonstrates that STAR robustly anchors annotations despite tablet repositioning and occlusions. In a user study, participants used either STAR or a conventional telementoring system to precisely mark locations on a surgical simulator under a remote surgeon's guidance. Participants who used STAR completed the task with fewer focus shifts and with greater accuracy. The STAR reduces the local surgeon's need to shift attention during surgery, allowing him or her to continuously work while looking "through" the tablet screen.

  3. Holographic display for see-through augmented reality using mirror-lens holographic optical element.

    PubMed

    Li, Gang; Lee, Dukho; Jeong, Youngmo; Cho, Jaebum; Lee, Byoungho

    2016-06-01

    A holographic display system for realizing a three-dimensional optical see-through augmented reality (AR) is proposed. A multi-functional holographic optical element (HOE), which simultaneously performs the optical functions of a mirror and a lens, is adopted in the system. In the proposed method, a mirror that is used to guide the light source into a reflection-type spatial light modulator (SLM) and a lens that functions as Fourier transforming optics are recorded on a single holographic recording material by utilizing an angular multiplexing technique for volume holograms. The HOE is transparent and performs the optical functions only under the Bragg-matched condition. Therefore, the real-world scenes that are usually distorted by a Fourier lens or an SLM in the conventional holographic display can be observed without visual disturbance by using the proposed mirror-lens HOE (MLHOE). Furthermore, to achieve an optimized optical recording condition of the MLHOE, the optical characteristics of the holographic material are measured. The proposed holographic AR display system is verified experimentally.

  4. Image guidance of breast cancer surgery using 3-D ultrasound images and augmented reality visualization.

    PubMed

    Sato, Y; Nakamoto, M; Tamaki, Y; Sasama, T; Sakita, I; Nakajima, Y; Monden, M; Tamura, S

    1998-10-01

    This paper describes augmented reality visualization for the guidance of breast-conservative cancer surgery using ultrasonic images acquired in the operating room just before surgical resection. By combining an optical three-dimensional (3-D) position sensor, the position and orientation of each ultrasonic cross section are precisely measured to reconstruct geometrically accurate 3-D tumor models from the acquired ultrasonic images. Similarly, the 3-D position and orientation of a video camera are obtained to integrate video and ultrasonic images in a geometrically accurate manner. Superimposing the 3-D tumor models onto live video images of the patient's breast enables the surgeon to perceive the exact 3-D position of the tumor, including irregular cancer invasions which cannot be perceived by touch, as if it were visible through the breast skin. Using the resultant visualization, the surgeon can determine the region for surgical resection in a more objective and accurate manner, thereby minimizing the risk of a relapse and maximizing breast conservation. The system was shown to be effective in experiments using phantom and clinical data.
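
    A minimal sketch of the geometric chain described above, assuming the probe-to-image calibration is already folded into the tracked probe pose; all function and variable names are hypothetical:

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def ultrasound_pixel_to_video(px, py, pixel_spacing,
                              T_probe_to_tracker, T_camera_to_tracker, K):
    """Map an ultrasound image pixel to video image coordinates (illustrative only).

    Pipeline: pixel -> probe frame (mm) -> optical tracker frame ->
    camera frame -> pinhole projection with intrinsics K.
    """
    # Ultrasound pixel expressed in the probe coordinate frame (image plane at z = 0).
    p_probe = np.array([px * pixel_spacing, py * pixel_spacing, 0.0, 1.0])
    # Probe frame -> tracker frame (measured by the optical position sensor).
    p_tracker = T_probe_to_tracker @ p_probe
    # Tracker frame -> camera frame (invert the tracked camera pose).
    p_camera = np.linalg.inv(T_camera_to_tracker) @ p_tracker
    # Pinhole projection of the 3-D point onto the video image.
    uvw = K @ p_camera[:3]
    return uvw[:2] / uvw[2]
```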

  5. Augmented reality technology for day/night situational awareness for the dismounted Soldier

    NASA Astrophysics Data System (ADS)

    Gans, Eric; Roberts, David; Bennett, Matthew; Towles, Herman; Menozzi, Alberico; Cook, James; Sherrill, Todd

    2015-05-01

    This paper describes Applied Research Associates' (ARA) recent advances in Soldier augmented reality (AR) technology. Our AR technology, called ARC4, delivers heads-up situational awareness to the dismounted warfighter, enabling non-line-of-sight team coordination in distributed operations. ARC4 combines compact head tracking sensors with advanced pose estimation algorithms, network management software, and an intuitive AR visualization interface to overlay tactical iconic information accurately on the user's real-world view. The technology supports heads-up navigation, blue-force tracking, target handoff, image sharing, and tagging of features in the environment. It integrates seamlessly with established network protocols (e.g., Cursor-on-Target) and Command and Control software tools (e.g., Nett Warrior, Android Tactical Assault Kit) and interfaces with a wide range of daytime see-through displays and night vision goggles to deliver real-time actionable intelligence, day or night. We describe our pose estimation framework, which fuses inertial data, magnetometer data, GPS, DTED, and digital imagery to provide measurements of the operator's precise orientation. These measurements leverage mountainous terrain horizon geometry, known landmarks, and sun position, enabling ARC4 to achieve significant improvements in accuracy compared to conventional INS/GPS solutions of similar size, weight, and power. We detail current research and development efforts toward helmet-based and handheld AR systems for operational use cases and describe extensions to immersive training applications.
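
    The kind of orientation fusion described above can be illustrated with a single complementary-filter step for heading, where gyro propagation is corrected by an absolute reference (magnetometer, surveyed landmark bearing, or a horizon/sun fix). This is a generic sketch, not ARA's algorithm, and all names are ours:

```python
import math

def fuse_heading(prev_heading_rad, gyro_yaw_rate, dt, absolute_heading_rad, gain=0.02):
    """One complementary-filter step for heading (illustrative only).

    prev_heading_rad:     previous fused heading estimate (rad).
    gyro_yaw_rate:        angular rate about the vertical axis (rad/s).
    absolute_heading_rad: heading from a drift-free reference, e.g. magnetometer,
                          a known landmark bearing, or sun/horizon geometry.
    gain:                 how strongly the reference corrects gyro drift per step.
    """
    # Propagate with the gyro (fast and smooth, but drifts over time).
    predicted = prev_heading_rad + gyro_yaw_rate * dt
    # Smallest signed angular difference to the absolute reference.
    error = math.atan2(math.sin(absolute_heading_rad - predicted),
                       math.cos(absolute_heading_rad - predicted))
    # Slowly pull the estimate toward the drift-free reference.
    return (predicted + gain * error) % (2 * math.pi)
```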

  6. Providing IoT Services in Smart Cities through Dynamic Augmented Reality Markers.

    PubMed

    Chaves-Diéguez, David; Pellitero-Rivero, Alexandre; García-Coego, Daniel; González-Castaño, Francisco Javier; Rodríguez-Hernández, Pedro Salvador; Piñeiro-Gómez, Óscar; Gil-Castiñeira, Felipe; Costa-Montenegro, Enrique

    2015-07-03

    Smart cities are expected to improve the quality of life of citizens by relying on new paradigms, such as the Internet of Things (IoT) and its capacity to manage and interconnect thousands of sensors and actuators scattered across the city. At the same time, mobile devices widely assist professional and personal everyday activities. A very good example of the potential of these devices for smart cities is their powerful support for intuitive service interfaces (such as those based on augmented reality (AR)) for non-expert users. In our work, we consider a scenario that combines IoT and AR within a smart city maintenance service to improve the accessibility of sensor and actuator devices in the field, where responsiveness is crucial. In it, depending on the location and needs of each service, data and commands will be transported by an urban communications network or consulted on the spot. Direct AR interaction with urban objects has already been described; it usually relies on 2D visual codes to deliver object identifiers (IDs) to the rendering device to identify object resources. These IDs allow information about the objects to be retrieved from a remote server. In this work, we present a novel solution that replaces static AR markers with dynamic markers based on LED communication, which can be decoded through cameras embedded in smartphones. These dynamic markers can directly deliver sensor information to the rendering device, on top of the object ID, without further network interaction.
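
    As a rough illustration of how such a dynamic marker might be read, the sketch below thresholds the per-frame brightness of the LED region into bits, finds a preamble, and unpacks an object ID plus one sensor byte. The frame layout, preamble, and function names are assumptions, not the authors' protocol:

```python
def decode_led_marker(brightness_samples, threshold=0.5, preamble=(1, 1, 1, 0)):
    """Decode a bit stream blinked by an LED marker from per-frame brightness values.

    brightness_samples: normalized brightness readings (one per video frame) for
    the image region containing the LED. Assumes one bit per frame and a fixed
    preamble marking the start of the payload; a real decoder must also handle
    clock recovery and camera/LED synchronization.
    """
    bits = [1 if b >= threshold else 0 for b in brightness_samples]
    n = len(preamble)
    for start in range(len(bits) - n):
        if tuple(bits[start:start + n]) == preamble:
            payload = bits[start + n:start + n + 16]
            if len(payload) < 16:
                break  # not enough bits left for a full payload
            # First byte: object ID; second byte: a sensor reading carried by the marker.
            object_id = int(''.join(map(str, payload[:8])), 2)
            sensor_value = int(''.join(map(str, payload[8:])), 2)
            return object_id, sensor_value
    return None
```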

  7. Learning Optimized Local Difference Binaries for Scalable Augmented Reality on Mobile Devices.

    PubMed

    Xin Yang; Kwang-Ting Cheng

    2014-06-01

    The efficiency, robustness and distinctiveness of a feature descriptor are critical to the user experience and scalability of a mobile augmented reality (AR) system. However, existing descriptors are either too computationally expensive to achieve real-time performance on a mobile device such as a smartphone or tablet, or not sufficiently robust and distinctive to identify correct matches from a large database. As a result, current mobile AR systems still only have limited capabilities, which greatly restrict their deployment in practice. In this paper, we propose a highly efficient, robust and distinctive binary descriptor, called Learning-based Local Difference Binary (LLDB). LLDB directly computes a binary string for an image patch using simple intensity and gradient difference tests on pairwise grid cells within the patch. To select an optimized set of grid cell pairs, we densely sample grid cells from an image patch and then leverage a modified AdaBoost algorithm to automatically extract a small set of critical ones with the goal of maximizing the Hamming distance between mismatches while minimizing it between matches. Experimental results demonstrate that LLDB is extremely fast to compute and to match against a large database due to its high robustness and distinctiveness. Compared to the state-of-the-art binary descriptors, primarily designed for speed, LLDB has similar efficiency for descriptor construction, while achieving a greater accuracy and faster matching speed when matching over a large database with 2.3M descriptors on mobile devices.
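
    A simplified sketch of the descriptor idea: grid-cell intensity and gradient difference tests packed into bits and matched by Hamming distance. The AdaBoost pair selection is replaced here by an externally supplied pair list, and the helper names are ours:

```python
import numpy as np

def cell_stats(patch, grid=4):
    """Average intensity and x/y gradient of each cell in a grid x grid partition."""
    h, w = patch.shape
    gy, gx = np.gradient(patch.astype(float))
    stats = []
    for i in range(grid):
        for j in range(grid):
            rows = slice(i * h // grid, (i + 1) * h // grid)
            cols = slice(j * w // grid, (j + 1) * w // grid)
            stats.append((patch[rows, cols].mean(),
                          gx[rows, cols].mean(),
                          gy[rows, cols].mean()))
    return stats

def binary_descriptor(patch, cell_pairs):
    """Binary string from intensity/gradient difference tests on selected cell pairs.

    cell_pairs would be chosen offline (the paper uses a modified AdaBoost);
    here it is simply an input list of (cell_a, cell_b) index pairs.
    """
    stats = cell_stats(patch)
    bits = []
    for a, b in cell_pairs:
        for channel in range(3):  # intensity, x-gradient, y-gradient
            bits.append(1 if stats[a][channel] > stats[b][channel] else 0)
    return np.array(bits, dtype=np.uint8)

def hamming_distance(d1, d2):
    """Number of differing bits between two binary descriptors."""
    return int(np.count_nonzero(d1 != d2))
```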

  8. An augmented reality platform for planning of minimally invasive cardiac surgeries

    NASA Astrophysics Data System (ADS)

    Chen, Elvis C. S.; Sarkar, Kripasindhu; Baxter, John S. H.; Moore, John; Wedlake, Chris; Peters, Terry M.

    2012-02-01

    One of the fundamental components in all Image Guided Surgery (IGS) applications is a method for presenting information to the surgeon in a simple, effective manner. This paper describes the first steps in our new Augmented Reality (AR) information delivery program. The system makes use of new "off the shelf" AR glasses that are both light-weight and unobtrusive, with adequate resolution for many IGS applications. Our first application is perioperative planning of minimally invasive robot-assisted cardiac surgery. In this procedure, a combination of tracking technologies and intraoperative ultrasound is used to map the migration of cardiac targets prior to selection of port locations for trocars that enter the chest. The AR glasses will then be used to present this heart migration data to the surgeon, overlaid onto the patient's chest. The current paper describes the calibration process for the AR glasses, their integration into our IGS framework for minimally invasive robotic cardiac surgery, and preliminary validation of the system. Validation results indicate a mean 3D triangulation error of 2.9 +/- 3.3 mm, 2D projection error of 2.1 +/- 2.1 pixels, and Normalized Stereo Calibration Error of 3.3.

  9. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    NASA Astrophysics Data System (ADS)

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-02-01

    Technologies and new tools for educational purposes continue to evolve. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on CT and MRI images, dissections and drawings. For ARBOOK evaluation, a specific three-block questionnaire was developed and validated according to the Delphi method. The questionnaire included motivation and attention tasks, autonomous work and three-dimensional interpretation tasks. A total of 211 students from 7 public and private Spanish universities were divided into two groups. The control group received standard teaching sessions supported by books and video. The ARBOOK group received the same standard sessions but additionally used the ARBOOK tool. At the end of the training, students completed a written test on lower limb anatomy. Statistically significantly better scores for the ARBOOK group were found on attention-motivation, autonomous work and three-dimensional comprehension tasks. The ARBOOK group also obtained significantly better scores in the written test. The results strongly suggest that the use of AR is suitable for anatomical purposes. Specifically, the results indicate that this technology helps with student motivation, autonomous work and spatial interpretation. This type of technology deserves all the more consideration now that new technologies are a natural part of everyday life.

  10. Augmented reality system for oral surgery using 3D auto stereoscopic visualization.

    PubMed

    Tran, Huy Hoang; Suenaga, Hideyuki; Kuwana, Kenta; Masamune, Ken; Dohi, Takeyoshi; Nakajima, Susumu; Liao, Hongen

    2011-01-01

    We present an augmented reality system for oral and maxillofacial surgery in this paper. Instead of being displayed on a separate screen, three-dimensional (3D) virtual presentations of osseous structures and soft tissues are projected onto the patient's body, providing surgeons with exact knowledge of depth information of high risk tissues inside the bone. In this study, we employ a 3D integral imaging technique that produces motion parallax in both the horizontal and vertical directions over a wide viewing area. In addition, surgeons are able to check the progress of the operation in real time through an intuitive, content-rich, hardware-accelerated 3D interface. These features prevent surgeons from penetrating into high risk areas and thus help improve the quality of the operation. Operational tasks such as hole drilling and screw fixation were performed using our system and showed an overall positional error of less than 1 mm. The feasibility of our system was also verified in an experiment with a human volunteer.

  11. Stereovision and augmented reality for closed-loop control of grasping in hand prostheses

    NASA Astrophysics Data System (ADS)

    Markovic, Marko; Dosen, Strahinja; Cipriani, Christian; Popovic, Dejan; Farina, Dario

    2014-08-01

    Objective. Technologically advanced assistive devices are nowadays available to restore grasping, but effective and effortless control integrating both feed-forward (commands) and feedback (sensory information) is still missing. The goal of this work was to develop a user friendly interface for the semi-automatic and closed-loop control of grasping and to test its feasibility. Approach. We developed a controller based on stereovision to automatically select grasp type and size and augmented reality (AR) to provide artificial proprioceptive feedback. The system was experimentally tested in healthy subjects using a dexterous hand prosthesis to grasp a set of daily objects. The subjects wore AR glasses with an integrated stereo-camera pair, and triggered the system via a simple myoelectric interface. Main results. The results demonstrated that the subjects got easily acquainted with the semi-autonomous control. The stereovision grasp decoder successfully estimated the grasp type and size in realistic, cluttered environments. When allowed (forced) to correct the automatic system decisions, the subjects successfully utilized the AR feedback and achieved close to ideal system performance. Significance. The new method implements a high level, low effort control of complex functions in addition to the low level closed-loop control. The latter is achieved by providing rich visual feedback, which is integrated into the real life environment. The proposed system is an effective interface applicable with small alterations for many advanced prosthetic and orthotic/therapeutic rehabilitation devices.

  12. A learning performance study between the conventional approach and augmented reality textbook among secondary school students

    NASA Astrophysics Data System (ADS)

    Gopalan, Valarmathie; Zulkifli, Abdul Nasir; Bakar, Juliana Aida Abu

    2016-08-01

    Malaysia is moving towards becoming a developed nation by 2020, and adequate human resources in science-related fields are one of the requirements for achieving that status. Unfortunately, there is a downward trend in the number of students pursuing the science stream at the secondary school level. This paper introduces an enhanced science textbook using Augmented Reality (eSTAR) that is intended to motivate students to be interested in science. The eSTAR was implemented as a supplement to the conventional science teaching and learning methods in secondary schools. A learning performance study with a control group was conducted to investigate the effectiveness of the eSTAR for science learning among a sample of 140 Form Two secondary school students. The results indicate that the learning performance of the students in both groups had a significant difference in mean scores between the pre-test and post-test. Students using the eSTAR scored better in the post-test, resulting in better learning performance than those exposed to conventional science learning. Overall, the results show that the students benefited from the use of both the conventional and eSTAR learning approaches.

  13. 3D augmented reality for improving social acceptance and public participation in wind farms planning

    NASA Astrophysics Data System (ADS)

    Grassi, S.; Klein, T. M.

    2016-09-01

    Wind energy is one of the most important sources of renewable energy; it has grown significantly over recent decades and makes an increasingly relevant contribution to the energy supply. One of the main disadvantages of a faster integration of wind energy into the energy mix is the visual impact of wind turbines on the landscape. In addition, the siting of new massive infrastructure has the potential to threaten a community's well-being if new projects are perceived as being unfair. The public perception of the impact of wind turbines on the landscape is also crucial for their acceptance. The implementation of wind energy projects is often hampered by a lack of planning and communication tools that enable more transparent and efficient interaction among all stakeholders involved (i.e., developers, local communities and administrations, NGOs, etc.). Concerning the visual assessment of wind farms, a critical gap lies in effective visualization tools to improve the public perception of alternative wind turbine layouts. In this paper, we describe the advantages of a dynamic, interactive 3D augmented reality visualization platform that supports wind energy planners in enhancing the social acceptance of new wind energy projects.

  14. Augmented Reality for Searching Potential Assets in Medan using GPS based Tracking

    NASA Astrophysics Data System (ADS)

    Muchtar, M. A.; Syahputra, M. F.; Syahputra, N.; Ashrafia, S.; Rahmat, R. F.

    2017-01-01

    Every city needs to present its potential assets so that people know how to utilize or develop their area. Potential assets include infrastructure, facilities, people, communities, organizations, and customs that shape the character and way of life in Medan. Due to a lack of publicity and information, most people in Medan know only a few of these assets. Many mobile apps now provide search and mapping functions that can be used to find the location of potential assets in a user's area. However, the available information, such as text and digital maps, does not always help the user in a clear and dynamic way. Therefore, this research implements Augmented Reality technology, which can display information over the real-world view, so that the information is more interactive and easily understood by the user. The technology is implemented in a mobile app using GPS-based tracking, with the coordinates of the user's smartphone serving as the marker, so that people can dynamically and easily find the location of nearby potential assets based on the direction of the camera view.
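
    The core of such GPS-based placement is computing the distance and bearing from the phone to each asset and testing that bearing against the camera's field of view; a minimal sketch with assumed function names follows:

```python
import math

EARTH_RADIUS_M = 6371000.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                      math.cos(p1) * math.sin(p2) -
                                      math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return distance, bearing % 360

def in_camera_view(device_heading_deg, asset_bearing_deg, fov_deg=60):
    """True if the asset's bearing falls inside the camera's horizontal field of view."""
    offset = (asset_bearing_deg - device_heading_deg + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2
```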

  15. Augmented Reality Tool for the Situational Awareness Improvement of UAV Operators.

    PubMed

    Ruano, Susana; Cuevas, Carlos; Gallego, Guillermo; García, Narciso

    2017-02-06

    Unmanned Aerial Vehicles (UAVs) are being extensively used nowadays. Therefore, pilots of traditional aerial platforms should adapt their skills to operate them from a Ground Control Station (GCS). Common GCSs provide information in separate screens: one presents the video stream while the other displays information about the mission plan and information coming from other sensors. To avoid the burden of fusing information displayed in the two screens, an Augmented Reality (AR) tool is proposed in this paper. The AR system has two functionalities for Medium-Altitude Long-Endurance (MALE) UAVs: route orientation and target identification. Route orientation allows the operator to identify the upcoming waypoints and the path that the UAV is going to follow. Target identification allows a fast target localization, even in the presence of occlusions. The AR tool is implemented following the North Atlantic Treaty Organization (NATO) standards so that it can be used in different GCSs. The experiments show how the AR tool improves significantly the situational awareness of the UAV operators.

  16. Laparoscopic stereoscopic augmented reality: toward a clinically viable electromagnetic tracking solution.

    PubMed

    Liu, Xinyang; Kang, Sukryool; Plishker, William; Zaki, George; Kane, Timothy D; Shekhar, Raj

    2016-10-01

    The purpose of this work was to develop a clinically viable laparoscopic augmented reality (AR) system employing stereoscopic (3-D) vision, laparoscopic ultrasound (LUS), and electromagnetic (EM) tracking to achieve image registration. We investigated clinically feasible solutions to mount the EM sensors on the 3-D laparoscope and the LUS probe. This led to a solution of integrating an externally attached EM sensor near the imaging tip of the LUS probe, only slightly increasing the overall diameter of the probe. Likewise, a solution for mounting an EM sensor on the handle of the 3-D laparoscope was proposed. The spatial image-to-video registration accuracy of the AR system was measured to be [Formula: see text] and [Formula: see text] for the left- and right-eye channels, respectively. The AR system contributed 58-ms latency to stereoscopic visualization. We further performed an animal experiment to demonstrate the use of the system as a visualization approach for laparoscopic procedures. In conclusion, we have developed an integrated, compact, and EM tracking-based stereoscopic AR visualization system, which has the potential for clinical use. The system has been demonstrated to achieve clinically acceptable accuracy and latency. This work is a critical step toward clinical translation of AR visualization for laparoscopic procedures.

  17. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research

    NASA Astrophysics Data System (ADS)

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2013-08-01

    Augmented reality (AR) is currently considered as having potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper first identifies two major approaches to utilizing AR technology in science education, referred to as image-based AR and location-based AR. These approaches may result in different affordances for science learning. It is then found that students' spatial ability, practical skills, and conceptual understanding are often afforded by image-based AR, while location-based AR usually supports inquiry-based scientific activities. After examining what has been done in science learning with AR supports, several suggestions for future research are proposed. For example, more research is required to explore learning experience (e.g., motivation or cognitive load) and learner characteristics (e.g., spatial ability or perceived presence) involved in AR. Mixed methods of investigating learning process (e.g., a content analysis and a sequential analysis) and in-depth examination of user experience beyond usability (e.g., affective variables of esthetic pleasure or emotional fulfillment) should be considered. Combining image-based and location-based AR technology may bring new possibilities for supporting science learning. Theories including mental models, spatial cognition, situated cognition, and social constructivist learning are suggested for the profitable uses of future AR research in science education.

  18. Evaluation of an augmented virtual reality and haptic control interface for psychomotor training.

    PubMed

    Kaber, David; Tupler, Larry A; Clamann, Michael; Gil, Guk-Ho; Zhu, Biwen; Swangnetr, Manida; Jeon, Wooram; Zhang, Yu; Qin, Xiaofeng; Ma, Wenqi; Lee, Yuan-Shin

    2014-01-01

    This study investigated the design of a virtual reality (VR) simulation integrating a haptic control interface for motor skill training. Twenty-four healthy participants were tested and trained in standardized psychomotor control tasks using native and VR forms with their nondominant hands in order to identify VR design features that might serve to accelerate motor learning. The study was also intended to make preliminary observations on the degree of specific motor skill development that can be achieved with a VR-based haptic simulation. Results revealed significant improvements in test performance following training for the VR with augmented haptic features with insignificant findings for the native task and VR with basic haptic features. Although performance during training was consistently better with the native task, a correspondence between the VR training and test task interfaces led to greater improvement in test performance as reported by a difference between baseline and post-test scores. These findings support use of VR-based haptic simulations of standardized psychomotor tests for motor skill training, including visual and haptic enhancements for effective pattern recognition and discrete movement of objects. The results may serve as an applicable guide for design of future haptic VR features.

  19. From Motion to Photons in 80 Microseconds: Towards Minimal Latency for Virtual and Augmented Reality.

    PubMed

    Lincoln, Peter; Blate, Alex; Singh, Montek; Whitted, Turner; State, Andrei; Lastra, Anselmo; Fuchs, Henry

    2016-04-01

    We describe an augmented reality, optical see-through display based on a DMD chip with an extremely fast (16 kHz) binary update rate. We combine the techniques of post-rendering 2-D offsets and just-in-time tracking updates with a novel modulation technique for turning binary pixels into perceived gray scale. These processing elements, implemented in an FPGA, are physically mounted along with the optical display elements in a head tracked rig through which users view synthetic imagery superimposed on their real environment. The combination of mechanical tracking at near-zero latency with reconfigurable display processing has given us a measured average of 80 µs of end-to-end latency (from head motion to change in photons from the display) and also a versatile test platform for extremely-low-latency display systems. We have used it to examine the trade-offs between image quality and cost (i.e. power and logical complexity) and have found that quality can be maintained with a fairly simple display modulation scheme.
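
    One generic way to turn fast binary frames into perceived gray levels is per-pixel pulse-density (sigma-delta) modulation, sketched below; this is an illustration of the general idea, not the paper's specific modulation scheme:

```python
import numpy as np

def binary_frames_from_gray(gray, n_frames):
    """Generate binary frames whose temporal average approximates `gray`.

    gray: 2-D float array of target intensities in [0, 1].
    A simple per-pixel sigma-delta accumulator: emit a 1 whenever the running
    error crosses 0.5, so the duty cycle over many fast frames matches the target.
    (Illustrative only; the paper's modulation technique differs in detail.)
    """
    error = np.zeros_like(gray, dtype=float)
    for _ in range(n_frames):
        error += gray
        frame = error >= 0.5
        error -= frame  # subtract 1 wherever a pulse was emitted
        yield frame.astype(np.uint8)

# Example: the time average of 256 binary frames approximates the gray target.
target = np.linspace(0, 1, 16).reshape(4, 4)
average = np.mean(list(binary_frames_from_gray(target, 256)), axis=0)
```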

  20. Clinical application of navigation surgery using augmented reality in the abdominal field.

    PubMed

    Okamoto, Tomoyoshi; Onda, Shinji; Yanaga, Katsuhiko; Suzuki, Naoki; Hattori, Asaki

    2015-04-01

    This article presents general principles and recent advancements in the clinical application of augmented reality-based navigation surgery (AR-based NS) for abdominal procedures and includes a description of our clinical trial and subsequent outcomes. Moreover, current problems and future aspects are discussed. The development of AR-based NS in the abdomen has lagged behind other fields because of intraoperative organ deformation and the existence of established modalities. Although there are only a few reports on the clinical use of AR-based NS in digestive surgery, sophisticated technologies have often been reported in urology. However, the rapidly spreading use of video- or robot-assisted surgery calls for this technology. We have worked to develop a system of AR-based NS for hepatobiliary and pancreatic surgery and developed a short rigid scope that enables surgeons to obtain a 3D view. We recently focused on pancreatic surgery, because intraoperative organ shifting is minimal. The position of each organ in the overlaid image corresponded closely with that of the actual organ, with a mean registration error of about 5 mm. Intraoperative information generated from this system provided us with useful navigation. However, AR-based NS still has several problems to overcome, such as organ deformation, evaluation of utility, portability, and cost.

  1. An augmented reality system for patient-specific guidance of cardiac catheter ablation procedures.

    PubMed

    De Buck, Stijn; Maes, Frederik; Ector, Joris; Bogaert, Jan; Dymarkowski, Steven; Heidbüchel, Hein; Suetens, Paul

    2005-11-01

    We present a system to assist in the treatment of cardiac arrhythmias by catheter ablation. A patient-specific three-dimensional (3-D) anatomical model, constructed from magnetic resonance images, is merged with fluoroscopic images in an augmented reality environment that enables the transfer of electrocardiography (ECG) measurements and cardiac activation times onto the model. Accurate mapping is realized through the combination of: a new calibration technique, adapted to catheter guided treatments; a visual matching registration technique, allowing the electrophysiologist to align the model with contrast-enhanced images; and the use of virtual catheters, which enable the annotation of multiple ECG measurements on the model. These annotations can be visualized by color coding on the patient model. We provide an accuracy analysis of each of these components independently. Based on simulation and experiments, we determined a segmentation error of 0.6 mm, a calibration error in the order of 1 mm and a target registration error of 1.04 +/- 0.45 mm. The system provides a 3-D visualization of the cardiac activation pattern which may facilitate and improve diagnosis and treatment of the arrhythmia. Because of its low cost and similar advantages, we believe our approach can compete with existing commercial solutions, which rely on dedicated hardware and costly catheters. We provide qualitative results of the first clinical use of the system in 11 ablation procedures.

  2. Real-time geometric registration using feature landmark database for augmented reality applications

    NASA Astrophysics Data System (ADS)

    Taketomi, Takafumi; Sato, Tomokazu; Yokoya, Naokazu

    2009-02-01

    In the field of augmented reality, it is important to solve the geometric registration problem between the real and virtual worlds. To solve this problem, many kinds of image-based online camera parameter estimation methods have been proposed. As one of these methods, we have previously proposed a feature-landmark-based camera parameter estimation method, in which extrinsic camera parameters are estimated from corresponding landmarks and image features. Although the method works in large and complex environments, it cannot run in real time due to the high computational cost of the matching process. Additionally, initial camera parameters for the first frame must be given manually. In this study, we realize real-time, manual-initialization-free camera parameter estimation based on a feature landmark database. To reduce the computational cost of the matching process, the number of matching candidates is reduced by using priorities of landmarks that are determined from previously captured video sequences. The initial camera parameters for the first frame are determined by a voting scheme over the target space using the matching candidates. To demonstrate the effectiveness of the proposed method, applications of landmark-based real-time camera parameter estimation are presented in outdoor environments.
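
    A minimal sketch of priority-limited matching against a landmark database, with a Lowe-style ratio test; the data layout and names are hypothetical, and the voting-based initialization is omitted:

```python
import numpy as np

def match_with_priorities(frame_descriptors, landmarks, max_candidates=200, ratio=0.8):
    """Match frame features against a landmark database, limited to high-priority landmarks.

    landmarks: list of dicts with 'descriptor' (1-D array), 'priority' (float),
               and 'position' (3-D point); the priority would come from how often
               the landmark was usable in previously captured video sequences.
    Returns (feature_index, landmark) pairs passing a nearest-neighbour ratio test.
    """
    # Restrict the search to the highest-priority landmarks to cut matching cost.
    candidates = sorted(landmarks, key=lambda l: l['priority'], reverse=True)[:max_candidates]
    db = np.stack([l['descriptor'] for l in candidates])
    matches = []
    for i, desc in enumerate(frame_descriptors):
        dists = np.linalg.norm(db - desc, axis=1)
        best, second = np.argsort(dists)[:2]
        if dists[best] < ratio * dists[second]:  # Lowe-style ratio test
            matches.append((i, candidates[best]))
    return matches
```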

  3. Providing IoT Services in Smart Cities through Dynamic Augmented Reality Markers

    PubMed Central

    Chaves-Diéguez, David; Pellitero-Rivero, Alexandre; García-Coego, Daniel; González-Castaño, Francisco Javier; Rodríguez-Hernández, Pedro Salvador; Piñeiro-Gómez, Óscar; Gil-Castiñeira, Felipe; Costa-Montenegro, Enrique

    2015-01-01

    Smart cities are expected to improve the quality of life of citizens by relying on new paradigms, such as the Internet of Things (IoT) and its capacity to manage and interconnect thousands of sensors and actuators scattered across the city. At the same time, mobile devices widely assist professional and personal everyday activities. A very good example of the potential of these devices for smart cities is their powerful support for intuitive service interfaces (such as those based on augmented reality (AR)) for non-expert users. In our work, we consider a scenario that combines IoT and AR within a smart city maintenance service to improve the accessibility of sensor and actuator devices in the field, where responsiveness is crucial. In it, depending on the location and needs of each service, data and commands will be transported by an urban communications network or consulted on the spot. Direct AR interaction with urban objects has already been described; it usually relies on 2D visual codes to deliver object identifiers (IDs) to the rendering device to identify object resources. These IDs allow information about the objects to be retrieved from a remote server. In this work, we present a novel solution that replaces static AR markers with dynamic markers based on LED communication, which can be decoded through cameras embedded in smartphones. These dynamic markers can directly deliver sensor information to the rendering device, on top of the object ID, without further network interaction. PMID:26151215

  4. Augmented Reality Tool for the Situational Awareness Improvement of UAV Operators

    PubMed Central

    Ruano, Susana; Cuevas, Carlos; Gallego, Guillermo; García, Narciso

    2017-01-01

    Unmanned Aerial Vehicles (UAVs) are being extensively used nowadays. Therefore, pilots of traditional aerial platforms should adapt their skills to operate them from a Ground Control Station (GCS). Common GCSs provide information in separate screens: one presents the video stream while the other displays information about the mission plan and information coming from other sensors. To avoid the burden of fusing information displayed in the two screens, an Augmented Reality (AR) tool is proposed in this paper. The AR system has two functionalities for Medium-Altitude Long-Endurance (MALE) UAVs: route orientation and target identification. Route orientation allows the operator to identify the upcoming waypoints and the path that the UAV is going to follow. Target identification allows a fast target localization, even in the presence of occlusions. The AR tool is implemented following the North Atlantic Treaty Organization (NATO) standards so that it can be used in different GCSs. The experiments show how the AR tool improves significantly the situational awareness of the UAV operators. PMID:28178189

  5. 3D Building Reconstruction by Multiview Images and the Integrated Application with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Hwang, Jin-Tsong; Chu, Ting-Chen

    2016-10-01

    This study presents an approach wherein photographs with a high degree of overlap are taken with a digital camera and used to generate three-dimensional (3D) point clouds via feature point extraction and matching. To reconstruct a building model, an unmanned aerial vehicle (UAV) is used to take photographs from vertical shooting angles above the building. Multiview images are taken from the ground to eliminate the shielding effect on UAV images caused by trees. Point clouds from the UAV and multiview images are generated via Pix4Dmapper. By merging the two sets of point clouds via tie points, the complete building model is reconstructed. The 3D models are reconstructed using AutoCAD 2016 to generate vectors from the point clouds; SketchUp Make 2016 is used to rebuild a complete building model with textures. To apply 3D building models in urban planning and design, a modern approach is to rebuild the digital models; however, replacing the landscape design and building distribution in real time is difficult as the frequency of building replacement increases. One potential solution to these problems is augmented reality (AR). By using Unity3D and Vuforia to design and implement the smartphone application, a markerless AR view of the building model can be built. This study aims to provide technical and design skills related to urban planning, urban design, and building information retrieval using AR.

  6. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study

    PubMed Central

    Suenaga, Hideyuki; Hoang Tran, Huy; Liao, Hongen; Masamune, Ken; Dohi, Takeyoshi; Hoshi, Kazuto; Mori, Yoshiyuki; Takato, Tsuyoshi

    2013-01-01

    To evaluate the feasibility and accuracy of a three-dimensional augmented reality system incorporating integral videography for imaging oral and maxillofacial regions, based on preoperative computed tomography data. Three-dimensional surface models of the jawbones, based on the computed tomography data, were used to create the integral videography images of a subject's maxillofacial area. The three-dimensional augmented reality system (integral videography display, computed tomography, a position tracker and a computer) was used to generate a three-dimensional overlay that was projected on the surgical site via a half-silvered mirror. Thereafter, a feasibility study was performed on a volunteer. The accuracy of this system was verified on a solid model while simulating bone resection. Positional registration was attained by identifying and tracking the patient/surgical instrument's position. Thus, integral videography images of jawbones, teeth and the surgical tool were superimposed in the correct position. Stereoscopic images viewed from various angles were accurately displayed. Change in the viewing angle did not negatively affect the surgeon's ability to simultaneously observe the three-dimensional images and the patient, without special glasses. The difference in three-dimensional position of each measuring point on the solid model and augmented reality navigation was almost negligible (<1 mm); this indicates that the system was highly accurate. This augmented reality system was highly accurate and effective for surgical navigation and for overlaying a three-dimensional computed tomography image on a patient's surgical area, enabling the surgeon to understand the positional relationship between the preoperative image and the actual surgical site, with the naked eye. PMID:23703710

  7. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study.

    PubMed

    Suenaga, Hideyuki; Hoang Tran, Huy; Liao, Hongen; Masamune, Ken; Dohi, Takeyoshi; Hoshi, Kazuto; Mori, Yoshiyuki; Takato, Tsuyoshi

    2013-06-01

    To evaluate the feasibility and accuracy of a three-dimensional augmented reality system incorporating integral videography for imaging oral and maxillofacial regions, based on preoperative computed tomography data. Three-dimensional surface models of the jawbones, based on the computed tomography data, were used to create the integral videography images of a subject's maxillofacial area. The three-dimensional augmented reality system (integral videography display, computed tomography, a position tracker and a computer) was used to generate a three-dimensional overlay that was projected on the surgical site via a half-silvered mirror. Thereafter, a feasibility study was performed on a volunteer. The accuracy of this system was verified on a solid model while simulating bone resection. Positional registration was attained by identifying and tracking the patient/surgical instrument's position. Thus, integral videography images of jawbones, teeth and the surgical tool were superimposed in the correct position. Stereoscopic images viewed from various angles were accurately displayed. Change in the viewing angle did not negatively affect the surgeon's ability to simultaneously observe the three-dimensional images and the patient, without special glasses. The difference in three-dimensional position of each measuring point on the solid model and augmented reality navigation was almost negligible (<1 mm); this indicates that the system was highly accurate. This augmented reality system was highly accurate and effective for surgical navigation and for overlaying a three-dimensional computed tomography image on a patient's surgical area, enabling the surgeon to understand the positional relationship between the preoperative image and the actual surgical site, with the naked eye.

  8. Mad City Mystery: Developing Scientific Argumentation Skills with a Place-based Augmented Reality Game on Handheld Computers

    NASA Astrophysics Data System (ADS)

    Squire, Kurt D.; Jan, Mingfong

    2007-02-01

    While the knowledge economy has reshaped the world, schools lag behind in producing appropriate learning for this social change. Science education needs to prepare students for a future world in which multiple representations are the norm and adults are required to "think like scientists." Location-based augmented reality games offer an opportunity to create a "post-progressive" pedagogy in which students are not only immersed in authentic scientific inquiry, but also required to perform in adult scientific discourses. This cross-case comparison as a component of a design-based research study investigates three cases (roughly 28 students total) where an Augmented Reality curriculum, Mad City Mystery, was used to support learning in environmental science. We investigate whether augmented reality games on handhelds can be used to engage students in scientific thinking (particularly argumentation), how game structures affect students' thinking, the impact of role playing on learning, and the role of the physical environment in shaping learning. We argue that such games hold potential for engaging students in meaningful scientific argumentation. Through game play, players are required to develop narrative accounts of scientific phenomena, a process that requires them to develop and argue scientific explanations. We argue that specific game features scaffold this thinking process, creating supports for student thinking non-existent in most inquiry-based learning environments.

  9. Clinical usefulness of augmented reality using infrared camera based real-time feedback on gait function in cerebral palsy: a case study

    PubMed Central

    Lee, Byoung-Hee

    2016-01-01

    [Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials. PMID:27190489

  10. Clinical usefulness of augmented reality using infrared camera based real-time feedback on gait function in cerebral palsy: a case study.

    PubMed

    Lee, Byoung-Hee

    2016-04-01

    [Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials.

  11. Valorisation of Cultural Heritage Through Virtual Visit and Augmented Reality: the Case of the Abbey of Epau (France)

    NASA Astrophysics Data System (ADS)

    Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.

    2013-07-01

    Terrestrial Laser Scanning (TLS), 3-D modeling and Web visualization are the three key steps needed to store cultural heritage and grant free, wide access to it, as highlighted in many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey which has both a rich history and delicate architecture. Virtuality is considered in two ways: fluid navigation around the abbey in a virtual reality environment, and a game activity using augmented reality. First, data acquisition consists of GPS and tacheometric surveying, terrestrial laser scanning, and photography. After data pre-processing, the meshed and textured 3-D model is generated using 3-D Reshaper commercial software. The virtual reality visit and augmented reality animation are then created using Unity software. This work shows the value of such tools in highlighting regional cultural heritage and making it attractive to the public.

  12. Gas-Generator Augmented Expander Cycle Rocket Engine

    NASA Technical Reports Server (NTRS)

    Greene, William D. (Inventor)

    2011-01-01

    An augmented expander cycle rocket engine includes first and second turbopumps for respectively pumping fuel and oxidizer. A gas-generator receives a first portion of fuel output from the first turbopump and a first portion of oxidizer output from the second turbopump to ignite and discharge heated gas. A heat exchanger close-coupled to the gas-generator receives in a first conduit the discharged heated gas, and transfers heat to an adjacent second conduit carrying fuel exiting the cooling passages of a primary combustion chamber. Heat is transferred to the fuel passing through the cooling passages. The heated fuel enters the second conduit of the heat exchanger to absorb more heat from the first conduit, and then flows to drive a turbine of one or both of the turbopumps. The arrangement prevents the turbopumps' exposure to combusted gas that could freeze in the turbomachinery and cause catastrophic failure upon attempted engine restart.

  13. Development and Application of a New Learning Object for Teaching Operative Dentistry Using Augmented Reality.

    PubMed

    Espejo-Trung, Luciana Cardoso; Elian, Silvia Nagib; Luz, Maria Aparecia Alves De Cerqueira

    2015-11-01

    Learning objects (LOs) associated with augmented reality have been used as attractive new technologic tools in the educational process. However, the acceptance of new LOs must be verified before such innovations are used in the learning process in general. The aim of this study was to develop a new LO for teaching gold onlay preparation design and to investigate its acceptance at a dental school in Brazil. Questionnaires were designed to assess, first, the users' computational ability and knowledge of computers (Q1) and, second, the users' acceptance of the new LO (Q2). For both questionnaires, the internal consistency index was calculated to determine whether the questions were measuring the same construct. The reliability of Q2 was measured with a retest procedure. The LO was tested by dental students (n=28), professors and postgraduate students in dentistry and prosthetics (n=30), and dentists participating in a continuing education or remedial course in dentistry and/or prosthetics (n=19). Analyses of internal consistency (Kappa coefficient and Cronbach's alpha) demonstrated a high degree of confidence in the questionnaires. Tests for simple linear regressions were conducted between the response variable (Q2) and the following explanative variables: the Q1 score, age, gender, and group. The results showed wide acceptance regardless of the subjects' computational ability (p=0.99; R2=0), gender (p=0.27; R2=1.6%), age (p=0.27; R2=0.1%), or group (p=0.53; R2=1.9%). The methodology used enabled the development of an LO with a high index of acceptance for all groups.
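
    For reference, Cronbach's alpha as used in such internal-consistency checks can be computed directly from a respondents-by-items score matrix; a minimal sketch with assumed names:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    X = np.asarray(item_scores, dtype=float)
    n_items = X.shape[1]
    sum_item_variances = X.var(axis=0, ddof=1).sum()   # variance of each item, summed
    total_variance = X.sum(axis=1).var(ddof=1)         # variance of respondents' total scores
    return n_items / (n_items - 1) * (1 - sum_item_variances / total_variance)
```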

  14. FreshAiR and Field Studies—Augmenting Geological Reality with Mobile Devices

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Crompton, H.; Dunleavy, M.

    2014-12-01

    During the last decade, mobile devices have fomented a revolution in geological mapping. President Clinton set the stage for this revolution in 2000 when he ordered an end to Selective Availability, making reliable GPS available for civilian use. Geologists began using personal digital assistants and ruggedized tablet PCs for geolocation and data recording, and the pace of change accelerated with the development of mobile apps such as Google Maps, digital notebooks, and digital compass-clinometers. Despite these changes in map-making technologies, most students continue to learn geology in the field the old-fashioned way, by following a field trip leader as a group and trying to hear and understand lecturettes at the outcrop. In this presentation, we demonstrate the potential of a new Augmented Reality (AR) mobile app called "FreshAiR" to fundamentally change the way content knowledge and learning objectives are delivered to students in the field. FreshAiR, which was developed by co-author and ODU alumnus M.D., triggers content delivery to mobile devices based on proximity. Students holding their mobile devices up to the horizon see trigger points superimposed on the field of view of the device's built-in camera. When they walk towards a trigger, information about the location pops up. This can include text, images, movies, and quiz questions (multiple choice and fill-in-the-blank). Students can use the app to reinforce the field trip leader's presentations, or they can visit outcrops individually at different times. This creates the possibility of asynchronous field classes, a concept that has profound implications for distance education in the geosciences.

  15. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.

    PubMed

    Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K

    2013-05-01

    Augmented reality (AR) is a technology which enables users to see the real world, with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were scanned using a 64-channel multidetector. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group) and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire which was administered after using the simulator. The group that was trained using an AR simulator were more proficient at IV injection technique using real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education.

  16. Enhanced Lighting Techniques and Augmented Reality to Improve Human Task Performance

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles K.; Pace, John W.

    2005-01-01

    One of the most versatile tools designed for use on the International Space Station (ISS) is the Special Purpose Dexterous Manipulator (SPDM) robot. Operators for this system are trained at NASA Johnson Space Center (JSC) using a robotic simulator, the Dexterous Manipulator Trainer (DMT), which performs most SPDM functions under normal static Earth gravitational forces. The SPDM is controlled from a standard Robotic Workstation. A key feature of the SPDM and DMT is the Force/Moment Accommodation (FMA) system, which limits the contact forces and moments acting on the robot components, on its payload, an Orbital Replaceable Unit (ORU), and on the receptacle for the ORU. The FMA system helps to automatically alleviate any binding of the ORU as it is inserted or withdrawn from a receptacle, but it is limited in its correction capability. A successful ORU insertion generally requires that the reference axes of the ORU and receptacle be aligned to within approximately 0.25 inch and 0.5 degree of nominal values. The only guides available for the operator to achieve these alignment tolerances are views from any available video cameras. No special registration markings are provided on the ORU or receptacle, so the operator must use their intrinsic features in the video display to perform the pre-insertion alignment task. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit viewing periods, long times are anticipated for performing some ORU insertion or extraction operations. This study explored the feasibility of using augmented reality (AR) to assist with SPDM operations. Geometric graphical symbols were overlaid on the end effector (EE) camera view to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment.

  17. Using a Collaborative Mobile Augmented Reality Learning Application (CoMARLA) to Improve Student Learning

    NASA Astrophysics Data System (ADS)

    Hanafi, Hafizul Fahri bin; Soh Said, Che; Hanee Ariffin, Asma; Azlan Zainuddin, Nur; Samsuddin, Khairulanuar

    2016-11-01

    This study was carried out to improve student learning in an ICT course using a collaborative mobile augmented reality learning application (CoMARLA). This learning application was developed based on the constructivist framework that would engender a collaborative learning environment, in which students could learn collaboratively using their mobile phones. The research design was based on the pretest–posttest control-group design. The dependent variable was students’ learning performance after learning, and the independent variables were learning method and gender. Students’ learning performance before learning was treated as the covariate. The sample of the study comprised 120 non-IT (non-technical) undergraduates, with a mean age of 19.5. They were randomized into two groups, namely the experimental and control group. The experimental group used CoMARLA to learn one of the topics of the ICT Literacy course, namely Computer System, whereas the control group learned using the conventional approach. The research instrument used was a set of multiple-choice questions pertaining to the above topic. Pretesting was carried out before the learning sessions, and posttesting was performed after 6 hours of learning. Analysis of covariance (ANCOVA) was performed on the data using SPSS. The analysis showed that there were main effects attributed to the learning method and gender. The experimental group outperformed the control group by almost 9%, and male students outperformed their female counterparts by as much as 3%. Furthermore, an interaction effect was also observed, showing differential performance of male students based on the learning methods, which did not occur among female students. Hence, the tool can be used to help undergraduates learn with greater efficacy when contextualized in an appropriate setting.
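
    The analysis described above (post-test score as the dependent variable, learning method and gender as factors, and pre-test score as the covariate) can be outlined with statsmodels instead of SPSS. This is a sketch only; the data file and column names are assumptions.

        # Sketch of the reported ANCOVA using statsmodels rather than SPSS.
        # The file and column names ('posttest', 'pretest', 'method', 'gender')
        # are assumptions; any long-format table of the 120 scores would do.
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        scores = pd.read_csv("comarla_scores.csv")   # hypothetical data file

        # Pre-test enters as the covariate; method and gender as factors with
        # their interaction, mirroring the design described above.
        model = ols("posttest ~ pretest + C(method) * C(gender)", data=scores).fit()
        print(sm.stats.anova_lm(model, typ=2))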

  18. Augmented Reality Cues to Assist Older Drivers with Gap Estimation for Left-Turns

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Lee, John D.; Dawson, Jeffrey D.; Rizzo, Matthew

    2014-01-01

    The objective of this study was to assess the effects of augmented reality (AR) cues designed to assist middle-aged and older drivers, with a range of useful field of view (UFOV) impairments, in judging when to make left turns across oncoming traffic. Previous studies have shown that AR cues can help middle-aged and older drivers respond to potential roadside hazards by increasing hazard detection without interfering with other driving tasks. Intersections pose a critical challenge for cognitively impaired drivers, who are prone to misjudge time-to-contact with oncoming traffic. We investigated whether AR cues improve or interfere with hazard perception in left turns across oncoming traffic for drivers with age-related cognitive decline. Sixty-four middle-aged and older drivers with a range of UFOV impairments judged when it would be safe to turn left across oncoming traffic approaching the driver from the opposite direction in a rural stop-sign controlled intersection scenario implemented in a static base driving simulator. Outcome measures used to evaluate the effectiveness of AR cueing included: Time-to-Contact (TTC), Gap Time Variation (GTV), Response Rate, and Gap Response Variation (GRV). All drivers' estimated TTCs were shorter in cued than in uncued conditions. In addition, drivers responded more often in cued conditions than in uncued conditions, and GRV decreased for all drivers in scenarios that contained AR cues. For both TTC and response rate, drivers also appeared to adjust their behavior to be consistent with the cues, especially drivers with the poorest UFOV scores (matching their behavior to be close to middle-aged drivers). Driver ratings indicated that cueing was not considered to be distracting. Further, various conditions of reliability (e.g., 15% miss rate) did not appear to affect performance or driver ratings. PMID:24950128

  19. Virtual Reality Used to Serve the Glenn Engineering Community

    NASA Technical Reports Server (NTRS)

    Carney, Dorothy V.

    2001-01-01

    There are a variety of innovative new visualization tools available to scientists and engineers for the display and analysis of their models. At the NASA Glenn Research Center, we have an ImmersaDesk, a large, single-panel, semi-immersive display device. This versatile unit can interactively display three-dimensional images in visual stereo. Our challenge is to make this virtual reality platform accessible and useful to researchers. An example of a successful application of this computer technology is the display of blade out simulations. NASA Glenn structural dynamicists, Dr. Kelly Carney and Dr. Charles Lawrence, funded by the Ultra Safe Propulsion Project under Base R&T, are researching blade outs, when turbine engines lose a fan blade during operation. Key objectives of this research include minimizing danger to the aircraft via effective blade containment, predicting destructive loads due to the imbalance following a blade loss, and identifying safe, cost-effective designs and materials for future engines.

  20. Augmented Reality on a C-Arm System: A Preclinical Assessment for Percutaneous Needle Localization.

    PubMed

    Racadio, John M; Nachabe, Rami; Homan, Robert; Schierling, Ross; Racadio, Judy M; Babić, Draženko

    2016-10-01

    Purpose: To compare the navigational accuracy and radiation dose during needle localization of targets for augmented reality (AR) with and without motion compensation (MC) versus those for cone-beam computed tomography (CT) with real-time fluoroscopy navigation in a pig model. Materials and Methods: This study was approved by the Institutional Animal Care and Use Committee. Three operators each localized 15 targets (bone fragments) approximately 7 cm deep in the paraspinal muscles of nine Yorkshire pigs by using each of the three modalities (AR with and without MC and cone-beam CT with fluoroscopy). Target depth, accuracy (distance between needle tip and target), and radiation dose (dose-area product [DAP]) were recorded for each procedure. Correlation between accuracy and depth of target was assessed by using the Pearson correlation coefficient. Two-way analysis of variance was used for differentiating accuracy and DAPs across navigation techniques and operator backgrounds. Results: There was no correlation between depth of target and accuracy. There was no significant difference in accuracy between modalities (mean distance, 3.0 mm ± 1.9 [standard deviation] for cone-beam CT with fluoroscopy, 2.5 mm ± 2.0 for AR, and 3.2 mm ± 2.7 for AR with MC [P = .33]). There was, however, a significant difference in fluoroscopy radiation dose (10.4 Gy·cm² ± 10.6 for cone-beam CT fluoroscopy, 2.3 Gy·cm² ± 2.4 for AR, and 3.3 Gy·cm² ± 4.6 for AR with MC [P < .05]) and therefore in total procedural radiation dose (20.5 Gy·cm² ± 13.4 for cone-beam CT fluoroscopy, 12.6 Gy·cm² ± 5.3 for AR, 13.6 Gy·cm² ± 7.4 for AR with MC [P < .05]). Conclusion: Use of an AR C-arm system reduces radiation dose while maintaining navigational accuracy compared with cone-beam CT fluoroscopy during image-guided percutaneous needle placement in a pig model. (©) RSNA, 2016. Online supplemental material is available for this article.

  1. Testing Augmented Reality for Cue Exposure in Obese Patients: An Exploratory Study.

    PubMed

    Pallavicini, Federica; Serino, Silvia; Cipresso, Pietro; Pedroli, Elisa; Chicchi Giglioli, Irene Alice; Chirico, Alice; Manzoni, Gian Mauro; Castelnuovo, Gianluca; Molinari, Enrico; Riva, Giuseppe

    2016-02-01

    Binge eating is one of the key behaviors in relation to the etiology and severity of obesity. Cue exposure with response prevention consists of exposing patients to binge foods while actual eating is not allowed. Augmented reality (AR) has the potential to change the way cue exposure is administered, but very few prior studies have been conducted so far. Starting from these premises, this study aimed to (a) investigate whether AR foods elicit emotional responses comparable to those produced by the real stimuli, (b) study differences between obese and control participants in terms of emotional responses to food, and (c) compare emotional responses to different categories of foods. To reach these goals, we assessed the emotional responses to high-calorie (savory and sweet) and low-calorie food stimuli, presented through different exposure conditions (real, photographic, and AR), in 15 obese participants (age, 44.6 ± 13 years; body mass index [BMI], 44.2 ± 8.1) and 15 control participants (age, 43.7 ± 12.8 years; BMI, 21.2 ± 1.4). The State-Trait Anxiety Inventory was used for the assessment of state anxiety, and it was administered both before and after the exposure to foods, along with the Visual Analog Scale (VAS) for Hunger and Happiness. To assess perceived pleasantness, the VAS for Palatability was administered after the exposure to food stimuli. Heart rate, skin conductance response, and facial corrugator supercilii muscle activation were recorded. Although preliminary, the results showed that (a) AR food stimuli were perceived to be as palatable as real stimuli, and they also triggered a similar arousal response; (b) obese individuals showed lower happiness after the exposure to food compared to control participants, with regard to both psychological and physiological responses; and (c) high-calorie savory (vs. low-calorie) food stimuli were perceived by all the participants to be more palatable, and they triggered a greater arousal response.

  2. Applications of Panoramic Images: from 720° Panorama to Interior 3D Models of Augmented Reality

    NASA Astrophysics Data System (ADS)

    Lee, I.-C.; Tsai, F.

    2015-05-01

    A series of panoramic images are usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour guiding systems, in this research, we demonstrate the potential of using panoramic images acquired from multiple sites to create not only 720° panoramas, but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, and these panoramas can be used directly as panorama guiding systems or for other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama. These parameters are focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. In this research, Trimble SketchUp was used to build the model, and the 3D point cloud was used to determine the locations of building objects using a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an Augmented Reality model replacing a guide map or a floor plan commonly used in an on-line touring guide system. The 3D indoor model generating procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and the Taipei Main Station pedestrian zone guidance and navigation system. The
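
    As an illustration of the first step of this pipeline, the sketch below stitches overlapping photographs into a panorama with OpenCV's high-level Stitcher. It is not the tool chain used in the study (which relied on VisualSFM, CMVS, and SketchUp for the later stages), and the file names are assumptions.

        # Sketch: stitch overlapping photographs into a panorama with OpenCV.
        # File names are illustrative; the study's own software is not reproduced.
        import cv2

        image_paths = ["site_01.jpg", "site_02.jpg", "site_03.jpg"]  # hypothetical
        images = [cv2.imread(p) for p in image_paths]

        stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
        status, panorama = stitcher.stitch(images)

        if status == cv2.Stitcher_OK:
            cv2.imwrite("panorama.jpg", panorama)
        else:
            print(f"Stitching failed with status code {status}")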

  3. Three-Dimensional Path Planning and Guidance of Leg Vascular Based on Improved Ant Colony Algorithm in Augmented Reality.

    PubMed

    Gao, Ming-ke; Chen, Yi-min; Liu, Quan; Huang, Chen; Li, Ze-yu; Zhang, Dian-hua

    2015-11-01

    Preoperative path planning plays a critical role in vascular access surgery. Vascular access surgery is particularly difficult and requires long training periods as well as precise operation. Because surgeons are at different skill levels, large-diameter blood vessels are usually chosen for surgery and other, possibly optimal, paths are not considered. Moreover, patients and surgeons suffer from X-ray radiation during the surgical procedure. This study proposed an improved ant colony algorithm to plan an optimal three-dimensional vascular path that takes into account factors such as catheter diameter, vessel length and diameter, curvature, and torsion. To protect the doctor and patient from long-term X-ray exposure, the paper adopted augmented reality technology to register the reconstructed vascular model with the physical model, located the catheter with an electromagnetic tracking system, and used a head-mounted display to show the planned path in real time and monitor the catheter push procedure. The experiment demonstrates the reasonableness of the preoperative path planning and proves the reliability of the algorithm. The augmented reality experiment accurately displays, in real time, the vascular phantom model, the planned path, and the catheter trajectory, and proves the feasibility of this method. The paper presented a useful and feasible surgical scheme based on the improved ant colony algorithm to plan vascular three-dimensional paths in augmented reality. The study has practical guiding significance for preoperative path planning, intraoperative catheter guidance, and surgical training, and provides a theoretical method of path planning for vascular access surgery. It is a safe and reliable path planning approach with practical reference value.
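
    The abstract does not detail the improved algorithm, but the generic ant colony scheme it builds on (ants construct paths probabilistically, biased by pheromone and a heuristic cost, and good paths are reinforced) can be sketched as follows. Edge costs here are a single generic weight; the vessel-specific terms (catheter diameter, vessel length and diameter, curvature, torsion) would have to be folded into those weights and are not reproduced.

        # Generic ant-colony shortest-path sketch on a weighted graph. This is a
        # textbook ACO, not the paper's improved variant.
        import random

        def aco_path(graph, start, goal, n_ants=20, n_iters=50,
                     alpha=1.0, beta=2.0, rho=0.5, q=1.0):
            # graph: dict node -> dict of neighbour -> edge cost
            tau = {(u, v): 1.0 for u in graph for v in graph[u]}   # pheromone
            best_path, best_cost = None, float("inf")
            for _ in range(n_iters):
                paths = []
                for _ in range(n_ants):
                    node, path, cost, visited = start, [start], 0.0, {start}
                    while node != goal:
                        choices = [v for v in graph[node] if v not in visited]
                        if not choices:          # dead end: discard this ant
                            path = None
                            break
                        weights = [tau[(node, v)] ** alpha *
                                   (1.0 / graph[node][v]) ** beta for v in choices]
                        nxt = random.choices(choices, weights=weights)[0]
                        cost += graph[node][nxt]
                        path.append(nxt)
                        visited.add(nxt)
                        node = nxt
                    if path is not None:
                        paths.append((path, cost))
                        if cost < best_cost:
                            best_path, best_cost = path, cost
                # Evaporate, then deposit pheromone proportional to path quality.
                tau = {e: (1 - rho) * t for e, t in tau.items()}
                for path, cost in paths:
                    for u, v in zip(path, path[1:]):
                        tau[(u, v)] += q / cost
            return best_path, best_cost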

  4. Interactive augmented reality using Scratch 2.0 to improve physical activities for children with developmental disabilities.

    PubMed

    Lin, Chien-Yu; Chang, Yu-Ming

    2015-02-01

    This study uses a body motion interactive game developed in Scratch 2.0 to enhance the body strength of children with disabilities. Scratch 2.0, using an augmented-reality function on a program platform, creates real-world and virtual-reality displays at the same time. This study uses a webcam integration that tracks movements and allows participants to interact physically with the project, to enhance the motivation of children with developmental disabilities to perform physical activities. This study follows a single-case research design with an ABAB structure, in which A is the baseline and B is the intervention. The experimental period was 2 months. The experimental results demonstrated that the scores for 3 children with developmental disabilities increased considerably during the intervention phases. The developmental applications of these results are also discussed.

  5. Combining physical and virtual contexts through augmented reality: design and evaluation of a prototype using a drug box as a marker for antibiotic training.

    PubMed

    Nifakos, Sokratis; Tomson, Tanja; Zary, Nabil

    2014-01-01

    Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that, through the use of Augmented Reality technology, could make use of the real physical context and thereby enrich the educational process of antibiotic prescription. The objective is to investigate which types of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research Methodology, composed of the following main steps: problem analysis, investigation of the information that should be visualized for the training session, and finally the involvement of the end users in the development and evaluation processes of the prototype. Results. Two of the most important aspects of the antibiotic prescription process to represent in an augmented reality application are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context, such as drug boxes, and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on decreasing antibiotic resistance.

  6. Role of cranial and spinal virtual and augmented reality simulation using immersive touch modules in neurosurgical training.

    PubMed

    Alaraj, Ali; Charbel, Fady T; Birk, Daniel; Tobin, Matthew; Tobin, Mathew; Luciano, Cristian; Banerjee, Pat P; Rizzi, Silvio; Sorenson, Jeff; Foley, Kevin; Slavin, Konstantin; Roitberg, Ben

    2013-01-01

    Recent studies have shown that mental script-based rehearsal and simulation-based training improve the transfer of surgical skills in various medical disciplines. Despite significant advances in technology and intraoperative techniques over the last several decades, surgical skills training on neurosurgical operations still carries significant risk of serious morbidity or mortality. Potentially avoidable technical errors are well recognized as contributing to poor surgical outcome. Surgical education is undergoing overwhelming change as a result of the reduction of work hours and current trends focusing on patient safety and linking reimbursement with clinical outcomes. Thus, there is a need for adjunctive means of neurosurgical training; recent advances in simulation technology offer one such means. ImmersiveTouch is an augmented reality system that integrates a haptic device and a high-resolution stereoscopic display. This simulation platform uses multiple sensory modalities, re-creating many of the environmental cues experienced during an actual procedure. Modules available include ventriculostomy, bone drilling, percutaneous trigeminal rhizotomy, and simulated spinal modules such as pedicle screw placement, vertebroplasty, and lumbar puncture. We present our experience with the development of such augmented reality neurosurgical modules and the feedback from neurosurgical residents.

  7. Shifting the paradigm of music instruction: implications of embodiment stemming from an augmented reality guitar learning system

    PubMed Central

    Keebler, Joseph R.; Wiltshire, Travis J.; Smith, Dustin C.; Fiore, Stephen M.; Bedwell, Jeffrey S.

    2014-01-01

    Musical instruction often includes materials that can act as a barrier to learning. New technologies using augmented reality may aid in reducing the initial difficulties involved in learning music by lowering the barriers characteristic of traditional instructional materials. Therefore, this set of studies examined a novel augmented reality guitar learning system (i.e., the Fretlight® guitar) with regard to current theories of embodied music cognition. Specifically, we examined the effects of using this system in comparison to a standard instructional material (i.e., diagrams). First, we review major theories related to musical embodiment and specify a niche within this research space we call embodied music technology for learning. Following this, we explicate two parallel experiments that were conducted to address the learning effects of this system. Experiment 1 examined short-term learning effects within one experimental session, while Experiment 2 examined both short-term and long-term effects across two sessions spaced at a 2-week interval. Analyses demonstrated that, for many of our dependent variables, all participants increased in performance across time. Further, the Fretlight® condition consistently led to significantly better outcomes via interactive effects, including significantly better long-term retention of the learned information across a 2-week interval. These results are discussed in the context of embodied cognition theory as it relates to music. Potential limitations and avenues for future research are described. PMID:24999334

  8. Development and human factors analysis of an augmented reality interface for multi-robot tele-operation and control

    NASA Astrophysics Data System (ADS)

    Lee, Sam; Lucas, Nathan P.; Ellis, R. Darin; Pandya, Abhilash

    2012-06-01

    This paper presents a seamlessly controlled human multi-robot system comprising semiautonomous ground and aerial robots for source localization tasks. The system combines augmented reality interface capabilities with a human supervisor's ability to control multiple robots. The role of this human multi-robot interface is to allow an operator to control groups of heterogeneous robots in real time in a collaborative manner. It uses advanced path planning algorithms to ensure that obstacles are avoided and that the operators are free for higher-level tasks. Each robot knows the environment and obstacles and can automatically generate a collision-free path to any user-selected target. Sensor information from each individual robot is displayed directly on the robot in the video view. In addition, a sensor-data-fused AR view is displayed, which helps the user pinpoint source information or supports the operator with the goals of the mission. The paper presents a preliminary human factors evaluation of this system in which several interface conditions are tested for source detection tasks. Results show that the novel augmented reality multi-robot controls (Point-and-Go and Path Planning) reduced mission completion times compared with traditional joystick control for target detection missions. Usability tests and operator workload analysis are also reported.
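
    The collision-free path generation mentioned above is typically implemented with a grid or graph planner; since the paper's exact algorithm is not stated here, the sketch below uses plain A* on an occupancy grid as a stand-in.

        # A* on a 2D occupancy grid as a stand-in for the (unspecified)
        # collision-free path planner. grid[r][c] == 1 marks an obstacle.
        import heapq
        import itertools

        def astar(grid, start, goal):
            rows, cols = len(grid), len(grid[0])
            tie = itertools.count()              # tie-breaker for the heap
            def h(p):                            # Manhattan-distance heuristic
                return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
            open_set = [(h(start), next(tie), 0, start, None)]
            came_from, g_best = {}, {start: 0}
            while open_set:
                _f, _t, g, node, parent = heapq.heappop(open_set)
                if node in came_from:            # already expanded
                    continue
                came_from[node] = parent
                if node == goal:                 # reconstruct path back to start
                    path = []
                    while node is not None:
                        path.append(node)
                        node = came_from[node]
                    return path[::-1]
                r, c = node
                for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                        ng = g + 1
                        if ng < g_best.get((nr, nc), float("inf")):
                            g_best[(nr, nc)] = ng
                            heapq.heappush(open_set,
                                           (ng + h((nr, nc)), next(tie), ng, (nr, nc), node))
            return None                          # no collision-free path exists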

  9. Shifting the paradigm of music instruction: implications of embodiment stemming from an augmented reality guitar learning system.

    PubMed

    Keebler, Joseph R; Wiltshire, Travis J; Smith, Dustin C; Fiore, Stephen M; Bedwell, Jeffrey S

    2014-01-01

    Musical instruction often includes materials that can act as a barrier to learning. New technologies using augmented reality may aid in reducing the initial difficulties involved in learning music by lowering the barriers characteristic of traditional instructional materials. Therefore, this set of studies examined a novel augmented reality guitar learning system (i.e., the Fretlight® guitar) with regard to current theories of embodied music cognition. Specifically, we examined the effects of using this system in comparison to a standard instructional material (i.e., diagrams). First, we review major theories related to musical embodiment and specify a niche within this research space we call embodied music technology for learning. Following this, we explicate two parallel experiments that were conducted to address the learning effects of this system. Experiment 1 examined short-term learning effects within one experimental session, while Experiment 2 examined both short-term and long-term effects across two sessions spaced at a 2-week interval. Analyses demonstrated that, for many of our dependent variables, all participants increased in performance across time. Further, the Fretlight® condition consistently led to significantly better outcomes via interactive effects, including significantly better long-term retention of the learned information across a 2-week interval. These results are discussed in the context of embodied cognition theory as it relates to music. Potential limitations and avenues for future research are described.

  10. Approach and Evaluation of a Mobile Video-Based and Location-Based Augmented Reality Platform for Information Brokerage

    NASA Astrophysics Data System (ADS)

    Dastageeri, H.; Storz, M.; Koukofikis, A.; Knauth, S.; Coors, V.

    2016-09-01

    Providing mobile location-based information for pedestrians faces many challenges. On the one hand, the accuracy of localisation indoors and outdoors is restricted by the technical limitations of GPS and Beacons. On the other hand, only a small display is available for presenting information and building a user interface. In addition, the software solution has to take the hardware characteristics of mobile devices into account during implementation in order to achieve performance with minimum latency. This paper describes our approach, which combines image tracking with GPS or Beacons to ensure correct orientation and precise localisation. To communicate information on Points of Interest (POIs), we chose Augmented Reality (AR). For this concept of operations, we used not only the display but also the acceleration and position sensors as a user interface. The paper goes into detail on the optimization of the image tracking algorithms, the development of the video-based AR player for the Android platform, and the evaluation of videos as an AR element with a view to providing a good user experience. To set up content for the POIs, or even to generate a tour, we used and extended the Open Geospatial Consortium (OGC) standard Augmented Reality Markup Language (ARML).

  11. Thrust Augmentation Measurements Using a Pulse Detonation Engine Ejector

    NASA Technical Reports Server (NTRS)

    Santoro, Robert J.; Pal, Sibtosh

    2003-01-01

    The present NASA GRC-funded three-year research project is focused on studying PDE driven ejectors applicable to a hybrid Pulse Detonation/Turbofan Engine. The objective of the study is to characterize the PDE-ejector thrust augmentation. A PDE-ejector system has been designed to provide critical experimental data for assessing the performance enhancements possible with this technology. Completed tasks include demonstration of a thrust stand for measuring average thrust for detonation tube multi-cycle operation, and design of a 72-in.-long, 2.25-in.-diameter (ID) detonation tube and modular ejector assembly. This assembly will allow testing of both straight and contoured ejector geometries. Initial ejectors that have been fabricated are 72-in.-long-constant-diameter tubes (4-, 5-, and 6-in.-diameter) instrumented with high-frequency pressure transducers. The assembly has been designed such that the detonation tube exit can be positioned at various locations within the ejector tube. PDE-ejector system experiments with gaseous ethylene/ nitrogen/oxygen propellants will commence in the very near future. The program benefits from collaborations with Prof. Merkle of University of Tennessee whose PDE-ejector analysis helps guide the experiments. The present research effort will increase the TRL of PDE-ejectors from its current level of 2 to a level of 3.

  12. From urban planning and emergency training to Pokémon Go: applications of virtual reality GIS (VRGIS) and augmented reality GIS (ARGIS) in personal, public and environmental health.

    PubMed

    Kamel Boulos, Maged N; Lu, Zhihan; Guerrero, Paul; Jennett, Charlene; Steed, Anthony

    2017-02-20

    The latest generation of virtual and mixed reality hardware has rekindled interest in virtual reality GIS (VRGIS) and augmented reality GIS (ARGIS) applications in health, and opened up new and exciting opportunities and possibilities for using these technologies in the personal and public health arenas. From smart urban planning and emergency training to Pokémon Go, this article offers a snapshot of some of the most remarkable VRGIS and ARGIS solutions for tackling public and environmental health problems, and bringing about safer and healthier living options to individuals and communities. The article also covers the main technical foundations and issues underpinning these solutions.

  13. Applying a Testing Methodology to Augmented Reality Interfaces to Simulation Systems

    DTIC Science & Technology

    2005-01-01

    facility. ...mounted display (Sony Glasstron, Microvision Nomad, or Trivisio). This approach integrates spatial information with objects in the... Augmented vision for automotive maintenance and repair: The benefits of testing and repair data at point of task (unpublished), Microvision, Inc.

  14. Augmented Reality in the Science Museum: Lessons Learned in Scaffolding for Conceptual and Cognitive Learning

    ERIC Educational Resources Information Center

    Yoon, Susan A.; Elinich, Karen; Wang, Joyce; Van Schooneveld, Jacqueline G.

    2012-01-01

    This research follows on previous studies that investigated how digitally augmented devices and knowledge scaffolds enhance learning in a science museum. We investigated what combination of scaffolds could be used in conjunction with the unique characteristics of informal participation to increase conceptual and cognitive outcomes. 307 students…

  15. 3D optical see-through head-mounted display based augmented reality system and its application

    NASA Astrophysics Data System (ADS)

    Zhang, Zhenliang; Weng, Dongdong; Liu, Yue; Xiang, Li

    2015-07-01

    The combination of health and entertainment becomes possible thanks to the development of wearable augmented reality equipment and corresponding application software. In this paper, we implemented a fast calibration, extended from SPAAM, for an optical see-through head-mounted display (OSTHMD) built in our lab. During the calibration, tracking and recognition techniques based on natural targets were used, and the corresponding spatial points were set at dispersed and well-distributed positions. We evaluated the precision of this calibration over view angles ranging from 0 to 70 degrees. Relying on these results, we calculated the position of the human eyes relative to the world coordinate system and rendered 3D objects of arbitrary complexity in real time on the OSTHMD, accurately matched to the real world. Finally, we report user feedback on the degree of satisfaction with our device for combining entertainment with the prevention of cervical vertebra diseases.
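
    SPAAM-style calibration ultimately reduces to collecting 2D screen / 3D world correspondences and solving for a 3x4 projection matrix with a direct linear transform (DLT). The sketch below shows only that core solve; the fast-calibration extensions described in the paper are not reproduced, and the inputs are assumed arrays.

        # Direct linear transform for a 3x4 projection matrix from 2D-3D
        # correspondences, the core computation behind SPAAM-style OSTHMD
        # calibration. At least six correspondences are needed.
        import numpy as np

        def dlt_projection(world_pts, screen_pts):
            # world_pts: (N, 3) points in tracker/world coordinates
            # screen_pts: (N, 2) aligned pixel positions on the HMD screen
            rows = []
            for (X, Y, Z), (u, v) in zip(world_pts, screen_pts):
                rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
                rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
            A = np.asarray(rows, dtype=float)
            # The projection matrix is the right singular vector with the
            # smallest singular value, reshaped to 3x4 (scale is arbitrary).
            _, _, vt = np.linalg.svd(A)
            return vt[-1].reshape(3, 4)

        def project(P, point3d):
            # Pixel position where the virtual overlay should be drawn.
            x = P @ np.append(point3d, 1.0)
            return x[:2] / x[2]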

  16. Development of marker-based tracking methods for augmented reality applied to NPP maintenance work support and its experimental evaluation

    SciTech Connect

    Ishii, H.; Fujino, H.; Bian, Z.; Sekiyama, T.; Shimoda, H.; Yoshikawa, H.

    2006-07-01

    In this study, two types of marker-based tracking methods for Augmented Reality have been developed. One employs line-shaped markers and the other employs circular markers. Both methods recognize the markers by means of image processing and calculate the relative position and orientation between the markers and the camera in real time. The line-shaped markers are suitable for pasting in buildings such as NPPs, where many pipes and tanks exist. The circular markers are suitable when there are many obstacles and it is difficult to use line-shaped markers because the obstacles hide part of them. Both methods extend the maximum distance between the markers and the camera compared with legacy marker-based tracking methods. (authors)
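
    Both marker types reduce to the same final computation: given the known marker geometry and its detected image points, recover the camera-marker pose. The sketch below illustrates that step with OpenCV's solvePnP; the paper's own detection of line-shaped and circular markers is assumed to have produced the 2D points, and all numeric values are placeholders.

        # Pose recovery from detected marker points with OpenCV. Marker geometry
        # (a 10 cm square here), intrinsics and detected corners are illustrative.
        import numpy as np
        import cv2

        # 3D marker points in the marker's own coordinate frame (metres).
        object_pts = np.array([[0.0, 0.0, 0.0],
                               [0.1, 0.0, 0.0],
                               [0.1, 0.1, 0.0],
                               [0.0, 0.1, 0.0]], dtype=np.float32)

        # Corresponding pixel positions found by the marker detector (assumed).
        image_pts = np.array([[320.0, 240.0],
                              [420.0, 238.0],
                              [423.0, 338.0],
                              [322.0, 341.0]], dtype=np.float32)

        camera_matrix = np.array([[800.0, 0.0, 320.0],
                                  [0.0, 800.0, 240.0],
                                  [0.0, 0.0, 1.0]])
        dist_coeffs = np.zeros(5)        # assume negligible lens distortion

        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist_coeffs)
        R, _ = cv2.Rodrigues(rvec)       # marker orientation w.r.t. the camera
        print("marker position in camera frame:", tvec.ravel())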

  17. SU-E-J-134: An Augmented-Reality Optical Imaging System for Accurate Breast Positioning During Radiotherapy

    SciTech Connect

    Nazareth, D; Malhotra, H; French, S; Hoffmann, K; Merrow, C

    2014-06-01

    Purpose: Breast radiotherapy, particularly electronic compensation, may involve large dose gradients and difficult patient positioning problems. We have developed a simple self-calibrating augmented-reality system, which assists in accurately and reproducibly positioning the patient, by displaying her live image from a single camera superimposed on the correct perspective projection of her 3D CT data. Our method requires only a standard digital camera capable of live-view mode, installed in the treatment suite at an approximately-known orientation and position (rotation R; translation T). Methods: A 10-sphere calibration jig was constructed and CT imaged to provide a 3D model. The (R,T) relating the camera to the CT coordinate system were determined by acquiring a photograph of the jig and optimizing an objective function, which compares the true image points to points calculated with a given candidate R and T geometry. Using this geometric information, 3D CT patient data, viewed from the camera's perspective, is plotted using a Matlab routine. This image data is superimposed onto the real-time patient image, acquired by the camera, and displayed using standard live-view software. This enables the therapists to view both the patient's current and desired positions, and guide the patient into assuming the correct position. The method was evaluated using an in-house developed bolus-like breast phantom, mounted on a supporting platform, which could be tilted at various angles to simulate treatment-like geometries. Results: Our system allowed breast phantom alignment, with an accuracy of about 0.5 cm and 1 ± 0.5 degree. Better resolution could be possible using a camera with higher-zoom capabilities. Conclusion: We have developed an augmented-reality system, which combines a perspective projection of a CT image with a patient's real-time optical image. This system has the potential to improve patient setup accuracy during breast radiotherapy, and could possibly be
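
    The (R, T) optimization described above, which compares photographed jig points with points projected under a candidate geometry, can be sketched as a nonlinear least-squares fit over a rotation vector and a translation. The intrinsics and point arrays below are assumptions, not values from the study.

        # Sketch: estimate the camera's (R, T) relative to the CT frame by
        # minimizing reprojection error over the calibration-jig spheres.
        import numpy as np
        from scipy.optimize import least_squares
        from scipy.spatial.transform import Rotation

        fx = fy = 1200.0                 # assumed intrinsics (pixels)
        cx, cy = 960.0, 540.0

        def reproj_residuals(params, pts_ct, pts_img):
            rvec, t = params[:3], params[3:]
            cam = Rotation.from_rotvec(rvec).apply(pts_ct) + t   # CT -> camera
            u = fx * cam[:, 0] / cam[:, 2] + cx
            v = fy * cam[:, 1] / cam[:, 2] + cy
            return np.concatenate([u - pts_img[:, 0], v - pts_img[:, 1]])

        def calibrate(pts_ct, pts_img):
            # pts_ct: (10, 3) sphere centres from the CT model of the jig
            # pts_img: (10, 2) corresponding centres detected in the photograph
            x0 = np.zeros(6)
            x0[5] = 1000.0               # rough guess: camera ~1 m from the jig
            fit = least_squares(reproj_residuals, x0, args=(pts_ct, pts_img))
            return Rotation.from_rotvec(fit.x[:3]).as_matrix(), fit.x[3:]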

  18. Projector-Based Augmented Reality for Intuitive Intraoperative Guidance in Image-Guided 3D Interstitial Brachytherapy

    SciTech Connect

    Krempien, Robert; Hoppe, Harald; Kahrs, Lueder; Daeuber, Sascha; Schorr, Oliver; Eggers, Georg; Bischof, Marc; Munter, Marc W.; Debus, Juergen; Harms, Wolfgang

    2008-03-01

    Purpose: The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow an intuitive real-time intraoperative orientation. Methods and Materials: The developed system consists of a common video projector, two high-resolution charge coupled device cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by means of stereoscopically tracking passive markers attached to the patient. Results: In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm by moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. Mean deviation of all needles of an implant was 1.4 mm (range, 0.3-2.7 mm). Conclusions: The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation.

  19. A Low-Cost iPhone-Assisted Augmented Reality Solution for the Localization of Intracranial Lesions

    PubMed Central

    Zhu, RuYuan; Chen, XiaoLei; Zhang, Jun

    2016-01-01

    Background: Precise location of intracranial lesions before surgery is important, but occasionally difficult. Modern navigation systems are very helpful, but expensive. A low-cost solution that could locate brain lesions and their surface projections in augmented reality would be beneficial. We used an iPhone to partially achieve this goal, and evaluated its accuracy and feasibility in a clinical neurosurgery setting. Methodology/Principal Findings: We located brain lesions in 35 patients, and using an iPhone, we depicted the lesion’s surface projection onto the skin of the head. To assess the accuracy of this method, we pasted computed tomography (CT) markers surrounding the depicted lesion boundaries on the skin onto 15 patients. CT scans were then performed with or without contrast enhancement. The deviations (D) between the CT markers and the actual lesion boundaries were measured. We found that 97.7% of the markers displayed a high accuracy level (D ≤ 5 mm). In the remaining 20 patients, we compared our iPhone-based method with a frameless neuronavigation system. Four check points were chosen on the skin surrounding the depicted lesion boundaries, to assess the deviations between the two methods. The integrated offset was calculated according to the deviations at the four check points. We found that for the supratentorial lesions, the medial offset between these two methods was 2.90 mm and the maximum offset was 4.2 mm. Conclusions/Significance: This low-cost, image-based, iPhone-assisted, augmented reality solution is technically feasible, and helpful for the localization of some intracranial lesions, especially shallow supratentorial intracranial lesions of moderate size. PMID:27454518

  20. Low end interactive image-directed neurosurgery. Update on rudimentary augmented reality used in epilepsy surgery.

    PubMed

    Doyle, W K

    1996-01-01

    Our experience with a very low end interactive image-directed (IIDS) neurosurgical system is presented. The system was developed by the author and consists of a personal desktop computer and a magnetic field digitizer. This low-cost solution was pursued as an alternative to available commercial devices, which were expensive and not readily modifiable for novel ideas and new applications targeted for epilepsy surgery. The rationale and description of the system was presented last year at Medicine Meets Virtual Reality III. Included in that detailed report were the fundamental mathematics forming the basis of the transformation between the surgical and the digital data spaces. Since then the system has been used in an additional 20 cases, now totaling 40 in all. Its advantages and shortcomings are described. The theoretical advantages of magnetic field technology over other localization methods are reviewed. Also, our experience with alternative low-cost off-the-shelf interfacing devices and other related modifications is described. We have accumulated clinical data to suggest that craniotomy sizes have been reduced, electrode placement has been improved, and that interactive image-directed techniques offer advantages over other common intra-operative localization modalities such as ultrasound. Our conclusion is that interactive image-directed techniques improve neurosurgery and that inexpensive enabling technology is already available, providing the technological substrate for low-cost devices using virtual reality notions for surgery and medicine. This particular technology offers advantages over traditional surgical techniques, demonstrating the attractiveness of rudimentary virtual reality medical applications.
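
    At its core, the transformation between the surgical (digitizer) space and the image data space is a rigid-body fit between paired fiducial points. The sketch below uses the standard SVD-based (Kabsch) solution as a stand-in for the author's implementation; the inputs are assumed arrays of paired fiducials.

        # Least-squares rigid registration (rotation R, translation t) between
        # paired fiducials digitized in surgical space and identified in image
        # space. Standard SVD (Kabsch) solution, not the author's original code.
        import numpy as np

        def rigid_register(src, dst):
            # src, dst: (N, 3) paired points; returns R, t with dst ~ R @ src + t
            src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
            H = (src - src_c).T @ (dst - dst_c)
            U, _S, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:     # guard against a reflection solution
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = dst_c - R @ src_c
            return R, t

        def to_image_space(R, t, probe_point):
            # Map a point reported by the magnetic digitizer into image coordinates.
            return R @ probe_point + t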

  1. Prototype Development of Low-Cost, Augmented Reality Trainer for Crew Service Weapons

    DTIC Science & Technology

    2008-09-01

    Excerpted table-of-contents entries: Blending Virtual and Real Worlds; Game Engines; Chapter Summary; Creating Video Texture; Combining Video and Game Engine; Animation; Table 4, Hughes Salvo Model Assumptions; Table 5, Common Commercial Game Engines.

  2. Alien Contact. Examining the Influence of Teacher Mathematics Knowledge for Teaching on Their Implementation of a Mathematical, Augmented Reality Curricular Unit

    ERIC Educational Resources Information Center

    Mitchell, Rebecca Noelle

    2009-01-01

    This paper reports on findings from a five-teacher, exploratory case study, critically observing their implementation of a technology-intensive, augmented reality (AR) mathematics curriculum unit, along with its paper-based control. The unit itself was intended to promote multiple proportional reasoning strategies to urban, public middle school…

  3. The Interaction of Child-Parent Shared Reading with an Augmented Reality (AR) Picture Book and Parents' Conceptions of AR Learning

    ERIC Educational Resources Information Center

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2016-01-01

    Following a previous study (Cheng & Tsai, 2014. "Computers & Education"), this study aimed to probe the interaction of child-parent shared reading with the augmented reality (AR) picture book in more depth. A series of sequential analyses were thus conducted to infer the behavioral transition diagrams and visualize the continuity…

  4. The Effectiveness of Using Augmented Reality Apps in Teaching the English Alphabet to Kindergarten Children: A Case Study in the State of Kuwait

    ERIC Educational Resources Information Center

    Safar, Ammar H.; Al-Jafar, Ali A.; Al-Yousefi, Zainab H.

    2017-01-01

    This experimental research study scrutinized the effectiveness of using augmented reality (AR) applications (apps) as a teaching and learning tool when instructing kindergarten children in the English alphabet in the State of Kuwait. The study compared two groups: (a) experimental, taught using AR apps, and (b) control, taught using traditional…

  5. Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display.

    PubMed

    Chen, Xiaojun; Xu, Lu; Wang, Yiping; Wang, Huixiang; Wang, Fang; Zeng, Xiangsen; Wang, Qiugen; Egger, Jan

    2015-06-01

    The surgical navigation system has experienced tremendous development over the past decades for minimizing the risks and improving the precision of the surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In the AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) is developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of the surgery. With the use of this system, including the calibration of instruments, registration, and the calibration of HMD, the 3D virtual critical anatomical structures in the head-mounted display are aligned with the actual structures of patient in real-world scenario during the intra-operative motion tracking process. The accuracy verification experiment demonstrated that the mean distance and angular errors were respectively 0.809±0.05mm and 1.038°±0.05°, which was sufficient to meet the clinical requirements.

  6. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing incredibly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), then the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, by creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, the individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created

  7. 3D Survey and Augmented Reality for Cultural Heritage. The Case Study of Aurelian Wall at Castra Praetoria in Rome

    NASA Astrophysics Data System (ADS)

    Canciani, M.; Conigliaro, E.; Del Grasso, M.; Papalini, P.; Saccone, M.

    2016-06-01

    The development of close-range photogrammetry has opened many new possibilities for studying cultural heritage. 3D data acquired with conventional, low-cost cameras can be used to document and investigate the full appearance, materials, and conservation status of a monument, to support the restoration process, and to identify intervention priorities. At the same time, 3D surveys produce a large amount of three-dimensional data that researchers collect and analyze, but there are very few options for 3D output. Augmented reality is one such output: a very low-cost technology with very interesting results. Using simple mobile technology (for iPad and Android tablets) and shareware software (in the case presented, "Augment"), it is possible to share and visualize a large number of 3D models on one's own device. The case study presented is part of an architecture graduate thesis carried out in Rome at the Department of Architecture of Roma Tre University. We developed a photogrammetric survey to study the Aurelian Wall at Castra Praetoria in Rome. Surveys covering 8000 square meters of surface allowed us to identify the stratigraphy and construction phases of a complex portion of the Aurelian Wall, especially around the northern door of the Castra. During this study, the data coming from the 3D survey (photogrammetric and topographic) were stored and used to create a reverse 3D model, or virtual reconstruction, of the northern door of the Castra. This virtual reconstruction shows the door in the Tiberian period; nowadays it is totally hidden by a curtain wall, but small and significant architectural details reveal its original features. The 3D model of the ancient walls was mapped with the exact type of bricks and mortar and was oriented and scaled according to the existing structure for use in augmented reality. Finally, two kinds of application have been developed: one on site, where the virtual reconstruction can be seen superimposed on the existing walls using image recognition. On the other hand

  8. Telemedicine with mobile devices and augmented reality for early postoperative care.

    PubMed

    Ponce, Brent A; Brabston, Eugene W; Shin Zu; Watson, Shawna L; Baker, Dustin; Winn, Dennis; Guthrie, Barton L; Shenai, Mahesh B

    2016-08-01

    Advanced features are being added to telemedicine paradigms to enhance usability and usefulness. Virtual Interactive Presence (VIP) is a technology that allows a surgeon and patient to interact in a "merged reality" space, facilitating verbal, visual, and manual interaction. In this clinical study, a mobile VIP iOS application was introduced into routine post-operative orthopedic and neurosurgical care. Survey responses endorse the usefulness of this tool: the virtual interaction provides needed follow-up in instances where in-person follow-up may be limited, and enhances the subjective patient experience.

  9. Virtual-reality-based educational laboratories in fiber optic engineering

    NASA Astrophysics Data System (ADS)

    Hayes, Dana; Turczynski, Craig; Rice, Jonny; Kozhevnikov, Michael

    2014-07-01

    Researchers and educators have observed great potential in virtual reality (VR) technology as an educational tool due to its ability to engage and spark interest in students, thus providing them with a deeper form of knowledge about a subject. The focus of this project is to develop an interactive VR educational module, Laser Diode Characteristics and Coupling to Fibers, to integrate into a fiber optics laboratory course. The developed module features a virtual laboratory populated with realistic models of optical devices in which students can set up and perform an optical experiment dealing with laser diode characteristics and fiber coupling. The module contains three increasingly complex levels for students to navigate through, with a short built-in quiz after each level to measure the student's understanding of the subject. Seventeen undergraduate students learned fiber coupling concepts using the designed computer simulation in a non-immersive desktop virtual environment (VE) condition. The analysis of students' responses on the updated pre- and post-tests shows a statistically significant improvement in post-test scores compared with pre-test scores. In addition, the students' survey responses suggest that they found the module very useful and engaging. The study clearly demonstrated the feasibility of the proposed instructional technology for engineering education, where the model of instruction and the enabling technology are equally important, in providing a better learning environment to improve students' conceptual understanding compared with other instructional approaches.

  10. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illustrate entertainment computing that ventures to blend the natural physical world seamlessly with a fantasy virtual playground, capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area allowing natural wide-area human-physical movements. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer graphics fantasy-reality provided by the wearable computers on them. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming anchored in physicality, mobility, social interaction, and ubiquitous computing.

  11. The use of virtual reality-based therapy to augment poststroke upper limb recovery

    PubMed Central

    Samuel, Geoffrey S; Choo, Min; Chan, Wai Yin; Kok, Stanley; Ng, Yee Sien

    2015-01-01

    Stroke remains one of the major causes of disability worldwide. This case report illustrates the complementary use of biomechanical and kinematic in-game markers, in addition to standard clinical outcomes, to comprehensively assess and track a patient’s disabilities. A 65-year-old patient was admitted for right-sided weakness and clinically diagnosed with acute ischaemic stroke. She participated in a short trial of standard stroke occupational therapy and physiotherapy with additional daily virtual reality (VR)-based therapy. Outcomes were tracked using kinematic data and conventional clinical assessments. Her Functional Independence Measure score improved from 87 to 113 and Fugl-Meyer motor score improved from 56 to 62, denoting clinically significant improvement. Corresponding kinematic analysis revealed improved hand path ratios and a decrease in velocity peaks. Further research is being undertaken to elucidate the optimal type, timing, setting and duration of VR-based therapy, as well as the use of neuropharmacological adjuncts. PMID:26243983

  12. Linear Narratives, Arbitrary Relationships: Mimesis and Direct Communication for Effectively Representing Engineering Realities Multimodally

    ERIC Educational Resources Information Center

    Jeyaraj, Joseph

    2017-01-01

    Engineers communicate multimodally using written and visual communication, but there is not much theorizing on why they do so and how. This essay, therefore, examines why engineers communicate multimodally, what, in the context of representing engineering realities, are the strengths and weaknesses of written and visual communication, and how,…

  13. Automatic localization of endoscope in intraoperative CT image: A simple approach to augmented reality guidance in laparoscopic surgery.

    PubMed

    Bernhardt, Sylvain; Nicolau, Stéphane A; Agnus, Vincent; Soler, Luc; Doignon, Christophe; Marescaux, Jacques

    2016-05-01

    The use of augmented reality in minimally invasive surgery has been the subject of much research for more than a decade. The endoscopic view of the surgical scene is typically augmented with a 3D model extracted from a preoperative acquisition. However, the organs of interest often present major changes in shape and location because of the pneumoperitoneum and patient displacement. There have been numerous attempts to compensate for this distortion between the pre- and intraoperative states. Some have attempted to recover the visible surface of the organ through image analysis and register it to the preoperative data, but this has proven insufficiently robust and may be problematic with large organs. A second approach is to introduce an intraoperative 3D imaging system as a transition. Hybrid operating rooms are becoming more and more popular, so this seems to be a viable solution, but current techniques require yet another external and constraining piece of apparatus such as an optical tracking system to determine the relationship between the intraoperative images and the endoscopic view. In this article, we propose a new approach to automatically register the reconstruction from an intraoperative CT acquisition with the static endoscopic view, by locating the endoscope tip in the volume data. We first describe our method to localize the endoscope orientation in the intraoperative image using standard image processing algorithms. Secondly, we highlight that the axis of the endoscope needs a specific calibration process to ensure proper registration accuracy. In the last section, we present quantitative and qualitative results proving the feasibility and the clinical potential of our approach.
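
    One simple way to realize the endoscope-localization step described above (finding the bright instrument voxels in the intraoperative CT volume and estimating the shaft orientation) is to threshold and take the principal axis of the resulting voxel coordinates. This is a hedged stand-in, not the authors' exact algorithm; the intensity threshold is an assumption.

        # Stand-in for locating the endoscope axis in an intraoperative CT volume:
        # threshold the bright (instrument) voxels, then take the dominant
        # principal component of their coordinates as the shaft direction.
        import numpy as np

        def endoscope_axis(volume, threshold=2000):
            coords = np.argwhere(volume > threshold).astype(float)  # (N, 3) voxels
            if len(coords) < 10:
                raise ValueError("no instrument voxels found above threshold")
            centroid = coords.mean(axis=0)
            # PCA: the eigenvector with the largest eigenvalue of the covariance
            # matrix gives the elongated direction of the shaft.
            cov = np.cov((coords - centroid).T)
            eigvals, eigvecs = np.linalg.eigh(cov)
            direction = eigvecs[:, np.argmax(eigvals)]
            # The tip can then be taken as the extreme instrument voxel along the axis.
            proj = (coords - centroid) @ direction
            tip = coords[np.argmax(proj)]
            return centroid, direction, tip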

  14. An augmented reality environment for image-guidance of off-pump mitral valve implantation

    NASA Astrophysics Data System (ADS)

    Linte, Christian; Wiles, Andrew D.; Hill, Nick; Moore, John; Wedlake, Chris; Guiraudon, Gerard; Jones, Doug; Bainbridge, Daniel; Peters, Terry M.

    2007-03-01

    Clinical research has been rapidly evolving towards the development of less invasive surgical procedures. We recently embarked on a project to improve intracardiac beating heart interventions. Our novel approach employs new surgical technologies and support from image-guidance via pre-operative and intra-operative imaging (i.e. two-dimensional echocardiography) to substitute for direct vision. Our goal was to develop a versatile system that allowed for safe cardiac port access, and provide sufficient image-guidance with the aid of a virtual reality environment to substitute for the absence of direct vision, while delivering quality therapy to the target. Specific targets included the repair and replacement of heart valves and the repair of septal defects. The ultimate objective was to duplicate the success rate of conventional open-heart surgery, but to do so via a small incision, and to evaluate the efficacy of the procedure as it is performed. This paper describes the software and hardware components, along with the methodology for performing mitral valve replacement as one example of this approach, using ultrasound and virtual tool models to position and fasten the valve in place.

  15. Content Delivery Using Augmented Reality to Enhance Students' Performance in a Building Design and Assembly Project

    ERIC Educational Resources Information Center

    Shirazi, Arezoo; Behzadan, Amir H.

    2015-01-01

    Recent studies suggest that the number of students pursuing science, technology, engineering, and mathematics (STEM) degrees has been generally decreasing. An extensive body of research cites the lack of motivation and engagement in the learning process as a major underlying reason of this decline. It has been discussed that if properly…

  16. From Survey to Education: How Augmented Reality Can Contribute to the Study and Dissemination of Archaeo-astronomy

    NASA Astrophysics Data System (ADS)

    Schiavottiello, N.

    2009-08-01

    The study and practice of archaeo-astronomy draws on disciplines such as archaeology, positional astronomy, history and the study of local mythology, as well as technical survey theory and practice. The research often starts with an archaeological survey to record the possible structural orientation of a particular monument towards specific cardinal directions. In a second stage, theories about the visible orientations and possible alignments of a structure, or part of a structure, are drawn up, often with the help of in-house tools. These tools sometimes remain too ``esoteric'' and are not always user friendly, especially if they later have to be used for educational purposes. Moreover, they are borrowed from other disciplines such as astronomical, image-processing and architectural software, resulting in a complicated process of trying to merge data that should instead have been created in the same environment in the first place. Virtual realities have long entered our daily lives in research, education and entertainment; they can serve as natural models because of the inherently 3D way in which they represent data. On the level of visual interpretation, however, what they often present are displaced models of reality, whether viewed on personal computers or with ``immersive'' techniques. These can be very useful at the research stage, or for showing concepts that require a specific point of view, but they often fail to engage senses other than vision. A possible solution would be simply to visit the studied site; however, when visiting a particular place it is hard to visualize all previously pursued analyses in one simple application environment. This is necessary in order to discover the meaning of a specific structure and to propose new theories. Augmented reality could bridge the gap that exists in this particular problem. This can be achieved with the creation

  17. Interactive Near-Field Illumination for Photorealistic Augmented Reality with Varying Materials on Mobile Devices.

    PubMed

    Rohmer, Kai; Buschel, Wolfgang; Dachselt, Raimund; Grosch, Thorsten

    2015-12-01

    At present, photorealistic augmentation is not yet possible since the computational power of mobile devices is insufficient. Even streaming solutions from stationary PCs cause a latency that affects user interactions considerably. Therefore, we introduce a differential rendering method that allows for a consistent illumination of the inserted virtual objects on mobile devices, avoiding delays. The computation effort is shared between a stationary PC and the mobile devices to make use of the capacities available on both sides. The method is designed such that only a minimum amount of data has to be transferred asynchronously between the participants. This allows for an interactive illumination of virtual objects with a consistent appearance under both temporally and spatially varying real illumination conditions. To describe the complex near-field illumination in an indoor scenario, HDR video cameras are used to capture the illumination from multiple directions. In this way, sources of illumination can be considered that are not directly visible to the mobile device because of occlusions and the limited field of view. While our method focuses on Lambertian materials, we also provide some initial approaches to approximate non-diffuse virtual objects and thereby allow for a wider field of application at nearly the same cost.
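
    The core idea of differential rendering can be summarized in a short compositing step: the augmented frame is the camera image plus the difference between renderings of the scene with and without the virtual object. The sketch below illustrates that generic compositing step only, not the authors' distributed streaming pipeline; all array names and shapes are assumptions.

    # Minimal sketch of differential rendering compositing (assumed arrays, not the
    # authors' streaming pipeline).
    import numpy as np

    def composite_differential(camera_img, render_with, render_without, virtual_mask):
        """All inputs are float32 HxWx3 images in [0, 1]; virtual_mask is HxW in {0, 1}."""
        # Where the virtual object is visible, show its rendering directly.
        # Elsewhere, add the illumination change (shadows, colour bleeding) it causes.
        diff = render_with - render_without
        out = camera_img + diff
        mask = virtual_mask[..., None]
        out = mask * render_with + (1.0 - mask) * out
        return np.clip(out, 0.0, 1.0)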

  18. Lbs Augmented Reality Assistive System for Utilities Infrastructure Management Through Galileo and Egnos

    NASA Astrophysics Data System (ADS)

    Stylianidis, E.; Valaria, E.; Smagas, K.; Pagani, A.; Henriques, J.; Garca, A.; Jimeno, E.; Carrillo, I.; Patias, P.; Georgiadis, C.; Kounoudes, A.; Michail, K.

    2016-06-01

    There is a continuous and increasing demand for solutions, both software and hardware-based, that are able to productively handle underground utilities geospatial data. Innovative approaches that are based on the use of the European GNSS, Galileo and EGNOS, sensor technologies and LBS, are able to monitor, document and manage utility infrastructures' data with an intuitive 3D augmented visualisation and navigation/positioning technology. A software and hardware-based system called LARA, currently under development through a H2020 co-funded project, aims at meeting that demand. The concept of LARA is to integrate the different innovative components of existing technologies in order to design and develop an integrated navigation/positioning and information system which coordinates GNSS, AR, 3D GIS and geodatabases on a mobile platform for monitoring, documenting and managing utility infrastructures on-site. The LARA system will guide utility field workers to locate the working area by helping them see beneath the ground, rendering the complexity of the 3D models of the underground grid such as water, gas and electricity. The capacity and benefits of LARA are scheduled to be tested in two case studies located in Greece and the United Kingdom with various underground utilities. The paper aspires to present the first results from this initiative. The project leading to this application has received funding from the European GNSS Agency under the European Union's Horizon 2020 research and innovation programme under grant agreement No 641460.

  19. MHD Augmentation of Rocket Engines Using Beamed Energy

    NASA Astrophysics Data System (ADS)

    Lineberry, John T.; Chapman, James N.; Litchford, Ron J.; Jones, Jonathan

    2003-05-01

    MHD technology and fundamental relations that pertain to accelerating a working fluid for propulsion of space vehicles are reviewed. Previous concepts on MHD propulsion have considered use of an on-board power supply to provide the electric power for the MHD thruster which is accompanied by an obvious weight penalty. In this study, an orbiting power station that beams microwave or laser power to the spacecraft is considered which eliminates this penalty making the thruster significantly more effective from the thrust-to-weight viewpoint. The objective of the study was to investigate augmenting a rocket motor to increase the ISP into the 2,500 seconds range using MHD acceleration. Mission scenarios are presented to parametrically compare the MHD augmented motor. Accelerator performance is calculated for an array of cases which vary the mass throughput, magnetic field strength and MHD interaction level. Performance improved with size, magnetic field strength and interaction level, although lower interaction levels can also produce attractive configurations. Accelerator efficiencies are typically 80-90%. The results display a large regime for improved performance in which the extent of the regime is critically dependent upon the weight of the power receiving equipment (rectenna). It is concluded that this system has potential when used with an orbiting power station that transmits power to the space vehicle by microwave radiation or laser beams. The most critical technology improvement needed is a reduced weight rectenna system but more development is also needed on the MHD accelerator, which is currently underway with NASA sponsorship.
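
    As a point of reference for the 2,500-second target quoted above, specific impulse and effective exhaust velocity are related by the standard textbook definition below (this is general rocketry background, not a result of the paper):

    \[
      I_{\mathrm{sp}} = \frac{v_e}{g_0}
      \quad\Longrightarrow\quad
      v_e \approx 2500\,\mathrm{s} \times 9.81\,\mathrm{m\,s^{-2}} \approx 2.45\times10^{4}\,\mathrm{m\,s^{-1}}
    \]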

  20. Role of Cranial and Spinal Virtual and Augmented Reality Simulation Using Immersive Touch Modules in Neurosurgical Training

    PubMed Central

    Alaraj, Ali; Charbel, Fady T.; Birk, Daniel; Tobin, Mathew; Luciano, Cristian; Banerjee, Pat P.; Rizzi, Silvio; Sorenson, Jeff; Foley, Kevin; Slavin, Konstantin; Roitberg, Ben

    2013-01-01

    Recent studies have shown that mental script-based rehearsal and simulation-based training improve the transfer of surgical skills in various medical disciplines. Despite significant advances in technology and intraoperative techniques over the last several decades, surgical skills training on neurosurgical operations still carries significant risk of serious morbidity or mortality. Potentially avoidable technical errors are well recognized as contributing to poor surgical outcome. Surgical education is undergoing overwhelming change, with reduced working hours and current trends focusing on patient safety and linking reimbursement to clinical outcomes; there is therefore a need for adjunctive means of neurosurgical training, and one response has been the recent advancement in simulation technology. ImmersiveTouch (IT) is an augmented reality (AR) system that integrates a haptic device and a high-resolution stereoscopic display. This simulation platform utilizes multiple sensory modalities, recreating many of the environmental cues experienced during an actual procedure. Modules available include ventriculostomy, bone drilling, and percutaneous trigeminal rhizotomy, in addition to simulated spinal modules such as pedicle screw placement, vertebroplasty, and lumbar puncture. We present our experience with the development of such AR neurosurgical modules and the feedback from neurosurgical residents. PMID:23254799

  1. Smart Multi-Level Tool for Remote Patient Monitoring Based on a Wireless Sensor Network and Mobile Augmented Reality

    PubMed Central

    González, Fernando Cornelio Jiménez; Villegas, Osslan Osiris Vergara; Ramírez, Dulce Esperanza Torres; Sánchez, Vianey Guadalupe Cruz; Domínguez, Humberto Ochoa

    2014-01-01

    Technological innovations in the field of disease prevention and maintenance of patient health have enabled the evolution of fields such as monitoring systems. One of the main advances is the development of real-time monitors that use intelligent and wireless communication technology. In this paper, a system is presented for the remote monitoring of the body temperature and heart rate of a patient by means of a wireless sensor network (WSN) and mobile augmented reality (MAR). The combination of a WSN and MAR provides a novel alternative to remotely measure body temperature and heart rate in real time during patient care. The system is composed of (1) hardware such as Arduino microcontrollers (in the patient nodes), personal computers (for the nurse server), smartphones (for the mobile nurse monitor and the virtual patient file) and sensors (to measure body temperature and heart rate), (2) a network layer using WiFly technology, and (3) software such as LabView, Android SDK, and DroidAR. The results obtained from tests show that the system can perform effectively within a range of 20 m and requires ten minutes to stabilize the temperature sensor to detect hyperthermia, hypothermia or normal body temperature conditions. Additionally, the heart rate sensor can detect conditions of tachycardia and bradycardia. PMID:25230306
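
    As an illustration of the kind of rule a monitoring node applies to classify readings, the sketch below uses common clinical thresholds; the abstract does not give the system's calibrated values, so the numbers here are assumptions.

    # Illustrative classification of sensor readings; thresholds are common clinical
    # rules of thumb, not the values calibrated for the system described above.
    def classify_temperature(temp_c):
        if temp_c < 35.0:
            return "hypothermia"
        if temp_c > 37.5:
            return "hyperthermia"
        return "normal"

    def classify_heart_rate(bpm):
        if bpm < 60:
            return "bradycardia"
        if bpm > 100:
            return "tachycardia"
        return "normal"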

  2. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables users to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases to the other modules. A field survey was also carried out using a database of more than 100,000 boreholes.
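
    Superimposing a borehole location on the camera image amounts to projecting a 3-D point through the device's camera model. The following is a minimal pinhole-projection sketch under assumed inputs (a borehole position already converted to a local metric frame, a known device pose R, t and camera intrinsics K); it is illustrative and is not the app's implementation.

    # Minimal pinhole projection of a world point into pixel coordinates.
    import numpy as np

    def project_point(p_world, R, t, K):
        """p_world: 3-vector; R: 3x3 world-to-camera rotation; t: 3-vector; K: 3x3 intrinsics."""
        p_cam = R @ p_world + t
        if p_cam[2] <= 0:            # behind the camera: nothing to draw
            return None
        uv = K @ (p_cam / p_cam[2])  # perspective divide, then pixel coordinates
        return uv[:2]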

  3. Smart multi-level tool for remote patient monitoring based on a wireless sensor network and mobile augmented reality.

    PubMed

    González, Fernando Cornelio Jiménez; Villegas, Osslan Osiris Vergara; Ramírez, Dulce Esperanza Torres; Sánchez, Vianey Guadalupe Cruz; Domínguez, Humberto Ochoa

    2014-09-16

    Technological innovations in the field of disease prevention and maintenance of patient health have enabled the evolution of fields such as monitoring systems. One of the main advances is the development of real-time monitors that use intelligent and wireless communication technology. In this paper, a system is presented for the remote monitoring of the body temperature and heart rate of a patient by means of a wireless sensor network (WSN) and mobile augmented reality (MAR). The combination of a WSN and MAR provides a novel alternative to remotely measure body temperature and heart rate in real time during patient care. The system is composed of (1) hardware such as Arduino microcontrollers (in the patient nodes), personal computers (for the nurse server), smartphones (for the mobile nurse monitor and the virtual patient file) and sensors (to measure body temperature and heart rate), (2) a network layer using WiFly technology, and (3) software such as LabView, Android SDK, and DroidAR. The results obtained from tests show that the system can perform effectively within a range of 20 m and requires ten minutes to stabilize the temperature sensor to detect hyperthermia, hypothermia or normal body temperature conditions. Additionally, the heart rate sensor can detect conditions of tachycardia and bradycardia.

  4. An Investigation of University Students' Collaborative Inquiry Learning Behaviors in an Augmented Reality Simulation and a Traditional Simulation

    NASA Astrophysics Data System (ADS)

    Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung

    2014-10-01

    The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty university students were divided into dyads and then randomly assigned to an AR group or a traditional 2D group to collaboratively conduct an inquiry task about elastic collision. The results of the content analysis and LSA indicated that both systems supported students' collaborative inquiry learning. In particular, students showed high frequencies of higher-level inquiry behaviors, such as interpreting experimental data or making conclusions, when using these two simulations. By comparing the behavioral patterns, similarities and differences between the two groups were revealed. The AR simulation engaged the students more thoroughly in the inquiry process. Moreover, students in both groups adopted the same approaches to design experiments. Because this line of AR research is in its initial stage, suggestions for future studies were proposed.

  5. Pico Lantern: Surface reconstruction and augmented reality in laparoscopic surgery using a pick-up laser projector.

    PubMed

    Edgcumbe, Philip; Pratt, Philip; Yang, Guang-Zhong; Nguan, Christopher; Rohling, Robert

    2015-10-01

    The Pico Lantern is a miniature projector developed for structured light surface reconstruction, augmented reality and guidance in laparoscopic surgery. During surgery it will be dropped into the patient and picked up by a laparoscopic tool. While inside the patient it projects a known coded pattern and images onto the surface of the tissue. The Pico Lantern is visually tracked in the laparoscope's field of view for the purpose of stereo triangulation between it and the laparoscope. In this paper, the first application is surface reconstruction. Using a stereo laparoscope and an untracked Pico Lantern, the absolute error for surface reconstruction for a plane, cylinder and ex vivo kidney, is 2.0 mm, 3.0 mm and 5.6 mm, respectively. Using a mono laparoscope and a tracked Pico Lantern for the same plane, cylinder and kidney the absolute error is 1.4 mm, 1.5 mm and 1.5 mm, respectively. These results confirm the benefit of the wider baseline produced by tracking the Pico Lantern. Virtual viewpoint images are generated from the kidney surface data and an in vivo proof-of-concept porcine trial is reported. Surface reconstruction of the neck of a volunteer shows that the pulsatile motion of the tissue overlying a major blood vessel can be detected and displayed in vivo. Future work will integrate the Pico Lantern into standard and robot-assisted laparoscopic surgery.
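
    The stereo triangulation between the laparoscope and the tracked projector can be illustrated with a generic midpoint method for two viewing rays, shown below; the ray origins and directions are assumed inputs, and this is not the authors' exact formulation.

    # Generic midpoint triangulation between two viewing rays expressed in a common frame.
    import numpy as np

    def triangulate_midpoint(o1, d1, o2, d2):
        # Find the closest points on each ray, then return their midpoint.
        w0 = o1 - o2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b
        if abs(denom) < 1e-12:       # near-parallel rays: no reliable intersection
            return None
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
        return 0.5 * ((o1 + s * d1) + (o2 + t * d2))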

  6. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    NASA Astrophysics Data System (ADS)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.

  7. Employing Augmented-Reality-Embedded Instruction to Disperse the Imparities of Individual Differences in Earth Science Learning

    NASA Astrophysics Data System (ADS)

    Chen, Cheng-ping; Wang, Chang-Hwa

    2015-12-01

    Studies have proven that merging hands-on and online learning can result in an enhanced experience in learning science. In contrast to traditional online learning, multiple in-classroom activities may be involved in an augmented-reality (AR)-embedded e-learning process and thus could reduce the effects of individual differences. Using a three-stage AR-embedded instructional process, we conducted an experiment to investigate the influences of individual differences on junior high school students' learning of the earth science phenomena of "day, night, and seasons". The mixed-methods sequential explanatory design was employed. In the quantitative phase, factors of learning styles and ICT competences were examined alongside overall learning achievement. Independent t tests and ANCOVAs were employed for the inferential analyses. The results showed that overall learning achievement was significant for the AR-embedded instruction. Nevertheless, neither of the two learner factors exhibited a significant effect on learning achievement. In the qualitative phase, we analyzed student interview records, and a wide variation in students' preferred instructional stages was revealed. These findings could provide an alternative rationale for developing ICT-supported instruction, as our three-stage AR-embedded comprehensive e-learning scheme could enhance instruction adaptiveness to disperse the imparities of individual differences between learners.

  8. Automotive technicians' training as a community-of-practice: implications for the design of an augmented reality teaching aid.

    PubMed

    Anastassova, Margarita; Burkhardt, Jean-Marie

    2009-07-01

    The paper presents an ergonomic analysis carried out in the early phases of an R&D project. The purpose was to investigate the functioning of today's Automotive Service Technicians (ASTs) training in order to inform the design of an Augmented Reality (AR) teaching aid. The first part of the paper presents a literature review of some major problems encountered by ASTs today. The benefits of AR as technological aid are also introduced. Then, the methodology and the results of two case studies are presented. The first study is based on interviews with trainers and trainees; the second one on observations in real training settings. The results support the assumption that today's ASTs' training could be regarded as a community-of-practice (CoP). Therefore, AR could be useful as a collaboration tool, offering a shared virtual representation of real vehicle's parts, which are normally invisible unless dismantled (e.g. the parts of a hydraulic automatic transmission). We conclude on the methods and the technologies to support the automotive CoP.

  9. BodyWindows: enhancing a mannequin with projective augmented reality for exploring anatomy, physiology and medical procedures.

    PubMed

    Samosky, Joseph T; Wang, Bo; Nelson, Douglas A; Bregman, Russell; Hosmer, Andrew; Weaver, Robert A

    2012-01-01

    Augmented reality offers the potential to radically extend and enhance the capabilities of physical medical simulators such as full-body mannequin trainers. We have developed a system that transforms the surface of a mannequin simulator into both a display screen and an input device. The BodyWindows system enables a user to open, size, and reposition multiple viewports onto the simulator body. We demonstrate a dynamic viewport that displays a beating heart. Similar viewports could be used to display real-time physiological responses to interventions the user applies to the mannequin, such as injection of a simulated drug. Viewport windows can overlap and show anatomy at different depths, creating the illusion of "cutting" multiple windows into the body to reveal structures at different depths from the surface. The developed low-cost interface employs an IR light pen and the Nintendo Wiimote. We also report experiments using the Microsoft Kinect computer vision sensor to provide a completely hand-gesture based interface.

  10. Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy.

    PubMed

    McKendrick, Ryan; Parasuraman, Raja; Murtza, Rabia; Formwalt, Alice; Baccus, Wendy; Paczynski, Martin; Ayaz, Hasan

    2016-01-01

    Highly mobile computing devices promise to improve quality of life, productivity, and performance. Increased situation awareness and reduced mental workload are two potential means by which this can be accomplished. However, it is difficult to measure these concepts in the "wild". We employed ultra-portable battery operated and wireless functional near infrared spectroscopy (fNIRS) to non-invasively measure hemodynamic changes in the brain's Prefrontal cortex (PFC). Measurements were taken during navigation of a college campus with either a hand-held display, or an Augmented reality wearable display (ARWD). Hemodynamic measures were also paired with secondary tasks of visual perception and auditory working memory to provide behavioral assessment of situation awareness and mental workload. Navigating with an augmented reality wearable display produced the least workload during the auditory working memory task, and a trend for improved situation awareness in our measures of prefrontal hemodynamics. The hemodynamics associated with errors were also different between the two devices. Errors with an augmented reality wearable display were associated with increased prefrontal activity and the opposite was observed for the hand-held display. This suggests that the cognitive mechanisms underlying errors between the two devices differ. These findings show fNIRS is a valuable tool for assessing new technology in ecologically valid settings and that ARWDs offer benefits with regards to mental workload while navigating, and potentially superior situation awareness with improved display design.

  11. Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy

    PubMed Central

    McKendrick, Ryan; Parasuraman, Raja; Murtza, Rabia; Formwalt, Alice; Baccus, Wendy; Paczynski, Martin; Ayaz, Hasan

    2016-01-01

    Highly mobile computing devices promise to improve quality of life, productivity, and performance. Increased situation awareness and reduced mental workload are two potential means by which this can be accomplished. However, it is difficult to measure these concepts in the “wild”. We employed ultra-portable battery operated and wireless functional near infrared spectroscopy (fNIRS) to non-invasively measure hemodynamic changes in the brain’s Prefrontal cortex (PFC). Measurements were taken during navigation of a college campus with either a hand-held display, or an Augmented reality wearable display (ARWD). Hemodynamic measures were also paired with secondary tasks of visual perception and auditory working memory to provide behavioral assessment of situation awareness and mental workload. Navigating with an augmented reality wearable display produced the least workload during the auditory working memory task, and a trend for improved situation awareness in our measures of prefrontal hemodynamics. The hemodynamics associated with errors were also different between the two devices. Errors with an augmented reality wearable display were associated with increased prefrontal activity and the opposite was observed for the hand-held display. This suggests that the cognitive mechanisms underlying errors between the two devices differ. These findings show fNIRS is a valuable tool for assessing new technology in ecologically valid settings and that ARWDs offer benefits with regards to mental workload while navigating, and potentially superior situation awareness with improved display design. PMID:27242480

  12. Reality-Based Learning and Interdisciplinary Teams: An Interactive Approach Integrating Accounting and Engineering Technology.

    ERIC Educational Resources Information Center

    Rogers, Robert L.; Stemkoski, Michael J.

    This paper describes a reality-based learning project in which sophomore accounting and engineering students collaborated in interdisciplinary teams to design and build a million-dollar waterslide park. Two weeks into the project, the teams received a briefing from an industrial panel of engineers, bankers, entrepreneurs, and other professionals.…

  13. Minimizing the Institutional Change Required to Augment Calculus with Real-World Engineering Problems

    ERIC Educational Resources Information Center

    Neubert, Jeremiah; Khavanin, Mohammad; Worley, Deborah; Kaabouch, Naima

    2014-01-01

    This paper presents a method for allowing calculus taught by mathematics faculty to be augmented with real-world engineering problems. The method relies on modules to deliver the problems and required background information. Students complete the modules outside of class and discuss them in mentor-led sessions. To encourage participation, students…

  14. A head-mounted operating binocular for augmented reality visualization in medicine--design and initial evaluation.

    PubMed

    Birkfellner, Wolfgang; Figl, Michael; Huber, Klaus; Watzinger, Franz; Wanschitz, Felix; Hummel, Johann; Hanel, Rudolf; Greimel, Wolfgang; Homolka, Peter; Ewers, Rolf; Bergmann, Helmar

    2002-08-01

    Computer-aided surgery (CAS), the intraoperative application of biomedical visualization techniques, appears to be one of the most promising fields of application for augmented reality (AR), the display of additional computer-generated graphics over a real-world scene. Typically, a device such as a head-mounted display (HMD) is used for AR. However, considerable technical problems connected with AR have limited the intraoperative application of HMDs up to now. Among the difficulties in using HMDs are the requirement for a common optical focal plane for both the real-world scene and the computer-generated image, and acceptance of the HMD by the user in a surgical environment. In order to increase the clinical acceptance of AR, we have adapted the Varioscope (Life Optics, Vienna), a miniature, cost-effective head-mounted operating binocular, for AR. In this paper, we present the basic design of the modified HMD, and the method and results of an extensive laboratory study for photogrammetric calibration of the Varioscope's computer displays to a real-world scene. In a series of 16 calibrations with varying zoom factors and object distances, the mean calibration error was found to be 1.24 +/- 0.38 pixels or 0.12 +/- 0.05 mm for a 640 x 480 display. The maximum error was 3.33 +/- 1.04 pixels or 0.33 +/- 0.12 mm. The location of a position measurement probe of an optical tracking system was transformed to the display with an error of less than 1 mm in the real world in 56% of all cases. For the remaining cases, error was below 2 mm. We conclude that the accuracy achieved in our experiments is sufficient for a wide range of CAS applications.
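
    The calibration errors quoted in pixels correspond to a mean reprojection error, which could be computed as in the short sketch below; the array names are placeholders for illustration and are not part of the published method.

    # Mean and standard deviation of per-point reprojection error, in pixels.
    import numpy as np

    def mean_reprojection_error(projected_px, measured_px):
        """Both arrays have shape (N, 2); returns (mean, std) of per-point errors."""
        errors = np.linalg.norm(projected_px - measured_px, axis=1)
        return errors.mean(), errors.std()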

  15. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation.

    PubMed

    Wang, Junchen; Suenaga, Hideyuki; Liao, Hongen; Hoshi, Kazuto; Yang, Liangjing; Kobayashi, Etsuko; Sakuma, Ichiro

    2015-03-01

    Autostereoscopic 3D image overlay for augmented reality (AR) based surgical navigation has been studied and reported many times. For the purpose of surgical overlay, the 3D image is expected to have the same geometric shape as the original organ, and can be transformed to a specified location for image overlay. However, how to generate a 3D image with high geometric fidelity and quantitative evaluation of 3D image's geometric accuracy have not been addressed. This paper proposes a graphics processing unit (GPU) based computer-generated integral imaging pipeline for real-time autostereoscopic 3D display, and an automatic closed-loop 3D image calibration paradigm for displaying undistorted 3D images. Based on the proposed methods, a novel AR device for 3D image surgical overlay is presented, which mainly consists of a 3D display, an AR window, a stereo camera for 3D measurement, and a workstation for information processing. The evaluation on the 3D image rendering performance with 2560×1600 elemental image resolution shows the rendering speeds of 50-60 frames per second (fps) for surface models, and 5-8 fps for large medical volumes. The evaluation of the undistorted 3D image after the calibration yields sub-millimeter geometric accuracy. A phantom experiment simulating oral and maxillofacial surgery was also performed to evaluate the proposed AR overlay device in terms of the image registration accuracy, 3D image overlay accuracy, and the visual effects of the overlay. The experimental results show satisfactory image registration and image overlay accuracy, and confirm the system usability.

  16. Studies of Operating Frequency Effects On Ejector-based Thrust Augmentation in a Pulse Detonation Engine

    NASA Technical Reports Server (NTRS)

    Landry, K.

    2005-01-01

    Studies were performed in order to characterize the thrust augmentation potential of an ejector in a Pulse Detonation Engine application. A 49-mm diameter tube of 0.914-m length was constructed with one open end and one closed end. Ethylene, oxygen, and nitrogen were introduced into the tube at the closed end through the implementation of a fast mixing injector. The tube was completely filled with a stoichiometric mixture containing a one to one molar ratio of nitrogen to oxygen. Ethylene was selected as the fuel due to its detonation sensitivity and the molar ratio of the oxidizer was chosen for heat transfer purposes. Detonations were initiated in the tube through the use of a spark ignition system. The PDE was operated in a multi-cycle mode at frequencies ranging from 20-Hz to 50-Hz. Baseline thrust measurements with no ejector present were performed while operating the engine at various frequencies and compared to theoretical estimates. The baseline values were observed to agree with the theoretical model at low operating frequencies and proved to be increasingly lower than the predicted values as the operating frequency was increased. The baseline thrust measurements were observed to agree within 15 percent of the model for all operating frequencies. A straight 152-mm diameter ejector was installed and thrust augmentation percentages were measured. The length of the ejector was varied while the overlap percentage (percent of the ejector length which overlapped the tube) was maintained at 25 percent for all tests. In addition, the effect of ejector inlet geometry was investigated by comparing results with a straight inlet to those of a 38-mm inlet diameter. The thrust augmentation of the straight inlet ejector proved to be independent of engine operating frequency, augmenting thrust by 40 percent for the 0.914-m length ejector. In contrast, the rounded lip ejector of the same length seemed to be highly dependent on the engine operating frequency. An optimum
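
    For clarity, the augmentation percentages quoted above can be read as a thrust augmentation ratio, using the standard definition below (a common convention in the ejector literature, not specific to this study); a 40 percent augmentation corresponds to a ratio of about 1.4:

    \[
      \psi = \frac{T_{\mathrm{PDE+ejector}}}{T_{\mathrm{PDE,\,baseline}}},
      \qquad
      \text{40\% augmentation} \;\Longrightarrow\; \psi \approx 1.4
    \]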

  17. Space Shuttle Main Engine fuel preburner augmented spark igniter shutdown detonations

    NASA Technical Reports Server (NTRS)

    Dexter, C. E.; Mccay, T. D.

    1986-01-01

    Detonations were experienced in the Space Shuttle Main Engine fuel preburner (FPB) augmented spark igniter (ASI) during engine cutoff. Several of these resulted in over pressures sufficient to damage the FPB ASI oxidizer system. The detonations initiated in the FPB ASI oxidizer line when residual oxidizer (oxygen) in the line mixed with backflowing fuel (hydrogen) and detonated. This paper reviews the damage history to the FPB ASI oxidizer system, an engineering assessment of the problem cause, a verification of the mechanisms, the hazards associated with the detonations, and the solution implemented.

  18. Effective Design of Educational Virtual Reality Applications for Medicine Using Knowledge-Engineering Techniques

    ERIC Educational Resources Information Center

    Górski, Filip; Bun, Pawel; Wichniarek, Radoslaw; Zawadzki, Przemyslaw; Hamrol, Adam

    2017-01-01

    Effective medical and biomedical engineering education is an important problem. Traditional methods are difficult and costly. That is why Virtual Reality is often used for that purpose. Educational medical VR is a well-developed IT field, with many available hardware and software solutions. Current solutions are prepared without methodological…

  19. [Research progress in peri-implant soft tissue engineering augmentation method].

    PubMed

    Pei, Tingting; Yu, Hongqiang; Wen, Chaoju; Guo, Tianqi; Zhou, Yanmin; Peng, Huimin

    2016-05-01

    Sufficient hard and soft tissue at the implant site is the guarantee of the long-term function, health and appearance of an implant-supported denture. The problem of soft tissue recession at the implant site has long troubled dentists. Traditional soft tissue augmentation methods, such as gingival grafting, suffer from instability of the augmented tissue and greater trauma. Recently, methods based on tissue engineering to augment peri-implant soft tissue have drawn great attention. This review focuses on current methods of peri-implant soft tissue restoration through tissue engineering, covering seed cells, biological scaffolds and cytokines.

  20. NACA Conference on Turbojet-Engine Thrust Augmentation Research: A Compilation of the Papers Presented by NACA Staff Members

    NASA Technical Reports Server (NTRS)

    1948-01-01

    The conference on Turbojet-Engine Thrust-Augmentation Research was organized by the NACA to present in summarized form the results of the latest experimental and analytical investigations conducted at the Lewis Flight Propulsion Laboratory on methods of augmenting the thrust of turbojet engines. The technical discussions are reproduced herewith in the same form in which they were presented. The original presentation in this record are considered as complementary to, rather than substitutes for, the committee's system of complete and formal reports.

  1. A CT-ultrasound-coregistered augmented reality enhanced image-guided surgery system and its preliminary study on brain-shift estimation

    NASA Astrophysics Data System (ADS)

    Huang, C. H.; Hsieh, C. H.; Lee, J. D.; Huang, W. C.; Lee, S. T.; Wu, C. T.; Sun, Y. N.; Wu, Y. T.

    2012-08-01

    By combining a view of the physical space with the medical imaging data, augmented reality (AR) visualization can provide perceptive advantages during image-guided surgery (IGS). However, the imaging data are usually captured before surgery and might differ from the current anatomy due to the natural shift of soft tissues. This study presents an AR-enhanced IGS system that is capable of correcting for the movement of soft tissues in the pre-operative CT images by using intra-operative ultrasound images. First, after reconstructing 2-D free-hand ultrasound images into 3-D volume data, the system applies a mutual-information-based registration algorithm to estimate the deformation between the pre-operative and intra-operative ultrasound images. The estimated deformation transform describes the movement of soft tissues and is then applied to the pre-operative CT images, which provide high-resolution anatomical information. As a result, the system displays the fusion of the corrected CT images or the real-time 2-D ultrasound images with the patient in the physical space through a head-mounted display device, providing an immersive augmented-reality environment. For performance validation of the proposed system, a brain phantom was utilized to simulate a brain-shift scenario. Experimental results reveal that when the shift of an artificial tumor ranges from 5 mm to 12 mm, the correction rates improve from 32%-45% to 87%-95% when using the proposed system.
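
    The similarity measure driving the registration step is mutual information; a minimal histogram-based estimate of it is sketched below. This shows only the metric evaluation between two equally sized volumes, not the optimization over deformation parameters, and the bin count is an assumption.

    # Histogram-based mutual information between two equally sized image volumes.
    import numpy as np

    def mutual_information(fixed, moving, bins=32):
        hist, _, _ = np.histogram2d(fixed.ravel(), moving.ravel(), bins=bins)
        p_xy = hist / hist.sum()
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0                      # avoid log(0)
        return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))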

  2. Effects of Mobile Augmented Reality Learning Compared to Textbook Learning on Medical Students: Randomized Controlled Pilot Study

    PubMed Central

    2013-01-01

    Background By adding new levels of experience, mobile Augmented Reality (mAR) can significantly increase the attractiveness of mobile learning applications in medical education. Objective To compare the impact of the heightened realism of a self-developed mAR blended learning environment (mARble) on learners with that of textbook material, especially for ethically sensitive subjects such as forensic medicine, while taking into account basic psychological aspects (usability and higher level of emotional involvement) as well as learning outcomes (increased learning efficiency). Methods A prestudy was conducted based on a convenience sample of 10 third-year medical students. The initial emotional status was captured using the "Profile of Mood States" questionnaire (POMS, German variation); previous knowledge about forensic medicine was determined using a 10-item single-choice (SC) test. During the 30-minute learning period, the students were randomized into two groups: the first group consisted of pairs of students, each equipped with one iPhone with a preinstalled copy of mARble, while the second group was provided with textbook material. Subsequently, both groups were asked to once again complete the POMS questionnaire and SC test to measure changes in emotional state and knowledge gain. Usability as well as the pragmatic and hedonic qualities of the learning material were captured using AttrakDiff2 questionnaires. Data evaluation was conducted anonymously. Descriptive statistics for the score in total and the subgroups were calculated before and after the intervention. The scores of both groups were tested against each other using paired and unpaired signed-rank tests. An item analysis was performed for the SC test to objectify difficulty and selectivity. Results The mARble group (6/10) showed a statistically significantly greater knowledge gain than the control group (4/10) (Wilcoxon z=2.232, P=.03). The item analysis of the SC test showed a difficulty of P=0.768 (s=0.09) and a
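
    The two analyses mentioned, a Wilcoxon signed-rank test on paired scores and a classical item difficulty index (the proportion of correct answers), can be reproduced with standard tools as in the sketch below; the data values are placeholders, not the study's data.

    # Wilcoxon signed-rank test on paired pre/post scores and a classical item
    # difficulty index; all numbers below are placeholders.
    import numpy as np
    from scipy import stats

    pre  = np.array([3, 4, 2, 5, 3])       # placeholder pre-test scores
    post = np.array([6, 7, 5, 8, 6])       # placeholder post-test scores
    stat, p_value = stats.wilcoxon(pre, post)

    responses = np.array([1, 1, 0, 1, 1, 1, 0, 1])   # 1 = correct answer on one item
    item_difficulty = responses.mean()                # proportion correct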

  3. Study of ejector geometry on thrust augmentation for pulse detonation engine ejector systems

    NASA Astrophysics Data System (ADS)

    Shehadeh, Ra'fat

    Pulse detonation engine (PDE) technology is a novel form of propulsion that offers the potential of high efficiency combustion with reduced hardware complexity. Although the primary interest of research in the pulse detonation engine field is directed towards overcoming the problems associated with operating a pure PDE system, there are other worthy options to be considered for these engines. The PDE driven ejector concept is one such option, where the system would be part of a hybrid PD/Turbofan engine. This system offers the promise of replacing the high-pressure turbine sections of the core of a high bypass turbofan engine. The purpose of the current research is to investigate the thrust augmentation capabilities of a PDE driven ejector and provide experimental data that would assist in understanding the behavior of such a system. The major potential advantages of the PDE-ejector include reduced costs due to the reduced engine weight, along with the improved specific fuel consumption and specific power inherent in the incorporation of a PDE component. To achieve the goal of this research, the thrust augmentation of a PDE driven ejector was characterized for a set of configurations. Two separate PDEs were utilized in this study. The first PDE was capable of operating at a constant frequency of 10 Hz due to flow rate limitations, and a second PDE was built with an operational frequency range of 10 Hz-70 Hz to test the effect of operating frequency on PDE-ejector systems. Optical diagnostics were employed at specific positions of interest to understand the physical behavior of the flow. Baseline experimental results helped define and understand the operational characteristics of the PDEs utilized in this study. Thrust measurements were then made for PDE driven ejector configurations. The parameters that were independently changed were the inlet geometry of a constant diameter ejector, as well as the overlap distance between the PDE tube exit and ejector tube inlet

  4. Engine Yaw Augmentation for Hybrid-Wing-Body Aircraft via Optimal Control Allocation Techniques

    NASA Technical Reports Server (NTRS)

    Taylor, Brian R.; Yoo, Seung-Yeun

    2011-01-01

    Asymmetric engine thrust was implemented in a hybrid-wing-body non-linear simulation to reduce the amount of aerodynamic surface deflection required for yaw stability and control. Hybrid-wing-body aircraft are especially susceptible to yaw surface deflection due to their decreased bare airframe yaw stability resulting from the lack of a large vertical tail aft of the center of gravity. Reduced surface deflection, especially for trim during cruise flight, could reduce the fuel consumption of future aircraft. Designed as an add-on, optimal control allocation techniques were used to create a control law that tracks total thrust and yaw moment commands with an emphasis on not degrading the baseline system. Implementation of engine yaw augmentation is shown and feasibility is demonstrated in simulation with a potential drag reduction of 2 to 4 percent. Future flight tests are planned to demonstrate feasibility in a flight environment.
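
    Control allocation of this kind is commonly posed as a regularized least-squares problem: find effector commands that reproduce the demanded yaw moment and total thrust while staying close to a preferred baseline setting. The sketch below is that generic formulation, not NASA's add-on control law; the matrices, weights and names are assumptions.

    # Generic linear control allocation: minimize ||B u - d||^2 + weight * ||u - u_pref||^2.
    import numpy as np

    def allocate(B, d, u_pref, weight=1e-2):
        """B: effectiveness matrix, d: demanded moments/thrust, u_pref: preferred commands."""
        n = B.shape[1]
        A = B.T @ B + weight * np.eye(n)
        b = B.T @ d + weight * u_pref
        return np.linalg.solve(A, b)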

  5. STS-55 pad abort: Engine 2011 oxidizer preburner augmented spark igniter check valve leak

    NASA Astrophysics Data System (ADS)

    1993-03-01

    The STS-55 initial launch attempt of Columbia (OV102) was terminated on KSC launch pad A March 22, 1993 at 9:51 AM E.S.T. due to violation of an ME-3 (Engine 2011) Launch Commit Criteria (LCC) limit exceedance. The event description and timeline are summarized. Propellant loading was initiated on 22 March, 1993 at 1:15 AM EST. All SSME chill parameters and launch commit criteria (LCC) were nominal. At engine start plus 1.44 seconds, a Failure Identification (FID) was posted against Engine 2011 for exceeding the 50 psia Oxidizer Preburner (OPB) purge pressure redline. The engine was shut down at 1.50 seconds followed by Engines 2034 and 2030. All shut down sequences were nominal and the mission was safely aborted. The OPB purge pressure redline violation and the abort profile/overlay for all three engines are depicted. SSME Avionics hardware and software performed nominally during the incident. A review of vehicle data table (VDT) data and controller software logic revealed no failure indications other than the single FID 013-414, OPB purge pressure redline exceeded. Software logic was executed according to requirements and there was no anomalous controller software operation. Immediately following the abort, a Rocketdyne/NASA failure investigation team was assembled. The team successfully isolated the failure cause to the oxidizer preburner augmented spark igniter purge check valve not being fully closed due to contamination. The source of the contaminant was traced to a cut segment from a rubber O-ring which was used in a fine clean tool during valve production prior to 1992. The valve was apparently contaminated during its fabrication in 1985. The valve had performed acceptably on four previous flights of the engine, and SSME flight history shows 780 combined check valve flights without failure. The failure of an Engine 3 (SSME No. 2011) check valve to close was sensed by onboard engine instruments even though all other engine operations were normal. This

  6. STS-55 pad abort: Engine 2011 oxidizer preburner augmented spark igniter check valve leak

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The STS-55 initial launch attempt of Columbia (OV102) was terminated on KSC launch pad A March 22, 1993 at 9:51 AM E.S.T. due to violation of an ME-3 (Engine 2011) Launch Commit Criteria (LCC) limit exceedance. The event description and timeline are summarized. Propellant loading was initiated on 22 March, 1993 at 1:15 AM EST. All SSME chill parameters and launch commit criteria (LCC) were nominal. At engine start plus 1.44 seconds, a Failure Identification (FID) was posted against Engine 2011 for exceeding the 50 psia Oxidizer Preburner (OPB) purge pressure redline. The engine was shut down at 1.50 seconds followed by Engines 2034 and 2030. All shut down sequences were nominal and the mission was safely aborted. The OPB purge pressure redline violation and the abort profile/overlay for all three engines are depicted. SSME Avionics hardware and software performed nominally during the incident. A review of vehicle data table (VDT) data and controller software logic revealed no failure indications other than the single FID 013-414, OPB purge pressure redline exceeded. Software logic was executed according to requirements and there was no anomalous controller software operation. Immediately following the abort, a Rocketdyne/NASA failure investigation team was assembled. The team successfully isolated the failure cause to the oxidizer preburner augmented spark igniter purge check valve not being fully closed due to contamination. The source of the contaminant was traced to a cut segment from a rubber O-ring which was used in a fine clean tool during valve production prior to 1992. The valve was apparently contaminated during its fabrication in 1985. The valve had performed acceptably on four previous flights of the engine, and SSME flight history shows 780 combined check valve flights without failure. The failure of an Engine 3 (SSME No. 2011) check valve to close was sensed by onboard engine instruments even though all other engine operations were normal. This

  7. Effect of Topography on Learning Military Tactics - Integration of Generalized Intelligent Framework for Tutoring (GIFT) and Augmented REality Sandtable (ARES)

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7792 ● SEP 2016 ● US Army Research Laboratory. Dates covered: February 2015–February 2016.

  8. Augmented-reality visualization of brain structures with stereo and kinetic depth cues: system description and initial evaluation with head phantom

    NASA Astrophysics Data System (ADS)

    Maurer, Calvin R., Jr.; Sauer, Frank; Hu, Bo; Bascle, Benedicte; Geiger, Bernhard; Wenzel, Fabian; Recchi, Filippo; Rohlfing, Torsten; Brown, Christopher R.; Bakos, Robert J.; Maciunas, Robert J.; Bani-Hashemi, Ali R.

    2001-05-01

    We are developing a video see-through head-mounted display (HMD) augmented reality (AR) system for image-guided neurosurgical planning and navigation. The surgeon wears a HMD that presents him with the augmented stereo view. The HMD is custom fitted with two miniature color video cameras that capture a stereo view of the real-world scene. We are concentrating specifically at this point on cranial neurosurgery, so the images will be of the patient's head. A third video camera, operating in the near infrared, is also attached to the HMD and is used for head tracking. The pose (i.e., position and orientation) of the HMD is used to determine where to overlay anatomic structures segmented from preoperative tomographic images (e.g., CT, MR) on the intraoperative video images. Two SGI 540 Visual Workstation computers process the three video streams and render the augmented stereo views for display on the HMD. The AR system operates in real time at 30 frames/sec with a temporal latency of about three frames (100 ms) and zero relative lag between the virtual objects and the real-world scene. For an initial evaluation of the system, we created AR images using a head phantom with actual internal anatomic structures (segmented from CT and MR scans of a patient) realistically positioned inside the phantom. When using shaded renderings, many users had difficulty appreciating overlaid brain structures as being inside the head. When using wire frames, and texture-mapped dot patterns, most users correctly visualized brain anatomy as being internal and could generally appreciate spatial relationships among various objects. The 3D perception of these structures is based on both stereoscopic depth cues and kinetic depth cues, with the user looking at the head phantom from varying positions. The perception of the augmented visualization is natural and convincing. The brain structures appear rigidly anchored in the head, manifesting little or no apparent swimming or jitter. The initial

  9. Treatment of phantom limb pain (PLP) based on augmented reality and gaming controlled by myoelectric pattern recognition: a case study of a chronic PLP patient

    PubMed Central

    Ortiz-Catalan, Max; Sander, Nichlas; Kristoffersen, Morten B.; Håkansson, Bo; Brånemark, Rickard

    2014-01-01

    A variety of treatments have been historically used to alleviate phantom limb pain (PLP) with varying efficacy. Recently, virtual reality (VR) has been employed as a more sophisticated mirror therapy. Despite the advantages of VR over a conventional mirror, this approach has retained the use of the contralateral limb and is therefore restricted to unilateral amputees. Moreover, this strategy disregards the actual effort made by the patient to produce phantom motions. In this work, we investigate a treatment in which the virtual limb responds directly to myoelectric activity at the stump, while the illusion of a restored limb is enhanced through augmented reality (AR). Further, phantom motions are facilitated and encouraged through gaming. The proposed set of technologies was administered to a chronic PLP patient who has shown resistance to a variety of treatments (including mirror therapy) for 48 years. Individual and simultaneous phantom movements were predicted using myoelectric pattern recognition and were then used as input for VR and AR environments, as well as for a racing game. The sustained level of pain reported by the patient was gradually reduced to complete pain-free periods. The phantom posture initially reported as a strongly closed fist was gradually relaxed, interestingly resembling the neutral posture displayed by the virtual limb. The patient acquired the ability to freely move his phantom limb, and a telescopic effect was observed where the position of the phantom hand was restored to the anatomically correct distance. More importantly, the effect of the interventions was positively and noticeably perceived by the patient and his relatives. Despite the limitation of a single case study, the successful results of the proposed system in a patient for whom other medical and non-medical treatments have been ineffective justifies and motivates further investigation in a wider study. PMID:24616655
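
    Myoelectric pattern recognition of the kind used here typically extracts time-domain features from windowed EMG and classifies them with a linear discriminant. The sketch below mirrors that common practice with placeholder data; it is not necessarily the authors' exact feature set or classifier.

    # Standard time-domain EMG features fed to a linear discriminant classifier.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def td_features(window):
        """window: 1-D EMG samples. Returns mean absolute value, zero crossings, waveform length."""
        mav = np.mean(np.abs(window))
        zc = np.sum(np.diff(np.signbit(window).astype(np.int8)) != 0)
        wl = np.sum(np.abs(np.diff(window)))
        return np.array([mav, zc, wl])

    # X: one feature vector per window; y: the phantom movement attempted in that window
    # (placeholder random data with two classes, for illustration only).
    rng = np.random.default_rng(0)
    X = np.vstack([td_features(rng.standard_normal(200)) for _ in range(40)])
    y = np.repeat([0, 1], 20)
    clf = LinearDiscriminantAnalysis().fit(X, y)
    predicted_movement = clf.predict(X[:1])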

  10. Treatment of phantom limb pain (PLP) based on augmented reality and gaming controlled by myoelectric pattern recognition: a case study of a chronic PLP patient.

    PubMed

    Ortiz-Catalan, Max; Sander, Nichlas; Kristoffersen, Morten B; Håkansson, Bo; Brånemark, Rickard

    2014-01-01

    A variety of treatments have been historically used to alleviate phantom limb pain (PLP) with varying efficacy. Recently, virtual reality (VR) has been employed as a more sophisticated mirror therapy. Despite the advantages of VR over a conventional mirror, this approach has retained the use of the contralateral limb and is therefore restricted to unilateral amputees. Moreover, this strategy disregards the actual effort made by the patient to produce phantom motions. In this work, we investigate a treatment in which the virtual limb responds directly to myoelectric activity at the stump, while the illusion of a restored limb is enhanced through augmented reality (AR). Further, phantom motions are facilitated and encouraged through gaming. The proposed set of technologies was administered to a chronic PLP patient who has shown resistance to a variety of treatments (including mirror therapy) for 48 years. Individual and simultaneous phantom movements were predicted using myoelectric pattern recognition and were then used as input for VR and AR environments, as well as for a racing game. The sustained level of pain reported by the patient was gradually reduced to complete pain-free periods. The phantom posture initially reported as a strongly closed fist was gradually relaxed, interestingly resembling the neutral posture displayed by the virtual limb. The patient acquired the ability to freely move his phantom limb, and a telescopic effect was observed where the position of the phantom hand was restored to the anatomically correct distance. More importantly, the effect of the interventions was positively and noticeably perceived by the patient and his relatives. Despite the limitation of a single case study, the successful results of the proposed system in a patient for whom other medical and non-medical treatments have been ineffective justifies and motivates further investigation in a wider study.

  11. Display signaling in augmented reality: effects of cue reliability and image realism on attention allocation and trust calibration.

    PubMed

    Yeh, M; Wickens, C D

    2001-01-01

    This experiment seeks to examine the relationships among three advanced technology features (presentation of target cuing, reliability of target cuing, and level of image realism) and the attention and trust given to that information. The participants were 16 military personnel who piloted an unmanned air vehicle and searched for targets camouflaged in terrain, which was presented at two levels of image realism. Cuing was available for some targets, and the reliability of this information was manipulated at two levels (100% and 75%). The results showed that the presence of cuing aided target detection for expected targets but drew attention away from the presence of unexpected targets. Cuing benefits and attentional tunneling were both reduced when cuing became less reliable. Increasing image realism was compelling but increased reliance on the cuing information when those data were reliable. Potential applications include a cost-benefit analysis of how trust modulates attention in the use of automated target recognition systems and the extent to which increased realism may influence this trust.

  12. Reality-augmented virtual fluoroscopy for computer-assisted diaphyseal long bone fracture osteosynthesis: a novel technique and feasibility study results.

    PubMed

    Zheng, G; Dong, X; Gruetzner, P A

    2008-01-01

    In this paper, a novel technique to create a reality-augmented virtual fluoroscopy for computer-assisted diaphyseal long bone fracture osteosynthesis is presented, together with feasibility study results. With this novel technique, repositioning of bone fragments during closed fracture reduction and osteosynthesis can lead to image updates in the virtual imaging planes of all acquired images without any radiation. The technique is achieved with a two-stage method. After acquiring a few (normally two) calibrated fluoroscopic images and before fracture reduction, the first stage, data preparation, interactively identifies and segments the bone fragments from the background in each image. After that, the second stage, image updates, repositions the fragment projection onto each virtual imaging plane in real time during fracture reduction and osteosynthesis using OpenGL-based texture warping. Combined with a photorealistic virtual implant model rendering technique, the present technique allows the control of a closed indirect fracture osteosynthesis in the real world through direct insight into the virtual world. The first clinical study results show a reduction in X-ray radiation to the patient as well as to the surgical team, and improved operative precision, providing greater safety for the patient. Furthermore, based on the experiences gained from this clinical study, two technical enhancements are proposed. One focuses on eliminating the user interactions with automated identification and segmentation of bone fragments. The other focuses on providing non-photorealistic implant visualization. Further experiments are performed to validate the effectiveness of the proposed enhancements.
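
    The second stage described above is essentially a per-fragment 2D warp of previously acquired pixels into each virtual imaging plane. The sketch below illustrates the idea with an OpenCV homography warp applied to a masked fragment; the homography, the synthetic image, and the simplified background handling are placeholders standing in for the tracked fragment pose, the calibrated projection geometry, and the OpenGL-based implementation of the original work.

        import numpy as np
        import cv2

        # Synthetic calibrated fluoroscopic image with one "fragment" and its binary mask.
        image = np.zeros((512, 512), dtype=np.uint8)
        cv2.rectangle(image, (200, 150), (300, 360), 180, thickness=-1)
        mask = (image > 0).astype(np.uint8)

        # Placeholder homography mapping the fragment from its pre-reduction pose to its
        # current tracked pose as projected into this virtual imaging plane.
        H = np.array([[0.98, -0.17, 40.0],
                      [0.17,  0.98, -15.0],
                      [0.00,  0.00,  1.0]])

        warped_fragment = cv2.warpPerspective(image * mask, H, (512, 512))
        warped_mask = cv2.warpPerspective(mask, H, (512, 512))

        # Composite the warped fragment over the static background (inpainting of the
        # fragment's old position is omitted for brevity).
        virtual_view = np.where(warped_mask > 0, warped_fragment, image * (1 - mask))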

  13. Utility of a Three-Dimensional Interactive Augmented Reality Program for Balance and Mobility Rehabilitation in the Elderly: A Feasibility Study

    PubMed Central

    Im, Dal Jae; Ku, Jeunghun; Kim, Yeun Joon; Cho, Sangwoo; Cho, Yun Kyung; Lim, Teo; Lee, Hye Sun; Kim, Hyun Jung

    2015-01-01

    Objective To improve lower extremity function and balance in elderly persons, we developed a novel, three-dimensional interactive augmented reality system (3D ARS). In this feasibility study, we assessed clinical and kinematic improvements, user participation, and the side effects of our system. Methods Eighteen participants (age, 56-76 years) capable of walking independently and standing on one leg were recruited. The participants received 3D ARS training during 10 sessions (30-minute duration each) for 4 weeks. Berg Balance Scale (BBS) and the Timed Up and Go (TUG) scores were obtained before and after the exercises. Outcome performance variables, including response time and success rate, and kinematic variables, such as hip and knee joint angle, were evaluated after each session. Results Participants exhibited significant clinical improvements in lower extremity balance and mobility following the intervention, as shown by improved BBS and TUG scores (p<0.001). Consistent kinematic improvements in the maximum joint angles of the hip and knee were observed across sessions. Outcome performance variables, such as success rate and response time, improved gradually across sessions, for each exercise. The level of participant interest also increased across sessions (p<0.001). All participants completed the program without experiencing any adverse effects. Conclusion Substantial clinical and kinematic improvements were observed after applying a novel 3D ARS training program, suggesting that this system can enhance lower extremity function and facilitate assessments of lower extremity kinematic capacity. PMID:26161353
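
    The clinical claim above (improved BBS and TUG, p<0.001) rests on a paired pre/post comparison over the eighteen participants. The abstract does not state which test was used, so the paired t-test and the scores below are illustrative assumptions only, not study data.

        import numpy as np
        from scipy import stats

        # Hypothetical pre/post Berg Balance Scale scores for 18 participants (not study data).
        rng = np.random.default_rng(1)
        bbs_pre = rng.integers(40, 50, size=18).astype(float)
        bbs_post = bbs_pre + rng.integers(2, 6, size=18)   # simulated improvement

        t_stat, p_value = stats.ttest_rel(bbs_post, bbs_pre)
        print(f"mean improvement = {np.mean(bbs_post - bbs_pre):.1f} points, p = {p_value:.3g}")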

  14. Heard on The Street: GIS-Guided Immersive 3D Models as an Augmented Reality for Team Collaboration

    NASA Astrophysics Data System (ADS)

    Quinn, B. B.

    2007-12-01

    Visiting this model amounts to going inside a map, seeing oneself there, and seeing, gesturing with, and speaking to other visitors in the same space. The Second Life viewer client is free, and the model is hosted on a vast publicly accessible grid that, as of 1 September 2007, simulates 846 square kilometers and frequently has over 40,000 simultaneous users. For reference, this work uses less than 1/200,000 of the total grid. For cost savings, GIS was used to construct the model at 1/3 scale so that some 35,000 square meters of Berkeley were modeled in less than 4000 square meters of simulator space. Our work in Second Life "Gualala" has shown that it is feasible to use real-life GIS data to guide the construction of a spatially accurate model that reflects the built surface and underground environment. Groups of visitors may position their proxy bodies, or avatars, on a street corner and converse, greatly augmenting the experience of a conference call. Since each visitor controls their own camera position in real time, the model considerably augments a video conference call and can permit individuals to manipulate 3D objects as part of a demonstration or discussion.
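
    The space saving quoted above follows from quadratic area scaling: a 1/3 linear scale shrinks footprint area by (1/3)^2 = 1/9, so about 35,000 square meters of Berkeley fit in roughly 3,900 square meters of simulator land, consistent with the "less than 4000 square meters" figure. A one-line check using the numbers from the abstract:

        real_area_m2 = 35_000
        linear_scale = 1 / 3
        model_area_m2 = real_area_m2 * linear_scale ** 2    # area scales with the square of length
        print(f"{model_area_m2:.0f} m^2")                   # ~3889 m^2, under the 4000 m^2 quoted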

  15. Development Education and Engineering: A Framework for Incorporating Reality of Developing Countries into Engineering Studies

    ERIC Educational Resources Information Center

    Perez-Foguet, A.; Oliete-Josa, S.; Saz-Carranza, A.

    2005-01-01

    Purpose: To show the key points of a development education program for engineering studies fitted within the framework of the human development paradigm. Design/methodology/approach: The bases of the concept of technology for human development are presented, and the relationship with development education analysed. Special attention is dedicated…

  16. Usability Engineering for Complex Interactive Systems Development

    DTIC Science & Technology

    2003-06-01

    called Nomad (Microvision, 2003), for dismounted soldiers. In this paper, we present a brief description of key usability engineering activities... Nomad augmented vision system manufactured by Microvision (Microvision, 2003). This display uses a low-powered laser beam to paint an image... "Resolving Multiple Occluded Layers in Augmented Reality," submitted to ISMAR Conference, 2003. Microvision, company website, see http

  17. Bilateral maxillary sinus floor augmentation with tissue-engineered autologous osteoblasts and demineralized freeze-dried bone

    PubMed Central

    Deshmukh, Aashish; Kalra, Rinku; Chhadva, Shruti; Shetye, Angad

    2015-01-01

    The pneumatization of the maxillary sinus often results in a lack of sufficient alveolar bone for implant placement. In recent decades, the maxillary sinus lift has become a very popular procedure with predictable results. Sinus floor augmentation procedures are generally carried out using autologous bone grafts, bone substitutes, or composites of bone and bone substitutes. However, the inherent limitations associated with each of these have directed the attention of investigators to new technologies such as bone tissue engineering. Bone marrow stromal cells have been regarded as multi-potent cells residing in bone marrow. These cells can be harvested from a person, multiplied outside the body using bioengineering principles and technologies, and later introduced into a tissue defect. We present a case where tissue-engineered autologous osteoblasts were used along with demineralized freeze-dried bone for sinus floor augmentation. PMID:26097364

  18. New weather depiction technology for night vision goggle (NVG) training: 3D virtual/augmented reality scene-weather-atmosphere-target simulation

    NASA Astrophysics Data System (ADS)

    Folaron, Michelle; Deacutis, Martin; Hegarty, Jennifer; Vollmerhausen, Richard; Schroeder, John; Colby, Frank P.

    2007-04-01

    US Navy and Marine Corps pilots receive Night Vision Goggle (NVG) training as part of their overall training to maintain the superiority of our forces. This training must incorporate realistic targets, backgrounds, and representative atmospheric and weather effects they may encounter under operational conditions. An approach for pilot NVG training is to use the Night Imaging and Threat Evaluation Laboratory (NITE Lab) concept. The NITE Labs utilize a 10' by 10' static terrain model equipped with both natural and cultural lighting that is used to demonstrate various illumination conditions and visual phenomena which might be experienced when utilizing night vision goggles. With this technology, the military can safely, systematically, and reliably expose pilots to the large number of potentially dangerous environmental conditions that will be experienced in their NVG training flights. A previous SPIE presentation described our work for NAVAIR to add realistic atmospheric and weather effects to the NVG NITE Lab training facility using the NVG-WDT (Weather Depiction Technology) system (Colby et al.). NVG-WDT consists of a high-end multiprocessor server with weather simulation software, and several fixed and goggle-mounted Heads-Up Displays (HUDs). Atmospheric and weather effects are simulated using state-of-the-art computer codes such as the WRF (Weather Research & Forecasting) model and the US Air Force Research Laboratory MODTRAN radiative transport model. Imagery for a variety of natural and man-made obscurations (e.g., rain, clouds, snow, dust, smoke, chemical releases) is calculated and injected into the scene observed through the NVG via the fixed and goggle-mounted HUDs. This paper expands on the work described in the previous presentation and describes the 3D Virtual/Augmented Reality Scene-Weather-Atmosphere-Target Simulation part of the NVG-WDT. The 3D virtual reality software is a complete simulation system to generate realistic
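
    The weather-injection step amounts to attenuating the NVG scene by an atmospheric transmittance and adding the obscurant's own path radiance, with the transmittance supplied by a radiative-transfer code such as MODTRAN. The sketch below shows only that generic compositing step; the Beer-Lambert extinction, the coefficient values, and the synthetic scene are illustrative assumptions, not outputs of the NVG-WDT system.

        import numpy as np

        def inject_obscurant(scene_radiance, extinction_coeff, path_length_m, obscurant_radiance):
            """Attenuate the scene through an obscurant and add its path radiance (Beer-Lambert)."""
            transmittance = np.exp(-extinction_coeff * path_length_m)
            return scene_radiance * transmittance + obscurant_radiance * (1.0 - transmittance)

        # Hypothetical NVG-band scene (arbitrary radiance units) viewed through 500 m of light fog.
        scene = np.random.default_rng(2).uniform(0.0, 1.0, size=(480, 640))
        foggy_scene = inject_obscurant(scene, extinction_coeff=2e-3, path_length_m=500.0,
                                       obscurant_radiance=0.3)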

  19. Cell painting with an engineered EPCR to augment the protein C system

    PubMed Central

    Bouwens, Eveline A. M.; Stavenuiter, Fabian; Mosnier, Laurent O.

    2016-01-01

    The protein C (PC) system conveys beneficial anticoagulant and cytoprotective effects in numerous in vivo disease models. The endothelial protein C receptor (EPCR) plays a central role in these pathways as cofactor for PC activation and by enhancing activated protein C (APC)-mediated protease-activated receptor (PAR) activation. During inflammatory disease, expression of EPCR on cell membranes is often diminished, thereby limiting PC activation and APC’s effects on cells. Here a caveolae-targeting glycosylphosphatidylinositol (GPI)-anchored EPCR (EPCR-GPI) was engineered to restore EPCR’s bioavailability via “cell painting.” The painting efficiency of EPCR-GPI on EPCR-depleted endothelial cells was time- and dose-dependent. The EPCR-GPI bioavailability after painting was long-lasting, since EPCR surface levels reached 400% of wild-type cells after 2 hours and remained >200% for 24 hours. EPCR-GPI painting conveyed APC binding to EPCR-depleted endothelial cells where EPCR was lost due to shedding or shRNA. EPCR painting normalized PC activation on EPCR-depleted cells, indicating that EPCR-GPI is functionally active on painted cells. Caveolin-1 lipid rafts were enriched in EPCR after painting due to the GPI anchor targeting caveolae. Accordingly, EPCR painting supported PAR1 and PAR3 cleavage by APC and augmented PAR1-dependent Akt phosphorylation by APC. Thus, EPCR-GPI painting achieved physiologically relevant surface levels on endothelial cells, restored APC binding to EPCR-depleted cells, supported PC activation, and enhanced APC-mediated PAR cleavage and cytoprotective signaling. Therefore, EPCR-GPI provides a novel tool to restore the bioavailability and functionality of EPCR on EPCR-depleted and deficient cells. PMID:26272345

  20. Biologically inspired robotic inspectors: the engineering reality and future outlook (Keynote address)

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph

    2005-04-01

    Human errors have long been recognized as a major factor in the reliability of nondestructive evaluation results. To minimize such errors, there is an increasing reliance on automatic inspection tools that allow faster and more consistent tests. Crawlers and various manipulation devices are commonly used to perform a variety of inspection procedures, including C-scan with contour-following capability to rapidly inspect complex structures. The emergence of robots has been the result of the need to deal with parts that are too complex to handle with a simple automatic system. Economic factors continue to hamper the wide use of robotics for inspection applications; however, technology advances are increasingly changing this paradigm. Autonomous robots, which may look like humans, can potentially address the need to inspect structures with configurations that are not predetermined. The operation of such robots that mimic biology may take place in harsh or hazardous environments that are too dangerous for human presence. Biomimetic technologies such as artificial intelligence, artificial muscles, artificial vision and numerous others are increasingly becoming common engineering tools. Inspired by science fiction, making biomimetic robots is increasingly becoming an engineering reality, and in this paper the state of the art will be reviewed and the outlook for the future will be discussed.

  1. Wireless augmented reality communication system

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2006-01-01

    The system of the present invention is a highly integrated radio communication system with a multimedia co-processor which allows true two-way multimedia (video, audio, data) access as well as real-time biomedical monitoring in a pager-sized portable access unit. The system is integrated in a network structure including one or more general purpose nodes for providing a wireless-to-wired interface. The network architecture allows video, audio and data (including biomedical data) streams to be connected directly to external users and devices. The portable access units may also be mated to various non-personal devices such as cameras or environmental sensors for providing a method for setting up wireless sensor nets from which reported data may be accessed through the portable access unit. The reported data may alternatively be automatically logged at a remote computer for access and viewing through a portable access unit, including the user's own.

  2. Wireless Augmented Reality Communication System

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2015-01-01

    A portable unit provides video communication for selecting a user name in a user name network. A transceiver wirelessly accesses a communication network through a wireless connection to a general purpose node coupled to the communication network. A user interface can receive user input to log on to the user name network through the communication network. The user name network has a plurality of user names; at least one of the plurality of user names is associated with a remote portable unit, logged on to the user name network, and available for video communication.

  3. Wireless Augmented Reality Communication System

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2014-01-01

    The system of the present invention is a highly integrated radio communication system with a multimedia co-processor which allows true two-way multimedia (video, audio, data) access as well as real-time biomedical monitoring in a pager-sized portable access unit. The system is integrated in a network structure including one or more general purpose nodes for providing a wireless-to-wired interface. The network architecture allows video, audio and data (including biomedical data) streams to be connected directly to external users and devices. The portable access units may also be mated to various non-personal devices such as cameras or environmental sensors for providing a method for setting up wireless sensor nets from which reported data may be accessed through the portable access unit. The reported data may alternatively be automatically logged at a remote computer for access and viewing through a portable access unit, including the user's own.

  4. Wireless Augmented Reality Communication System

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2016-01-01

    The system of the present invention is a highly integrated radio communication system with a multimedia co-processor which allows true two-way multimedia (video, audio, data) access as well as real-time biomedical monitoring in a pager-sized portable access unit. The system is integrated in a network structure including one or more general purpose nodes for providing a wireless-to-wired interface. The network architecture allows video, audio and data (including biomedical data) streams to be connected directly to external users and devices. The portable access units may also be mated to various non-personal devices such as cameras or environmental sensors for providing a method for setting up wireless sensor nets from which reported data may be accessed through the portable access unit. The reported data may alternatively be automatically logged at a remote computer for access and viewing through a portable access unit, including the user's own.

  5. Military Applications of Augmented Reality

    DTIC Science & Technology

    2011-01-01

    system become challenging, but critical, tasks. Early designs in our human-centered research process attempted to show multiple layers of geometric and... sensory integration can both limit the types of abstractions that make sense for a given application and push the application designer to create new... Livingston et al. to a deployment may wear a system like the Land Warrior system [Army (2001)] containing a wearable computer and head-mounted display designed

  6. Recent Advances in Augmented Reality

    DTIC Science & Technology

    2001-12-01

    forms images directly on the retina. These displays, which MicroVision is developing commercially, literally draw on the retina with low-power lasers... the Phillips Tower, as seen from two different viewpoints (courtesy HRL Laboratories)... RV-Border Guards, an AR game (courtesy MR Sy...)... filtered view (bottom), from Julier et al. (courtesy Naval Research Lab)... Virtual and real occlusions. The brown cow and tree are

  7. Design and implementation of a 3D ocean virtual reality and visualization engine

    NASA Astrophysics Data System (ADS)

    Chen, Ge; Li, Bo; Tian, Fenglin; Ji, Pengbo; Li, Wenqing

    2012-12-01

    In this study, a 3D virtual reality and visualization engine for rendering the ocean, named VV-Ocean, is designed for marine applications. The design goals of VV-Ocean aim at high-fidelity simulation of the ocean environment, visualization of massive and multidimensional marine data, and imitation of marine life. VV-Ocean is composed of five modules: a memory management module, a resources management module, a scene management module, a rendering process management module, and an interaction management module. There are three core functions in VV-Ocean: reconstructing vivid virtual ocean scenes, visualizing real data dynamically in real time, and imitating and simulating marine life intuitively. Based on VV-Ocean, we establish a sea-land integration platform which can reproduce the drifting and diffusion processes of oil spilled from the sea bottom to the surface. Environmental factors such as ocean current and wind field have been considered in this simulation. On this platform, the oil-spilling process can be abstracted as the movement of abundant oil particles. The results show that oil particles blend well with water and the platform meets the requirements for real-time and interactive rendering. VV-Ocean can be widely used in ocean applications such as demonstrating marine operations, facilitating maritime communications, developing ocean games, reducing marine hazards, forecasting the weather over oceans, serving marine tourism, and so on. Finally, further technological improvements of VV-Ocean are discussed.
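
    The oil-spill demonstration treats the slick as a large set of particles advected by the current field plus a wind-drift term, which is what makes real-time, interactive rendering feasible. A minimal advection step in that spirit is sketched below; the 3% wind-drift factor, the uniform current and wind, and the random-walk diffusion are illustrative assumptions rather than VV-Ocean's actual model.

        import numpy as np

        def advect_oil_particles(positions, current_uv, wind_uv, dt, wind_drift=0.03, diffusion=0.5):
            """One explicit step: advection by current plus fractional wind drift plus diffusion."""
            rng = np.random.default_rng()
            drift = current_uv + wind_drift * wind_uv                       # effective velocity, m/s
            spread = rng.normal(scale=np.sqrt(2.0 * diffusion * dt), size=positions.shape)
            return positions + drift * dt + spread

        # 10,000 particles released at the leak, advected in the horizontal plane for 100 minutes.
        particles = np.zeros((10_000, 2))
        current = np.array([0.20, 0.05])    # m/s, hypothetical
        wind = np.array([5.0, -2.0])        # m/s, hypothetical
        for _ in range(100):
            particles = advect_oil_particles(particles, current, wind, dt=60.0)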

  8. Training Wayfinding: Natural Movement in Mixed Reality

    DTIC Science & Technology

    2007-10-01

    This report describes an experiment that investigated a prototype mixed reality (MR) system, utilizing the Battlefield Augmented Reality System (BARS...for training wayfinding. BARS is a mobile augmented reality system that uses a head mounted display (HMD) and a wireless system that tracks the

  9. Mission Specific Embedded Training Using Mixed Reality

    DTIC Science & Technology

    2011-10-01

    augmented reality (AR) or mixed reality as a training tool for military operations in urban terrain. Our group has developed the Battlefield Augmented ... Reality System (BARS)(trademark), which can be used for a variety of applications, such as situation awareness as well as embedded training. We have

  10. Study on using a digital ride quality augmentation system to trim an engine-out in a Cessna 402B

    NASA Technical Reports Server (NTRS)

    Donaldson, K. E.

    1986-01-01

    A linear model of the Cessna 402B was used to determine whether the control power available to a Ride Quality Augmentation System was adequate to trim an engine-out condition. Two simulations were completed: one using a steady-state model and the other using a state-matrix model. The amount of rudder available was not sufficient in all cases to completely trim the airplane, but it was enough to give the pilot valuable reaction time. The system would provide an added measure of safety for only a relatively small amount of development effort.
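
    For a linear model, the engine-out trim question reduces to solving the steady-state condition 0 = Ax + Bu + d for the control vector u, where d is the asymmetric-thrust disturbance, and comparing the required rudder deflection with the authority available to the augmentation system. The sketch below uses placeholder lateral-directional matrices and limits; they are not the Cessna 402B values from the study.

        import numpy as np

        # Placeholder lateral-directional model: x = [sideslip, roll rate, yaw rate, bank angle],
        # u = [aileron, rudder] in radians. NOT the Cessna 402B data used in the study.
        A = np.array([[-0.25,  0.06, -0.99, 0.05],
                      [-4.50, -1.20,  0.40, 0.00],
                      [ 2.10, -0.05, -0.30, 0.00],
                      [ 0.00,  1.00,  0.06, 0.00]])
        B = np.array([[0.00,  0.03],
                      [6.00,  0.50],
                      [0.20, -2.40],
                      [0.00,  0.00]])
        d = np.array([0.0, 0.0, 0.15, 0.0])     # yawing-moment disturbance from the failed engine

        x_trim = np.zeros(4)                    # hold wings-level, zero-rate flight
        u_trim, *_ = np.linalg.lstsq(B, -(A @ x_trim + d), rcond=None)

        rudder_limit = np.deg2rad(25.0)         # hypothetical control authority
        print("rudder needed:", round(np.rad2deg(u_trim[1]), 1), "deg;",
              "within limit" if abs(u_trim[1]) <= rudder_limit else "exceeds limit")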

  11. Thrust Augmentation of a Turbojet Engine at Simulated Flight Conditions by Introduction of a Water-Alcohol Mixture into the Compressor

    NASA Technical Reports Server (NTRS)

    Useller, James W.; Auble, Carmon M.; Harvey, Ray W., Sr.

    1952-01-01

    An investigation was conducted at simulated high-altitude flight conditions to evaluate the use of compressor evaporative cooling as a means of turbojet-engine thrust augmentation. Comparison of the performance of the engine with water-alcohol injection at the compressor inlet, at the sixth stage of the compressor, and at the sixth and ninth stages was made. From consideration of the thrust increases achieved, the interstage injection of the coolant was preferred over the combined sixth- and ninth-stage injection because of its relative simplicity. A maximum augmented net-thrust ratio of 1.106 and a maximum augmented jet-thrust ratio of 1.062 were obtained at an augmented liquid ratio of 2.98 and an engine-inlet temperature of 80 F. At lower inlet temperatures (-40 to 40 F), the maximum augmented net-thrust ratios ranged from 1.040 to 1.076 and the maximum augmented jet-thrust ratios ranged from 1.027 to 1.048, depending upon the inlet temperature. The relatively small increase in performance at the lower inlet-air temperatures can be partially attributed to the inadequate evaporation of the water-alcohol mixture, but the more significant limitation was believed to be caused by the negative influence of the liquid coolant on engine-component performance. In general, it is concluded that the effectiveness of the injection of a coolant into the compressor as a means of thrust augmentation is considerably influenced by the design characteristics of the components of the engine being used.
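
    The figures of merit quoted above are ratios of augmented to unaugmented thrust, so an augmented net-thrust ratio of 1.106 corresponds to a 10.6% gain in net thrust at that coolant flow and inlet temperature. A quick check with the numbers from the abstract:

        augmented_net_thrust_ratio = 1.106                  # reported at 80 F engine-inlet temperature
        percent_gain = (augmented_net_thrust_ratio - 1.0) * 100.0
        print(f"net-thrust gain: {percent_gain:.1f}%")      # 10.6%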

  12. Engineering a transformation of human-machine interaction to an augmented cognitive relationship.

    SciTech Connect

    Speed, Ann Elizabeth; Bernard, Michael Lewis; Abbott, Robert G.; Forsythe, James Chris; Xavier, Patrick Gordon; Brannon, Nathan Gregory

    2003-02-01

    This project is being conducted by Sandia National Laboratories in support of the DARPA Augmented Cognition program. Work commenced in April of 2002. The objective for the DARPA program is to 'extend, by an order of magnitude or more, the information management capacity of the human-computer warfighter.' Initially, emphasis has been placed on detection of an operator's cognitive state so that systems may adapt accordingly (e.g., adjust information throughput to the operator in response to workload). Work conducted by Sandia focuses on development of technologies to infer an operator's ongoing cognitive processes, with specific emphasis on detecting discrepancies between machine state and an operator's ongoing interpretation of events.

  13. Embedding Mixed-Reality Laboratories into E-Learning Systems for Engineering Education

    ERIC Educational Resources Information Center

    Al-Tikriti, Munther N.; Al-Aubidy, Kasim M.

    2013-01-01

    E-learning, virtual learning and mixed reality techniques are now a global integral part of the academic and educational systems. They provide easier access to educational opportunities to a very wide spectrum of individuals to pursue their educational and qualification objectives. These modern techniques have the potentials to improve the quality…

  14. Vicher: A Virtual Reality Based Educational Module for Chemical Reaction Engineering.

    ERIC Educational Resources Information Center

    Bell, John T.; Fogler, H. Scott

    1996-01-01

    A virtual reality application for undergraduate chemical kinetics and reactor design education, Vicher (Virtual Chemical Reaction Model), was originally designed to simulate a portion of a modern chemical plant. Vicher now consists of two programs: Vicher I, which models catalyst deactivation, and Vicher II, which models nonisothermal effects in…

  15. Turbofan forced mixer lobe flow modeling. Part 3: Application to augment engines

    NASA Technical Reports Server (NTRS)

    Barber, T.; Moore, G. C.; Blatt, J. R.

    1988-01-01

    Military engines frequently need large quantities of thrust for short periods of time. The addition of an augmentor can provide such thrust increases but with a penalty of increased duct length and engine weight. The addition of a forced mixer to the augmentor improves performance and reduces the penalty, as well as providing a method for siting the required flame holders. In this report two augmentor concepts are investigated: a swirl-mixer augmentor and a mixer-flameholder augmentor. Several designs for each concept are included and an experimental assessment of one of the swirl-mixer augmentors is presented.

  16. Greening the curriculum: augmenting engineering and technology courses with sustainability topics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Duties of engineers and technologists often entail designing and implementing solutions to problems. It is their responsibility to be cognizant of the impacts of their designs on, and thus their accountability to, society in general. They must also be aware of subsequent effects upon the environment....

  17. Assessment of Augmented Electronic Fuel Controls for Modular Engine Diagnostics and Condition Monitoring

    DTIC Science & Technology

    1978-12-01

    control fault diagnosis relative to other types of engine controls. The VAW system by design intent will ... the number of control system... accessory gearbox and one on the exhaust frame. With proper filtering and logic, fault isolation to the module and/or LRU level should be effective..." System With Fault Isolation (FI) - FADEC, FICA, EHR, HIT, FD, and Base Shop Diagnosis System 2, with the addition of in-flight tape recording and a

  18. Performance Engineering Research Center and RECOVERY. Performance Engineering Research Institution SciDAC-e Augmentation. Performance enhancement

    SciTech Connect

    Hollingsworth, Jeffrey K.

    2015-10-12

    This project concentrated on various ways to improve the measurement and tuning of large-scale parallel applications. This project was a supplement to project DE-FC0206ER25763 (“Performance Engineering Research Center”). The research conducted during this project is summarized in this report. The complete details of the work are available in the ten publications listed at the end of the report. It also supported the Ph.D. studies of three students and one research scientist.

  19. Computer Imagery and Neurological Rehabilitation: On the Use of Augmented Reality in Sensorimotor Training to Step Up Naturally Occurring Cortical Reorganization in Patients Following Stroke.

    PubMed

    Correa-Agudelo, Esteban; Ferrin, Carlos; Velez, Paulo; Gomez, Juan D

    2016-01-01

    This work promotes the use of computer-generated imagery, as visual illusions, to speed up motor learning in rehabilitation. In support of this, we adhere to the principles of experience-dependent neuroplasticity and the positive impact of virtual reality (VR) thereon. Specifically, post-stroke patients will undergo motor therapy with a surrogate virtual limb that stands in for the paralyzed limb. Along these lines, their motor intentions will match the visual evidence, which fosters physiological, functional, and structural changes over time, for recovery of lost function in an injured brain. How we create such an illusion using computer graphics is central to this paper.

  20. Virtual Reality: The New Era of Rehabilitation Engineering [From the Regional Editor].

    PubMed

    Tong, Shanbao

    2016-01-01

    Rehabilitation engineering refers to the development and application of techniques, devices, and protocols for restoring function following disability. Although in most cases the concept relates to motor functions (e.g., training after a stroke or the use of limb prosthetics), mental rehabilitation engineering is also an emerging area.