Science.gov

Sample records for augmented reality engineering

  1. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  2. Confronting an Augmented Reality

    ERIC Educational Resources Information Center

    Munnerley, Danny; Bacon, Matt; Wilson, Anna; Steele, James; Hedberg, John; Fitzgerald, Robert

    2012-01-01

    How can educators make use of augmented reality technologies and practices to enhance learning and why would we want to embrace such technologies anyway? How can an augmented reality help a learner confront, interpret and ultimately comprehend reality itself? In this article, we seek to initiate a discussion that focuses on these questions, and…

  3. Augmented reality: a review.

    PubMed

    Berryman, Donna R

    2012-01-01

    Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital or computer created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries. PMID:22559183

  4. Augmented Reality in astrophysics

    NASA Astrophysics Data System (ADS)

    Vogt, Frédéric P. A.; Shingles, Luke J.

    2013-09-01

    Augmented Reality consists of merging live images with virtual layers of information. The rapid growth in the popularity of smartphones and tablets over recent years has provided a large base of potential users of Augmented Reality technology, and virtual layers of information can now be attached to a wide variety of physical objects. In this article, we explore the potential of Augmented Reality for astrophysical research with two distinct experiments: (1) Augmented Posters and (2) Augmented Articles. We demonstrate that the emerging technology of Augmented Reality can already be used and implemented without expert knowledge using currently available apps. Our experiments highlight the potential of Augmented Reality to improve the communication of scientific results in the field of astrophysics. We also present feedback gathered from the Australian astrophysics community that reveals evidence of some interest in this technology by astronomers who experimented with Augmented Posters. In addition, we discuss possible future trends for Augmented Reality applications in astrophysics, and explore the current limitations associated with the technology. This Augmented Article, the first of its kind, is designed to allow the reader to directly experiment with this technology.

  5. Augmented reality system

    NASA Astrophysics Data System (ADS)

    Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng

    2010-08-01

    In recent years, Augmented Reality (AR) [1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in Virtual Reality (VR) fields such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment and the arts. AR enhances the display output of a real environment with specific user-interactive functions or specific object recognition. It can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering and remote robot control. AR has many advantages over VR. The system developed here combines sensors, software and imaging algorithms to make the augmentation feel real, actual and existing to users. The imaging algorithms include a gray-level method, image binarization and white balance in order to achieve accurate image recognition and overcome the effects of lighting.
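
    The abstract names its imaging algorithms only at a high level (gray-level conversion, binarization, white balance). Below is a minimal sketch of such a preprocessing chain, assuming OpenCV and NumPy as stand-ins for the system's unpublished implementation:

```python
# Sketch of the preprocessing chain described above: white balance,
# gray-level conversion, and binarization for marker recognition.
# The library choice (OpenCV/NumPy) is an assumption, not the authors' code.
import cv2
import numpy as np

def gray_world_white_balance(bgr):
    """Scale each channel so its mean matches the global mean (gray-world assumption)."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    mean = (b.mean() + g.mean() + r.mean()) / 3.0
    b *= mean / (b.mean() + 1e-6)
    g *= mean / (g.mean() + 1e-6)
    r *= mean / (r.mean() + 1e-6)
    return np.clip(cv2.merge([b, g, r]), 0, 255).astype(np.uint8)

def binarize_for_markers(bgr):
    balanced = gray_world_white_balance(bgr)           # reduce colour cast from scene lighting
    gray = cv2.cvtColor(balanced, cv2.COLOR_BGR2GRAY)  # gray-level method
    # Otsu's threshold adapts to overall brightness, mitigating lighting effects.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary

# Usage: binary = binarize_for_markers(cv2.imread("frame.png"))
```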

  6. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  7. Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality

    NASA Astrophysics Data System (ADS)

    Wolfe, Christopher; Smith, J. David; Phillips, W. Greg; Graham, T. C. Nicholas

    Augmented reality systems often involve collaboration among groups of people. While there are numerous toolkits that aid the development of such augmented reality groupware systems (e.g., ARToolkit and Groupkit), there remains an enormous gap between the specification of an AR groupware application and its implementation. In this chapter, we present Fiia, a toolkit which simplifies the development of collaborative AR applications. Developers specify the structure of their applications using the Fiia modeling language, which abstracts details of networking and provides high-level support for specifying adapters between the physical and virtual world. The Fiia.Net runtime system then maps this conceptual model to a runtime implementation. We illustrate Fiia via Raptor, an augmented reality application used to help small groups collaboratively prototype video games.

  8. Augmented Reality Binoculars.

    PubMed

    Oskiper, Taragay; Sizintsev, Mikhail; Branzoi, Vlad; Samarasekera, Supun; Kumar, Rakesh

    2015-05-01

    In this paper we present an augmented reality binocular system to allow long-range, high-precision augmentation of live telescopic imagery with aerial and terrain-based synthetic objects, vehicles, people and effects. The inserted objects must appear stable in the display and must not jitter and drift as the user pans around and examines the scene with the binoculars. The design of the system is based on using two different cameras with wide field of view and narrow field of view lenses enclosed in a binocular-shaped shell. Using the wide field of view gives us context and enables us to recover the 3D location and orientation of the binoculars much more robustly, whereas the narrow field of view is used for the actual augmentation as well as to increase precision in tracking. We present our navigation algorithm, which uses the two cameras in combination with an inertial measurement unit and a global positioning system in an extended Kalman filter and provides jitter-free, robust and real-time pose estimation for precise augmentation. We have demonstrated successful use of our system as part of an information-sharing example as well as a live simulated training system for observer training, in which fixed- and rotary-wing aircraft, ground vehicles, and weapon effects are combined with real-world scenes. PMID:26357208
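
    The paper's state vector and sensor models are not given in this abstract; the following is a generic extended Kalman filter predict/update skeleton in NumPy, included only to illustrate the structure of a camera/IMU/GPS fusion loop like the one described:

```python
# Generic extended Kalman filter skeleton (illustrative only; the state,
# motion model, and measurement models of the binocular system are assumptions).
import numpy as np

def ekf_predict(x, P, f, F_jac, Q):
    """Propagate state x and covariance P through motion model f with Jacobian F_jac."""
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, h, H_jac, R):
    """Correct the prediction with a measurement z (e.g., camera pose or GPS fix)."""
    H = H_jac(x)
    y = z - h(x)                    # innovation
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# In a fusion loop one would predict with IMU data at high rate, then update with
# camera-derived pose and GPS measurements as they arrive.
```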

  9. Augmented Virtual Reality Laboratory

    NASA Technical Reports Server (NTRS)

    Tully-Hanson, Benjamin

    2015-01-01

    Real time motion tracking hardware has for the most part been cost prohibitive for research to regularly take place until recently. With the release of the Microsoft Kinect in November 2010, researchers now have access to a device that for a few hundred dollars is capable of providing red-green-blue (RGB), depth, and skeleton data. It is also capable of tracking multiple people in real time. For its original intended purposes, i.e. gaming, being used with the Xbox 360 and eventually Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.

  10. Augmented reality for wafer prober

    NASA Astrophysics Data System (ADS)

    Gilgenkrantz, Pascal

    2011-03-01

    The link between wafer manufacturing and wafer test is often weak: without a common information system, test engineers have to read the locations of test structures from reference documents and search for them on the wafer prober screen. The Mask Data Preparation team is ideally placed to fill this gap, given its relationship with both the design and manufacturing sides. With appropriate design extraction scripts and design conventions, mask engineers can provide exact wafer locations of all embedded test structures to avoid a painful camera search. Going a step further, it would be a great help to provide wafer probers with a "map" of what was built on the wafers. With this idea in mind, the mask design database can simply be provided to test engineers; but the real added value would come from a true integration of real-wafer camera views and the design database used for wafer manufacturing. As proven by several augmented reality applications, like Google Maps' mixed Satellite/Map view, mixing a real-world view with its theoretical model is very useful for understanding reality. The creation of such an interface can only be made by a wafer prober manufacturer, given the high integration of these machines with their control panels. But many existing software libraries could be used to plot the design view matching the camera view. Standard formats for mask design are usually GDSII and OASIS (SEMI P39 standard); multiple free-software and commercial viewers/editors/libraries for these formats are available.
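
    As a rough illustration of the "map" idea, the sketch below pulls polygons of a chosen layer from a GDSII design so their coordinates could be matched against the prober's camera view. The library choice (the open-source gdspy package) and the file and layer names are assumptions, not part of the original work:

```python
# Sketch: extract polygons of a chosen layer from a GDSII design so that
# test-structure outlines can be plotted over the wafer prober camera view.
# gdspy is one freely available GDSII library; names below are illustrative.
import gdspy

def test_structure_outlines(gds_path, layer, datatype=0):
    lib = gdspy.GdsLibrary(infile=gds_path)
    top = lib.top_level()[0]                    # assume a single top-level cell
    polygons = top.get_polygons(by_spec=True)   # dict keyed by (layer, datatype)
    return polygons.get((layer, datatype), [])  # list of Nx2 coordinate arrays

# Usage (hypothetical file name and layer number):
# outlines = test_structure_outlines("maskset.gds", layer=63)
```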

  11. Augmented Reality Comes to Physics

    ERIC Educational Resources Information Center

    Buesing, Mark; Cook, Michael

    2013-01-01

    Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as…

  12. Augmented Reality Comes to Physics

    NASA Astrophysics Data System (ADS)

    Buesing, Mark; Cook, Michael

    2013-04-01

    Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS, and mobile apps such as Tagwhat and Star Chart (a must for astronomy class). The yellow line marking first downs in a televised football game and the enhanced puck that makes televised hockey easier to follow both use augmented reality to do the job.

  13. Augmented Reality Tower Technology Assessment

    NASA Technical Reports Server (NTRS)

    Reisman, Ronald J.; Brown, David M.

    2009-01-01

    Augmented Reality technology may help improve Air Traffic Control Tower efficiency and safety during low-visibility conditions. This paper presents the assessments of five off-duty controllers who 'shadow-controlled' with an augmented reality prototype in their own facility. Initial studies indicated unanimous agreement that this technology is potentially beneficial, though the prototype used in the study was not adequate for operational use. Some controllers agreed that augmented reality technology improved situational awareness, had potential to benefit clearance, control, and coordination tasks and duties, and could be very useful for acquiring aircraft and weather information, particularly aircraft location, heading, and identification. The strongest objections to the prototype used in this study were directed at aircraft registration errors, unacceptable optical transparency, insufficient display performance in sunlight, inadequate representation of the static environment, and insufficient symbology.

  14. Wireless Augmented Reality Prototype (WARP)

    NASA Technical Reports Server (NTRS)

    Devereaux, A. S.

    1999-01-01

    Initiated in January, 1997, under NASA's Office of Life and Microgravity Sciences and Applications, the Wireless Augmented Reality Prototype (WARP) is a means to leverage recent advances in communications, displays, imaging sensors, biosensors, voice recognition and microelectronics to develop a hands-free, tetherless system capable of real-time personal display and control of computer system resources. Using WARP, an astronaut may efficiently operate and monitor any computer-controllable activity inside or outside the vehicle or station. The WARP concept is a lightweight, unobtrusive heads-up display with a wireless wearable control unit. Connectivity to the external system is achieved through a high-rate radio link from the WARP personal unit to a base station unit installed into any system PC. The radio link has been specially engineered to operate within the high-interference, high-multipath environment of a space shuttle or space station module. Through this virtual terminal, the astronaut will be able to view and manipulate imagery, text or video, using voice commands to control the terminal operations. WARP's hands-free access to computer-based instruction texts, diagrams and checklists replaces juggling manuals and clipboards, and tetherless computer system access allows free motion throughout a cabin while monitoring and operating equipment.

  15. A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education

    NASA Astrophysics Data System (ADS)

    Mejías Borrero, A.; Andújar Márquez, J. M.

    2012-10-01

    Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real world, which can generate specific problems in laboratory classes. Current proposals for virtual labs (VL) and remote labs (RL) neither cover the new needs properly nor contribute remarkable improvements over traditional labs, except that they favor distance training. Therefore, online teaching and learning in lab practices demand a further step beyond current VL and RL. This paper poses a new reality and new teaching/learning concepts in the field of lab practices in engineering. The developed augmented reality-based lab system (augmented remote lab, ARL) enables teachers and students to work remotely (Internet/intranet) in current CL, including virtual elements which interact with real ones. An educational experience was conducted to assess the developed ARL with the participation of a group of 10 teachers and another group of 20 students. Both groups completed lab practices covering content from the subjects Digital Systems and Robotics and Industrial Automation, which belong to the second year of the new degree in Electronic Engineering (adapted to the European Space for Higher Education). The labs were carried out by means of three different options: CL, VL and ARL. After completion, both groups were asked to fill in questionnaires aimed at measuring the improvement contributed by ARL relative to CL and VL. Except in some specific questions, the opinions of teachers and students were rather similar and positive regarding the use and possibilities of ARL. Although the results are still preliminary and need further study, they suggest that ARL remarkably improves on the possibilities of current VL and RL.

  16. Introduction to augmented and virtual reality

    NASA Astrophysics Data System (ADS)

    Caudell, Thomas P.

    1995-12-01

    This paper introduces the field of augmented reality as a prologue to the body of papers in the remainder of this session. I describe the use of head-mounted display technologies to improve the efficiency and quality of human workers in their performance of engineering design, manufacturing, construction, testing, and maintenance activities. This technology is used to 'augment' the visual field of the wearer with information necessary for the performance of the current task. The enabling technology is head-up (see-through) display headsets (HUDsets) combined with head position sensing, real-world registration systems, and database access software. A primary difference between virtual reality (VR) and 'augmented reality' (AR) is in the complexity of the perceived graphical objects. In AR systems, only simple wire frames, template outlines, designators, and text are displayed. An immediate result of this difference is that augmented reality systems can be driven by standard and inexpensive microprocessors. Many research issues must be addressed before this technology can be widely used, including tracking and registration, human 3D perception and reasoning, and human task performance issues.

  17. Augmented reality building operations tool

    DOEpatents

    Brackney, Larry J.

    2014-09-09

    A method (700) for providing an augmented reality operations tool to a mobile client (642) positioned in a building (604). The method (700) includes, with a server (660), receiving (720) from the client (642) an augmented reality request for building system equipment (612) managed by an energy management system (EMS) (620). The method (700) includes transmitting (740) a data request for the equipment (612) to the EMS (620) and receiving (750) building management data (634) for the equipment (612). The method (700) includes generating (760) an overlay (656) with an object created based on the building management data (634), which may be sensor data, diagnostic procedures, or the like. The overlay (656) is configured for concurrent display on a display screen (652) of the client (642) with a real-time image of the building equipment (612). The method (700) includes transmitting (770) the overlay (656) to the client (642).
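
    The claimed method is a client/server flow: the client requests augmentation for identified equipment, the server queries the EMS, and an overlay built from the management data is returned. A minimal sketch of that flow follows, with all class and field names hypothetical, since the patent prescribes no particular implementation language or API:

```python
# Illustrative server-side flow for the claimed method: request -> EMS query ->
# overlay generation -> response. All names are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class AugmentationRequest:
    equipment_id: str        # building system equipment identified by the mobile client

@dataclass
class Overlay:
    equipment_id: str
    annotations: dict        # e.g. sensor readings, diagnostic procedures

def handle_request(request, ems_client):
    # Query the energy management system for current data on the equipment.
    management_data = ems_client.get_equipment_data(request.equipment_id)
    # Build an overlay object the client can composite with its live camera image.
    return Overlay(
        equipment_id=request.equipment_id,
        annotations={
            "sensor_data": management_data.get("sensors", {}),
            "diagnostics": management_data.get("diagnostics", []),
        },
    )
```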

  18. Augmented reality using GPS

    NASA Astrophysics Data System (ADS)

    Kim, Juwan; Kim, Haedong; Jang, Byungtae; Kim, Jungsik; Kim, Donghyun

    1998-04-01

    This paper describes a prototype system being developed that uses GPS (Global Positioning System) as a tracker in order to combine real images with virtual geographical images in real time. To cover long distances, the system is built using a monitor-based configuration and divided into two parts. One is the real-scene acquisition system, which includes a vehicle, a wireless CCD camera, a GPS attitude determination device and a wireless data communication device. The other is the processing and visualization system, which includes a wireless data communication device and a PC with a video overlay card and a 3D graphics accelerator. The pilot area of the current system is part of SERI (Systems Engineering Research Institute), the institute where we work. Virtual objects are generated from 3D modeling data of the main building, a planned new building, and other structures at SERI. The wireless CCD camera attached to the vehicle acquires the real scenes, and the GPS attitude determination device produces the camera's position and orientation data. This information is then transmitted over the air to the processing and visualization system, where virtual images are rendered using the received information and combined with the real scenes. Applications include an enhanced bird's-eye view and disaster rescue work, such as after an earthquake.
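
    Once the camera's position and orientation are known from the GPS attitude determination device, each vertex of the virtual building model can be projected into the live frame with a standard pinhole camera model. A minimal sketch of that projection step (NumPy; the intrinsics and pose conventions are assumptions, since the abstract does not describe the rendering pipeline):

```python
# Sketch: project a world-coordinate point into the camera image using the
# pose reported by the GPS attitude determination device (pinhole model).
import numpy as np

def project_point(point_world, R_cam, t_cam, K):
    """R_cam, t_cam: camera rotation (3x3) and position (3,) in world frame.
    K: 3x3 intrinsic matrix. Returns pixel coordinates (u, v) or None."""
    p_cam = R_cam.T @ (point_world - t_cam)   # world -> camera coordinates
    if p_cam[2] <= 0:
        return None                           # point is behind the camera
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Virtual geometry rendered at these pixel locations is then overlaid on the
# frame received from the wireless CCD camera.
```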

  19. Augmenting Your Own Reality: Student Authoring of Science-Based Augmented Reality Games

    ERIC Educational Resources Information Center

    Klopfer, Eric; Sheldon, Josh

    2010-01-01

    Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent…

  20. Webizing mobile augmented reality content

    NASA Astrophysics Data System (ADS)

    Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun

    2014-01-01

    This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.

  1. Augmented reality in medical education?

    PubMed

    Kamphuis, Carolien; Barsom, Esther; Schijven, Marlies; Christoph, Noor

    2014-09-01

    Learning in the medical domain is to a large extent workplace learning and involves mastery of complex skills that require performance up to professional standards in the work environment. Since training in this real-life context is not always possible for reasons of safety, costs, or didactics, alternative ways are needed to achieve clinical excellence. Educational technology and more specifically augmented reality (AR) has the potential to offer a highly realistic situated learning experience supportive of complex medical learning and transfer. AR is a technology that adds virtual content to the physical real world, thereby augmenting the perception of reality. Three examples of dedicated AR learning environments for the medical domain are described. Five types of research questions are identified that may guide empirical research into the effects of these learning environments. Up to now, empirical research mainly appears to focus on the development, usability and initial implementation of AR for learning. Limited review results reflect the motivational value of AR, its potential for training psychomotor skills and the capacity to visualize the invisible, possibly leading to enhanced conceptual understanding of complex causality. PMID:24464832

  2. Enhancing Education through Mobile Augmented Reality

    ERIC Educational Resources Information Center

    Joan, D. R. Robert

    2015-01-01

    In this article, the author discusses Mobile Augmented Reality and how it can enhance education. The aim of the present study was to give some general information about mobile augmented reality, which helps to boost education. The purpose of the current study reveals the mobile networks which are used on the institution campus as well…

  3. Augmented Reality for Close Quarters Combat

    ScienceCinema

    None

    2014-06-23

    Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  4. Augmented Reality for Close Quarters Combat

    SciTech Connect

    2013-09-20

    Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  5. Augmented Reality and Mobile Art

    NASA Astrophysics Data System (ADS)

    Gwilt, Ian

    The combined notions of augmented reality (AR) and mobile art are based on the amalgamation of a number of enabling technologies, including computer imaging, emergent display and tracking systems, and the increased computing power of hand-held devices such as Tablet PCs, smart phones and personal digital assistants (PDAs), which have been utilized in the making of works of art. There is much published research on the technical aspects of AR and the ongoing work being undertaken in the development of faster, more efficient AR systems [1] [2]. In this text I intend to concentrate on how AR and its associated typologies can be applied in the context of new media art practices, with particular reference to its application on hand-held or mobile devices.

  6. Augmented reality: past, present, future

    NASA Astrophysics Data System (ADS)

    Inzerillo, Laura

    2013-03-01

    A great opportunity has made it possible to carry out cultural, historical, architectural and social research of significant international interest: the realization of a museum whose main theme is the visit and discovery of a monument of great prestige, the monumental "Steri" building in Palermo. The museum is divided into sub-themes, including one that has aroused so much international interest that an application has been submitted to include the museum in the UNESCO cultural heritage list: a museum path through the cells of the Inquisition, which are located inside some of the buildings of the monumental complex. The project as a whole brings together the various competences involved: history, chemistry, architecture, topography, drawing, representation, virtual communication and informatics. The museum will be the sum of the results of all these disciplines. This research deals with methodology, implementation, fruition, the virtual museum, goals, 2D graphic restitution, effects on cultural heritage and the landscape environment, augmented reality, 2D and 3D surveying, hi-touch screens, photogrammetric and photographic survey, representation, 3D drawing and more.

  7. Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory

    ERIC Educational Resources Information Center

    Andujar, J. M.; Mejias, A.; Marquez, M. A.

    2011-01-01

    Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and are ill-suited to completely non-classroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…

  8. Location-Based Learning through Augmented Reality

    ERIC Educational Resources Information Center

    Chou, Te-Lien; Chanlin, Lih-Juan

    2014-01-01

    A context-aware and mixed-reality exploring tool can not only effectively provide an information-rich environment to users, but also allow them to quickly utilize useful resources and enhance environment awareness. This study integrates Augmented Reality (AR) technology into smartphones to create a stimulating learning experience at a university…

  9. ARSC: Augmented Reality Student Card--An Augmented Reality Solution for the Education Field

    ERIC Educational Resources Information Center

    El Sayed, Neven A. M.; Zayed, Hala H.; Sharawy, Mohamed I.

    2011-01-01

    Augmented Reality (AR) is the technology of adding virtual objects to real scenes, enabling the addition of missing information to real life. As the lack of resources is a problem that can be solved through AR, this paper presents and explains the usage of AR technology. We introduce the Augmented Reality Student Card (ARSC) as an application of…

  10. Key technologies of outdoor augmented reality GIS

    NASA Astrophysics Data System (ADS)

    Ren, Fu; Du, Qingyun; Wu, Xueling

    2008-12-01

    Augmented Reality (AR) is a growing research area in virtual reality that generates a composite view for the user: a combination of the real scene viewed by the user and a virtual scene generated by the computer, which augments the scene with additional information. About 80 percent of the information in the real world is related to spatial location. The combination of Geographical Information Systems (GIS) and AR technologies would promote the development of outdoor AR systems, and would also open a new research direction for GIS. The key technologies of outdoor augmented reality GIS, including basic tracking methods, display devices, typical applications and registration processes, are discussed. In the closed environments of indoor augmented reality, tracking position and head orientation, as well as presenting information, is much less problematic than the same task in an outdoor environment. The main application task of outdoor augmented reality GIS is the presentation of information to a user while moving through an unknown region; the system helps to automatically detect objects in sight of a person who needs information about them. The paper compares conventional solutions for 3D registration, discussing their algorithmic procedures and basic parameters to bring out their advantages and disadvantages under different conditions. The affine transformation approach draws on ideas from computer graphics and vision; its accuracy is mainly based on the precision and speed with which scene feature points are extracted from natural or artificial features.
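
    The affine-transformation approach to registration estimates a planar affine map from feature points matched between the model and the camera view. A least-squares sketch of that estimation step (NumPy), as a simplified stand-in for the full registration procedures compared in the paper:

```python
# Sketch: estimate a 2D affine transform [A|b] mapping model points to image
# points from matched feature correspondences, by linear least squares.
import numpy as np

def estimate_affine(src, dst):
    """src, dst: Nx2 arrays of matched points (N >= 3). Returns a 2x3 matrix."""
    n = src.shape[0]
    # Design matrix for x' = a11*x + a12*y + tx and y' = a21*x + a22*y + ty.
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src
    M[0::2, 2] = 1.0
    M[1::2, 3:5] = src
    M[1::2, 5] = 1.0
    rhs = dst.reshape(-1)
    params, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return params.reshape(2, 3)

# As the paper notes, accuracy depends on how precisely and quickly the
# feature points can be extracted from the scene.
```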

  11. CARE: Creating Augmented Reality in Education

    ERIC Educational Resources Information Center

    Latif, Farzana

    2012-01-01

    This paper explores how Augmented Reality using mobile phones can enhance teaching and learning in education. It specifically examines its application in two cases, where it is identified that the agility of mobile devices and the ability to overlay context specific resources offers opportunities to enhance learning that would not otherwise exist.…

  12. Get Real: Augmented Reality for the Classroom

    ERIC Educational Resources Information Center

    Mitchell, Rebecca; DeBay, Dennis

    2012-01-01

    Kids love augmented reality (AR) simulations because they are like real-life video games. AR simulations allow students to learn content while collaborating face to face and interacting with a multimedia-enhanced version of the world around them. Although the technology may seem advanced, AR software makes it easy to develop content-based…

  13. Intelligent Augmented Reality Training for Motherboard Assembly

    ERIC Educational Resources Information Center

    Westerfield, Giles; Mitrovic, Antonija; Billinghurst, Mark

    2015-01-01

    We investigate the combination of Augmented Reality (AR) with Intelligent Tutoring Systems (ITS) to assist with training for manual assembly tasks. Our approach combines AR graphics with adaptive guidance from the ITS to provide a more effective learning experience. We have developed a modular software framework for intelligent AR training…

  14. Design Principles for Augmented Reality Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt

    2014-01-01

    Augmented reality is an emerging technology that utilizes mobile, context-aware devices (e.g., smartphones, tablets) that enable participants to interact with digital information embedded within the physical environment. This overview of design principles focuses on specific strategies that instructional designers can use to develop AR learning…

  15. Personalized augmented reality for anatomy education.

    PubMed

    Ma, Meng; Fallavollita, Pascal; Seelbach, Ina; Von Der Heide, Anna Maria; Euler, Ekkehard; Waschke, Jens; Navab, Nassir

    2016-05-01

    Anatomy education is a challenging but vital element in forming future medical professionals. In this work, a personalized and interactive augmented reality system is developed to facilitate education. This system behaves as a "magic mirror" which allows personalized in-situ visualization of anatomy on the user's body. Real-time volume visualization of a CT dataset creates the illusion that the user can look inside their body. The system comprises an RGB-D sensor as a real-time tracking device to detect the user moving in front of a display. In addition, the magic mirror system shows text information, medical images, and 3D models of organs that the user can interact with. Through the participation of 7 clinicians and 72 students, two user studies were designed to respectively assess the precision and acceptability of the magic mirror system for education. The results of the first study demonstrated that the average precision of the augmented reality overlay on the user's body was 0.96 cm, while the results of the second study indicate 86.1% approval for the educational value of the magic mirror, and 91.7% approval for the augmented reality capability of displaying organs in three dimensions. The usefulness of this unique type of personalized augmented reality technology has been demonstrated in this paper. PMID:26646315

  16. Opportunistic tangible user interfaces for augmented reality.

    PubMed

    Henderson, Steven; Feiner, Steven

    2010-01-01

    Opportunistic Controls are a class of user interaction techniques that we have developed for augmented reality (AR) applications to support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. By leveraging characteristics of these affordances to provide passive haptics that ease gesture input, Opportunistic Controls simplify gesture recognition, and provide tangible feedback to the user. In this approach, 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons can be mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of two user studies. In the first, participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique. In the second, participants proposed and demonstrated user interfaces incorporating Opportunistic Controls for two domains, allowing us to gain additional insights into how user interfaces featuring Opportunistic Controls might be designed. PMID:19910657

  17. Spacecraft 3D Augmented Reality Mobile App

    NASA Technical Reports Server (NTRS)

    Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.

    2013-01-01

    The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.

  18. Augmented reality in surgical procedures

    NASA Astrophysics Data System (ADS)

    Samset, E.; Schmalstieg, D.; Vander Sloten, J.; Freudenthal, A.; Declerck, J.; Casciaro, S.; Rideng, Ø.; Gersak, B.

    2008-02-01

    Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stay-time by allowing faster recovery and offers substantially improved cost-effectiveness for the hospital and the society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand co-ordination is not based on direct vision, but more predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision support tools for MIT by augmenting visual and sensorial feedback. We will present tools based on novel concepts in visualization, robotics and haptics providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver-tumors, laparoscopic liver surgery and minimally invasive cardiac surgery will be presented. Demonstrators were developed with the aim to provide a seamless workflow for the clinical user conducting image-guided therapy.

  19. Augmented reality visualization for thoracoscopic spine surgery

    NASA Astrophysics Data System (ADS)

    Sauer, Frank; Vogt, Sebastian; Khamene, Ali; Heining, Sandro; Euler, Ekkehard; Schneberger, Marc; Zuerl, Konrad; Mutschler, Wolf

    2006-03-01

    We are developing an augmented reality (AR) image guidance system in which information derived from medical images is overlaid onto a video view of the patient. The centerpiece of the system is a head-mounted display custom fitted with two miniature color video cameras that capture the stereo view of the scene. Medical graphics is overlaid onto the video view and appears firmly anchored in the scene, without perceivable time lag or jitter. We have been testing the system for different clinical applications. In this paper we discuss minimally invasive thoracoscopic spine surgery as a promising new orthopedic application. In the standard approach, the thoracoscope - a rigid endoscope - provides visual feedback for the minimally invasive procedure of removing a damaged disc and fusing the two neighboring vertebrae. The navigation challenges are twofold. From a global perspective, the correct vertebrae on the spine have to be located with the inserted instruments. From a local perspective, the actual spine procedure has to be performed precisely. Visual feedback from the thoracoscope provides only limited support for both of these tasks. In the augmented reality approach, we give the surgeon additional anatomical context for the navigation. Before the surgery, we derive a model of the patient's anatomy from a CT scan, and during surgery we track the location of the surgical instruments in relation to patient and model. With this information, we can help the surgeon in both the global and local navigation, providing a global map and 3D information beyond the local 2D view of the thoracoscope. Augmented reality visualization is a particularly intuitive method of displaying this information to the surgeon. To adapt our augmented reality system to this application, we had to add an external optical tracking system, which works now in combination with our head-mounted tracking camera. The surgeon's feedback to the initial phantom experiments is very positive.

  20. Service connectivity architecture for mobile augmented reality

    NASA Astrophysics Data System (ADS)

    Turunen, Tuukka; Pyssysalo, Tino; Roening, Juha

    2001-06-01

    Mobile augmented reality can be utilized in a number of different services, and it provides a lot of added value compared to the interfaces used in mobile multimedia today. Intelligent service connectivity architecture is needed for the emerging commercial mobile augmented reality services, to guarantee mobility and interoperability on a global scale. Some of the key responsibilities of this architecture are to find suitable service providers, to manage the connection with and utilization of such providers, and to allow smooth switching between them whenever the user moves out of the service area of the service provider she is currently connected to. We have studied the potential support technologies for such architectures and propose a way to create an intelligent service connectivity architecture based on current and upcoming wireless networks, an Internet backbone, and mechanisms to manage service connectivity in the upper layers of the protocol stack. In this paper, we explain the key issues of service connectivity, describe the properties of our architecture, and analyze the functionality of an example system. Based on these, we consider our proposition a good solution to the quest for global interoperability in mobile augmented reality services.

  1. Telescopic multi-resolution augmented reality

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Frenchi, Christopher; Szu, Harold

    2014-05-01

    To ensure a self-consistent scaling approximation, the underlying microscopic fluctuation components can naturally influence macroscopic means, which may give rise to emergent observable phenomena. In this paper, we describe a consistent macroscopic (cm-scale), mesoscopic (micron-scale), and microscopic (nano-scale) approach to introduce Telescopic Multi-Resolution (TMR) into current Augmented Reality (AR) visualization technology. We propose to couple TMR-AR by introducing an energy-matter interaction engine framework that is based on known physics, biology, and chemistry principles. An immediate payoff of TMR-AR is a self-consistent approximation of the interaction between microscopic observables and their direct effect on the macroscopic system that is driven by real-world measurements. Such an interdisciplinary approach enables not only multi-scale, telescopic visualization of real and virtual information but also the conduct of thought experiments through AR. As a result of this consistency, the framework allows us to explore a large-dimensionality parameter space of measured and unmeasured regions. Toward this end, we explore how to build learnable libraries of biological, physical, and chemical mechanisms. Fusing analytical sensors with TMR-AR libraries provides a robust framework to optimize testing and evaluation through data-driven or virtual synthetic simulations. Visualizing mechanisms of interactions requires identification of observable image features that can indicate the presence of information at multiple spatial and temporal scales of analog data. The AR methodology was originally developed to enhance pilot training as well as 'make-believe' entertainment in a user-friendly digital environment. We believe TMR-AR can someday help us conduct thought experiments scientifically, to be pedagogically visualized in zoom-in-and-out, consistent, multi-scale approximations.

  2. Toward natural fiducials for augmented reality

    NASA Astrophysics Data System (ADS)

    Kitchin, Paul; Martinez, Kirk

    2005-03-01

    Augmented Reality (AR) requires a mapping between the camera(s) and the world, so that virtual objects can be correctly registered. Current AR applications either use pre-prepared fiducial markers or specialist equipment or impose significant constraints on lighting and background. Each of these approaches has significant drawbacks. Fiducial markers are susceptible to loss or damage, can be awkward to work with and may require significant effort to prepare an area for Augmented interaction. Use of such markers may also present an imposition to non-augmented observers, especially in environments such as museums or historical landmarks. Specialist equipment is expensive and not universally available. Lighting and background constraints are often impractical for real-world applications. This paper presents initial results in using the palm of the hand as a pseudo-fiducial marker in a natural real-world environment, through colour, feature and edge analysis. The eventual aim of this research is to enable fiducial marker cards to be dispensed with entirely in some situations in order to allow more natural interaction in Augmented environments. Examples of this would be allowing users to "hold" virtual 3D objects in the palm of their hand or use gestures to interact with virtual objects.
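
    Of the three analyses mentioned (colour, feature and edge), the colour stage can be illustrated with a coarse skin-colour segmentation in HSV space. The sketch below uses OpenCV with illustrative threshold values that are assumptions, not those of the paper:

```python
# Sketch: coarse skin-colour segmentation as a first step toward using the
# palm as a pseudo-fiducial. HSV bounds are rough, illustrative values.
import cv2
import numpy as np

def skin_mask(bgr_frame):
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # hue/sat/val lower bound (assumed)
    upper = np.array([25, 180, 255], dtype=np.uint8)  # upper bound (assumed)
    mask = cv2.inRange(hsv, lower, upper)
    # Clean the mask before running feature and edge analysis on the palm region.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```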

  3. NASA's Wireless Augmented Reality Prototype (WARP)

    NASA Astrophysics Data System (ADS)

    Agan, Martin; Voisinet, Leeann; Devereaux, Ann

    1998-01-01

    The objective of the Wireless Augmented Reality Prototype (WARP) effort is to develop and integrate advanced technologies for real-time personal display of information relevant to the health and safety of space station/shuttle personnel. The WARP effort will develop and demonstrate technologies that will ultimately be incorporated into operational Space Station systems and that have potential earth applications such as aircraft pilot alertness monitoring and various medical and consumer environments where augmented reality is required. To this end, a two-phase effort will be undertaken to rapidly develop a prototype (Phase I) and an advanced prototype (Phase II) to demonstrate the following key technology features that could be applied to astronaut internal vehicle activity (IVA) and potentially external vehicle activity (EVA) as well: 1) mobile visualization, and 2) distributed information system access. Specifically, Phase I will integrate a low-power, miniature wireless communication link and a commercial biosensor with a head-mounted display. The Phase I design will emphasize the development of a relatively small, lightweight, and unobtrusive body-worn prototype system. Phase II will put increased effort on miniaturization, power consumption reduction, increased throughput, higher resolution, and "wire removal" of the subsystems developed in Phase I.

  4. The Local Games Lab ABQ: Homegrown Augmented Reality

    ERIC Educational Resources Information Center

    Holden, Christopher

    2014-01-01

    Experiments in the use of augmented reality games formerly required extensive material resources and expertise to implement above and beyond what might be possible within the usual educational contexts. Currently, the more common availability of hardware in these contexts and the existence of easy-to-use, general purpose augmented reality design…

  5. Augmenting a Child's Reality: Using Educational Tablet Technology

    ERIC Educational Resources Information Center

    Tanner, Patricia; Karas, Carly; Schofield, Damian

    2014-01-01

    This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…

  6. On Location Learning: Authentic Applied Science with Networked Augmented Realities

    ERIC Educational Resources Information Center

    Rosenbaum, Eric; Klopfer, Eric; Perry, Judy

    2007-01-01

    The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is…

  7. Image-processing with augmented reality (AR)

    NASA Astrophysics Data System (ADS)

    Babaei, Hossein R.; Mohurutshe, Pagiel L.; Habibi Lashkari, Arash

    2013-03-01

    In this project, the aim is to discuss and articulate the intent to create an image-based Android application. The basis of this study is real-time image detection and processing, a convenient new approach that allows users to gain information on imagery right on the spot. Past studies have revealed attempts to create image-based applications, but these have only gone as far as image finders that work with images already stored in some form of database. The Android platform is rapidly spreading around the world and provides by far the most interactive and technical platform for smartphones, which is why it was important to base the study and research on it. Augmented Reality allows the user to manipulate the data and add enhanced features (video, GPS tags) to the image taken.

  8. Reconfigurable hardware for an augmented reality application

    NASA Astrophysics Data System (ADS)

    Toledo Moreo, F. Javier; Martinez Alvarez, J. Javier; Garrigos Guerrero, F. Javier; Ferrandez Vicente, J. Manuel

    2005-06-01

    An FPGA-based approach is proposed to build an augmented reality system in order to aid people affected by a visual disorder known as tunnel vision. The aim is to increase the user's knowledge of his environment by superimposing on his own view useful information obtained with image processing. Two different alternatives have been explored to perform the required image processing: a specific-purpose algorithm to extract edge detection information, and a cellular neural network with a suitable template. Their implementations in reconfigurable hardware aim to take advantage of the performance and flexibility offered by modern FPGAs. This paper describes the hardware implementation of both the Canny algorithm and the cellular neural network, and the overall system architecture. Results of the implementations and examples of the system functionality are presented.
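
    The Canny path of the system can be illustrated in software. The sketch below reproduces the edge-information overlay with OpenCV; the paper's implementation runs this pipeline in reconfigurable hardware, and the thresholds here are illustrative only:

```python
# Sketch: the edge-information overlay in software form. The FPGA system in the
# paper implements this step in hardware; thresholds here are illustrative.
import cv2

def edge_overlay(bgr_frame, low=50, high=150):
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)
    overlay = bgr_frame.copy()
    overlay[edges > 0] = (0, 255, 0)   # draw detected edges in green over the user's view
    return overlay
```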

  9. Temporal Coherence Strategies for Augmented Reality Labeling.

    PubMed

    Madsen, Jacob Boesen; Tatzgern, Markus; Madsen, Claus B; Schmalstieg, Dieter; Kalkofen, Denis

    2016-04-01

    Temporal coherence of annotations is an important factor in augmented reality user interfaces and for information visualization. In this paper, we empirically evaluate four different techniques for annotation. Based on these findings, we follow up with subjective evaluations in a second experiment. Results show that presenting annotations in object space or image space leads to a significant difference in task performance. Furthermore, there is a significant interaction between rendering space and update frequency of annotations. Participants improve significantly in locating annotations, when annotations are presented in object space, and view management update rate is limited. In a follow-up experiment, participants appear to be more satisfied with limited update rate in comparison to a continuous update rate of the view management system. PMID:26780810

  10. Visualizing Sea Level Rise with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Kintisch, E. S.

    2013-12-01

    Looking Glass is an application on the iPhone that visualizes in 3-D future scenarios of sea level rise, overlaid on live camera imagery in situ. Using a technology known as augmented reality, the app allows a layperson user to explore various scenarios of sea level rise using a visual interface. Then the user can see, in an immersive, dynamic way, how those scenarios would affect a real place. The first part of the experience activates users' cognitive, quantitative thinking process, teaching them how global sea level rise, tides and storm surge contribute to flooding; the second allows an emotional response to a striking visual depiction of possible future catastrophe. This project represents a partnership between a science journalist, MIT, and the Rhode Island School of Design, and the talk will touch on lessons this project provides on structuring and executing such multidisciplinary efforts on future design projects.

  11. Extended overview techniques for outdoor augmented reality.

    PubMed

    Veas, Eduardo; Grasset, Raphaël; Kruijff, Ernst; Schmalstieg, Dieter

    2012-04-01

    In this paper, we explore techniques that aim to improve site understanding for outdoor Augmented Reality (AR) applications. While the first person perspective in AR is a direct way of filtering and zooming on a portion of the data set, it severely narrows overview of the situation, particularly over large areas. We present two interactive techniques to overcome this problem: multi-view AR and variable perspective view. We describe in detail the conceptual, visualization and interaction aspects of these techniques and their evaluation through a comparative user study. The results we have obtained strengthen the validity of our approach and the applicability of our methods to a large range of application domains. PMID:22402683

  12. Augmented Reality as a Countermeasure for Sleep Deprivation.

    PubMed

    Baumeister, James; Dorrian, Jillian; Banks, Siobhan; Chatburn, Alex; Smith, Ross T; Carskadon, Mary A; Lushington, Kurt; Thomas, Bruce H

    2016-04-01

    Sleep deprivation is known to have serious deleterious effects on executive functioning and job performance. Augmented reality has an ability to place pertinent information at the fore, guiding visual focus and reducing instructional complexity. This paper presents a study to explore how spatial augmented reality instructions impact procedural task performance on sleep deprived users. The user study was conducted to examine performance on a procedural task at six time points over the course of a night of total sleep deprivation. Tasks were provided either by spatial augmented reality-based projections or on an adjacent monitor. The results indicate that participant errors significantly increased with the monitor condition when sleep deprived. The augmented reality condition exhibited a positive influence with participant errors and completion time having no significant increase when sleep deprived. The results of our study show that spatial augmented reality is an effective sleep deprivation countermeasure under laboratory conditions. PMID:26780802

  13. A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education

    ERIC Educational Resources Information Center

    Borrero, A. Mejias; Marquez, J. M. Andujar

    2012-01-01

    Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real…

  14. 3D augmented reality with integral imaging display

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-06-01

    In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.
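
    The merging algorithm is described only at a high level in this abstract. Below is a toy compositing sketch in NumPy, assuming per-pixel depth maps and a transparency value as stand-ins for the paper's pseudoscopic-to-orthoscopic pipeline:

```python
# Sketch: merge real and virtual elemental image arrays by per-pixel depth test
# and transparency. A toy stand-in for the merging algorithm described above.
import numpy as np

def merge_elemental_images(real_rgb, real_depth, virt_rgb, virt_depth, virt_alpha):
    """real_rgb, virt_rgb: HxWx3 arrays laid out as the elemental image array.
    real_depth, virt_depth: HxW depth maps. virt_alpha: scalar or HxW array in [0, 1]."""
    out = real_rgb.astype(np.float32).copy()
    # Virtual content wins where it is closer to the capture plane than the real scene.
    closer = virt_depth < real_depth
    alpha = np.where(closer, virt_alpha, 0.0)[..., None]
    out = (1.0 - alpha) * out + alpha * virt_rgb.astype(np.float32)
    return out.astype(np.uint8)
```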

  15. LCD masks for spatial augmented reality

    NASA Astrophysics Data System (ADS)

    Smithwick, Quinn Y. J.; Reetz, Daniel; Smoot, Lanny

    2014-03-01

    One aim of Spatial Augmented Reality is to visually integrate synthetic objects into real-world spaces amongst physical objects, viewable by many observers without 3D glasses, head-mounted displays or mobile screens. In common implementations, using beam-combiners, scrim projection, or transparent self-emissive displays, the synthetic object's and real-world scene's light combine additively. As a result, synthetic objects appear low-contrast and semitransparent against well-lit backgrounds, and do not cast shadows. These limitations prevent synthetic objects from appearing solid and visually integrated into the real-world space. We use a transparent LCD panel as a programmable dynamic mask. The LCD panel displaying the synthetic object's silhouette mask is colocated with the object's color image, both staying aligned for all points-of-view. The mask blocks the background providing occlusion, presents a black level for high-contrast images, blocks scene illumination thus casting true shadows, and prevents blow-by in projection scrim arrangements. We have several implementations of SAR with LCD masks: 1) beam-combiner with an LCD mask, 2) scrim projection with an LCD mask, and 3) transparent OLED display with an LCD mask. Large format (80" diagonal) and dual layer volumetric variations are also implemented.

  16. Directing driver attention with augmented reality cues

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Gavin, Patrick; Lee, John D.; Dawson, Jeffrey D.; Vecera, Shaun; Rizzo, Matthew

    2013-01-01

    This simulator study evaluated the effects of augmented reality (AR) cues designed to direct the attention of experienced drivers to roadside hazards. Twenty-seven healthy middle-aged licensed drivers with a range of attention capacity participated in a 54 mile (1.5 hour) drive in an interactive fixed-base driving simulator. Each participant received AR cues to potential roadside hazards in six simulated straight (9 mile long) rural roadway segments. Drivers were evaluated on response time for detecting a potentially hazardous event, detection accuracy for target (hazard) and non-target objects, and headway with respect to the hazards. Results showed no negative outcomes associated with interference. AR cues did not impair perception of non-target objects, including for drivers with lower attentional capacity. Results showed near significant response time benefits for AR cued hazards. AR cueing increased response rate for detecting pedestrians and warning signs but not vehicles. AR system false alarms and misses did not impair driver responses to potential hazards. PMID:24436635

  17. Videometric head tracker for augmented reality applications

    NASA Astrophysics Data System (ADS)

    Janin, Adam L.; Zikan, Karel; Mizell, David; Banner, Mike; Sowizral, Henry A.

    1995-12-01

    For the past three years, we have been developing augmented reality technology for application to a variety of touch labor tasks in aircraft manufacturing and assembly. The system would be worn by factory workers to provide them with better-quality information for performing their tasks than was previously available. Using a see-through head-mounted display (HMD) whose optics are set at a focal length of about 18 in., the display and its associated head tracking system can be used to superimpose and stabilize graphics on the surface of a work piece. This technology would obviate many expensive marking systems now used in aerospace manufacturing. The most challenging technical issue with respect to factory applications of AR is head position and orientation tracking. It requires high-accuracy, long-range tracking in a high-noise environment. The approach we have chosen uses a head-mounted miniature video camera. The user's wearable computer system utilizes the camera to find fiducial markings that have been placed on known coordinates on or near the work piece. The system then computes the user's position and orientation relative to the fiducial marks. It is referred to as a 'videometric' head tracker. In this paper, we describe the steps we took and the results we obtained in the process of prototyping our videometric head tracker, beginning with analytical and simulation results, and continuing through the working prototypes.

  18. Augmented Reality a Review on Technology and Applications

    NASA Astrophysics Data System (ADS)

    Petruse, Radu Emanuil; Bondrea, Ioan

    2014-11-01

    We present in this paper an overview of the concepts behind Augmented Reality and of potentially highly efficient industrial AR applications. We also present the basic technological requirements for an AR system.

  19. Augmented Reality System for E-maintenance Application

    NASA Astrophysics Data System (ADS)

    Benbelkacem, S.; Zenati-Henda, N.; Belhocine, M.; Malek, S.

    2009-03-01

    We present in this paper an Augmented Reality platform for an e-maintenance application. In our case, the aim is not to develop a vision system based on the augmented reality concept, but to show the relationship between the different actors in the proposed architecture and to facilitate maintenance of the machine. With this platform we identify all possible scenarios which allow the technician to intervene on a machine breakdown, with the help of a distant expert if necessary. Each scenario depends on the machine parameters and the technician's competences. To implement a configuration of the Augmented Reality system, we chose a case study of a maintenance scenario for a machine breakdown. We then represent this scenario by an interaction model which allows an Augmented Reality configuration to be established.

  20. Graphical user interface concepts for tactical augmented reality

    NASA Astrophysics Data System (ADS)

    Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve

    2010-04-01

    Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program. Our approach to achieve the objectives of ULTRA-Vis, called iLeader, incorporates a full-color 40° field of view (FOV) see-through holographic waveguide integrated with sensors for full position and head tracking to provide an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark up the 3D battle-space with symbologic identification of graphical control measures, friendly force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational awareness graphical representations are highly intuitive, nondisruptive, and always tactically relevant. We used best human-factors practices, system engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.

  1. Improving Robotic Operator Performance Using Augmented Reality

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles K.; Pace, John W.

    2007-01-01

    The Special Purpose Dexterous Manipulator (SPDM) is a two-armed robot that functions as an extension to the end effector of the Space Station Robotics Manipulator System (SSRMS), currently in use on the International Space Station (ISS). Crew training for the SPDM is accomplished using a robotic hardware simulator, which performs most of SPDM functions under normal static Earth gravitational forces. Both the simulator and SPDM are controlled from a standard robotic workstation using a laptop for the user interface and three monitors for camera views. Most operations anticipated for the SPDM involve the manipulation, insertion, and removal of any of several types of Orbital Replaceable Unit (ORU), modules which control various ISS functions. Alignment tolerances for insertion of the ORU into its receptacle are 0.25 inch and 0.5 degree from nominal values. The pre-insertion alignment task must be performed within these tolerances by using available video camera views of the intrinsic features of the ORU and receptacle, without special registration markings. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit periods of viewing, a successful ORU insertion operation may require an extended period of time. This study explored the feasibility of using augmented reality (AR) to assist SPDM operations. Geometric graphical symbols were overlaid on one of the workstation monitors to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment. Twelve skilled subjects performed eight ORU insertion tasks using the simulator with and without the AR symbols in a repeated measures experimental design. Results indicated that using the AR symbols reduced pre-insertion alignment error for all subjects and reduced the time to complete pre-insertion alignment for most subjects.

  2. Transparent 3D display for augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Byoungho; Hong, Jisoo

    2012-11-01

    Two types of transparent three-dimensional display systems applicable to augmented reality are demonstrated. One of them is a head-mounted-display-type implementation which utilizes the principle of the system adopting the concave floating lens to the virtual-mode integral imaging. Such a configuration has an advantage in that the three-dimensional image can be displayed at a sufficiently far distance, resolving the accommodation conflict with the real-world scene. Incorporating a convex half mirror, which shows partial transparency, instead of the concave floating lens makes it possible to implement the transparent three-dimensional display system. The other type is the projection-type implementation, which is more appropriate for general use than the head-mounted-display-type implementation. Its imaging principle is based on the well-known reflection-type integral imaging. We realize the feature of transparent display by imposing partial transparency on the array of concave mirrors which is used for the screen of reflection-type integral imaging. Two types of configurations, relying on incoherent and coherent light sources, are both possible. For the incoherent configuration, we introduce the concave half mirror array, whereas the coherent one adopts a holographic optical element which replicates the functionality of the lenslet array. Though the projection-type implementation is in principle more beneficial than the head-mounted display, the present state of the art of spatial light modulators still does not provide satisfactory visual quality of the displayed three-dimensional image. Hence we expect that the head-mounted-display-type and projection-type implementations will come to market in sequence.

  3. Augmented Reality, the Future of Contextual Mobile Learning

    ERIC Educational Resources Information Center

    Sungkur, Roopesh Kevin; Panchoo, Akshay; Bhoyroo, Nitisha Kirtee

    2016-01-01

    Purpose: This study aims to show the relevance of augmented reality (AR) in mobile learning for the 21st century. With AR, any real-world environment can be augmented by providing users with accurate digital overlays. AR is a promising technology that has the potential to encourage learners to explore learning materials from a totally new…

  4. Augmented reality aiding collimator exchange at the LHC

    NASA Astrophysics Data System (ADS)

    Martínez, Héctor; Fabry, Thomas; Laukkanen, Seppo; Mattila, Jouni; Tabourot, Laurent

    2014-11-01

    Novel Augmented Reality techniques have the potential to have a large positive impact on the way remote maintenance operations are carried out in hazardous areas, e.g. areas where radiation doses imply careful planning and optimization of maintenance operations. This paper describes an Augmented Reality strategy, system and implementation for aiding the remote collimator exchange in the LHC, currently the world's largest and highest-energy particle accelerator. The proposed system relies on marker detection and multi-modal augmentation in real time. A database system has been used to ensure flexibility. The system has been tested in a mock-up facility, showing real-time performance and great potential for future use in the LHC. The technical-scientific difficulties identified during the development of the system and the proposed solutions described in this paper may help the development of future Augmented Reality systems for remote handling in scientific facilities.

  5. Augmented reality for biomedical wellness sensor systems

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Szu, Harold

    2013-05-01

    Due to the commercial movie and gaming industries, Augmented Reality (AR) technology has matured. By definition of AR, both artificial and real humans can be simultaneously present and realistically interact with one another. With the help of physics and physiology, we can build the AR tool together with real human day-night webcam inputs through a simple interaction of heat transfer (getting hot), action and reaction (walking or falling), as well as physiology (sweating due to activity). Knowing the person's age, weight and the 3D coordinates of joints in the body, we deduce the force, the torque, and the energy expenditure during real human movements and apply them to an AR human model. We wish to support the physics-physiology AR version, PP-AR, as a BMW surveillance tool for seniors home alone (SHA). Its function is to record senior walking and hand movements inside a home environment. Besides the fringe benefit of enabling more visits from grandchildren through AR video games, the PP-AR surveillance tool may serve as a means to screen patients in the home for potential falls at points around the house. Moreover, we anticipate PP-AR may help analyze the behavior history of SHA, e.g. enhancing the Smartphone SHA Ubiquitous Care Program, by discovering early symptoms of candidate Alzheimer-like midnight excursions, or Parkinson-like trembling motion when performing challenging muscular joint movements. Using a set of coordinates corresponding to a set of 3D positions representing human joint locations, we compute the Kinetic Energy (KE) generated by each body segment over time. The Work is then calculated and converted into calories. Using common graphics rendering pipelines, one could invoke AR technology to provide more information about patients to caretakers. Alerts to caretakers can be prompted by a patient's departure from their personal baseline, and the patient's time-ordered joint information can be loaded to a graphics viewer allowing for high
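
    As a rough illustration of the kinetic-energy bookkeeping described above, the sketch below estimates per-segment kinetic energy from tracked 3D joint positions and converts the accumulated work into calories. This is not the authors' implementation; the sampling rate, segment masses and function names are assumptions made for the example.

        import numpy as np

        J_PER_KCAL = 4184.0  # joules per kilocalorie

        def kinetic_energy_per_frame(joints, dt, segment_mass):
            # joints: (T, N, 3) array of N joint positions over T frames, in metres
            # dt: sampling interval in seconds
            # segment_mass: (N,) array of masses (kg) assigned to each joint/segment
            velocities = np.diff(joints, axis=0) / dt        # (T-1, N, 3) in m/s
            speed_sq = np.sum(velocities ** 2, axis=2)       # squared speed per segment
            ke = 0.5 * segment_mass[None, :] * speed_sq      # kinetic energy per segment
            return ke.sum(axis=1)                            # total KE per interval (J)

        # Illustrative use: 4 tracked joints sampled at 30 Hz for 10 seconds
        rng = np.random.default_rng(0)
        joints = np.cumsum(rng.normal(scale=0.01, size=(300, 4, 3)), axis=0)
        masses = np.array([3.0, 3.0, 5.0, 5.0])              # assumed segment masses (kg)
        ke_series = kinetic_energy_per_frame(joints, dt=1 / 30, segment_mass=masses)
        print("mechanical work over the recording: %.2f kcal" % (ke_series.sum() / J_PER_KCAL))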

  6. Gunner Goggles: Implementing Augmented Reality into Medical Education.

    PubMed

    Wang, Leo L; Wu, Hao-Hua; Bilici, Nadir; Tenney-Soeiro, Rebecca

    2016-01-01

    There is evidence that both smartphone and tablet integration into medical education has been lacking. At the same time, there is a niche for augmented reality (AR) to improve this process through the enhancement of textbook learning. Gunner Goggles is an attempt to enhance textbook learning in shelf exam preparatory review with augmented reality. Here we describe our initial prototype and detail the process by which augmented reality was implemented into our textbook through Layar. We describe the unique functionalities of our textbook pages upon augmented reality implementation, which includes links, videos and 3D figures, and surveyed 24 third year medical students for their impression of the technology. Upon demonstrating an initial prototype textbook chapter, 100% (24/24) of students felt that augmented reality improved the quality of our textbook chapter as a learning tool. Of these students, 92% (22/24) agreed that their shelf exam review was inadequate and 19/24 (79%) felt that a completed Gunner Goggles product would have been a viable alternative to their shelf exam review. Thus, while students report interest in the integration of AR into medical education test prep, future investigation into how the use of AR can improve performance on exams is warranted. PMID:27046620

  7. Transforming Polar Research with Google Glass Augmented Reality (Invited)

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; McEniry, M.; Maskey, M.

    2011-12-01

    Augmented reality is a new technology with the potential to accelerate the advancement of science, particularly in geophysical research. Augmented reality is defined as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. When paired with advanced computing techniques on cloud resources, augmented reality has the potential to improve data collection techniques, visualizations, as well as in-situ analysis for many areas of research. Google is currently a pioneer of augmented reality technology and has released beta versions of their wearable computing device, Google Glass, to a select number of developers and beta testers. This community of 'Glass Explorers' is the vehicle from which Google shapes the future of their augmented reality device. Example applications of Google Glass in geophysical research range from use as a data gathering interface in harsh climates to an on-site visualization and analysis tool. Early participation in the shaping of the Google Glass device is an opportunity for researchers to tailor this new technology to their specific needs. The purpose of this presentation is to provide geophysical researchers with a hands-on first look at Google Glass and its potential as a scientific tool. Attendees will be given an overview of the technical specifications as well as a live demonstration of the device. Potential applications to geophysical research in polar regions will be the primary focus. The presentation will conclude with an open call to participate, during which attendees may indicate interest in developing projects that integrate Google Glass into their research. Application Mockup: Penguin Counter Google Glass Augmented Reality Device

  8. Transforming Polar Research with Google Glass Augmented Reality (Invited)

    NASA Astrophysics Data System (ADS)

    Ruthkoski, T.

    2013-12-01

    Augmented reality is a new technology with the potential to accelerate the advancement of science, particularly in geophysical research. Augmented reality is defined as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. When paired with advanced computing techniques on cloud resources, augmented reality has the potential to improve data collection techniques, visualizations, as well as in-situ analysis for many areas of research. Google is currently a pioneer of augmented reality technology and has released beta versions of their wearable computing device, Google Glass, to a select number of developers and beta testers. This community of 'Glass Explorers' is the vehicle from which Google shapes the future of their augmented reality device. Example applications of Google Glass in geophysical research range from use as a data gathering interface in harsh climates to an on-site visualization and analysis tool. Early participation in the shaping of the Google Glass device is an opportunity for researchers to tailor this new technology to their specific needs. The purpose of this presentation is to provide geophysical researchers with a hands-on first look at Google Glass and its potential as a scientific tool. Attendees will be given an overview of the technical specifications as well as a live demonstration of the device. Potential applications to geophysical research in polar regions will be the primary focus. The presentation will conclude with an open call to participate, during which attendees may indicate interest in developing projects that integrate Google Glass into their research. Application Mockup: Penguin Counter Google Glass Augmented Reality Device

  9. Soldier-worn augmented reality system for tactical icon visualization

    NASA Astrophysics Data System (ADS)

    Roberts, David; Menozzi, Alberico; Clipp, Brian; Russler, Patrick; Cook, James; Karl, Robert; Wenger, Eric; Church, William; Mauger, Jennifer; Volpe, Chris; Argenta, Chris; Wille, Mark; Snarski, Stephen; Sherrill, Todd; Lupo, Jasper; Hobson, Ross; Frahm, Jan-Michael; Heinly, Jared

    2012-06-01

    This paper describes the development and demonstration of a soldier-worn augmented reality system testbed that provides intuitive 'heads-up' visualization of tactically-relevant geo-registered icons. Our system combines a robust soldier pose estimation capability with a helmet mounted see-through display to accurately overlay geo-registered iconography (i.e., navigation waypoints, blue forces, aircraft) on the soldier's view of reality. Applied Research Associates (ARA), in partnership with BAE Systems and the University of North Carolina - Chapel Hill (UNC-CH), has developed this testbed system in Phase 2 of the DARPA ULTRA-Vis (Urban Leader Tactical, Response, Awareness, and Visualization) program. The ULTRA-Vis testbed system functions in unprepared outdoor environments and is robust to numerous magnetic disturbances. We achieve accurate and robust pose estimation through fusion of inertial, magnetic, GPS, and computer vision data acquired from helmet kit sensors. Icons are rendered on a high-brightness, 40°×30° field of view see-through display. The system incorporates an information management engine to convert CoT (Cursor-on-Target) external data feeds into mil-standard icons for visualization. The user interface provides intuitive information display to support soldier navigation and situational awareness of mission-critical tactical information.
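
    The core rendering step implied by such a system, turning a geo-registered position into a pixel on the see-through display given the fused head pose, can be sketched as a simple pinhole projection. This is only an illustration under assumed intrinsics and a local ENU world frame, not the ULTRA-Vis implementation.

        import numpy as np

        def project_icon(p_world, R_wc, t_wc, fx, fy, cx, cy):
            # R_wc, t_wc: world-to-camera rotation (3x3) and translation (3,)
            #             from the fused inertial/magnetic/GPS/vision pose estimate
            # fx, fy, cx, cy: assumed pinhole intrinsics of the display's virtual camera
            p_cam = R_wc @ p_world + t_wc
            if p_cam[2] <= 0:              # point is behind the viewer: do not draw it
                return None
            u = fx * p_cam[0] / p_cam[2] + cx
            v = fy * p_cam[1] / p_cam[2] + cy
            return u, v

        # Illustrative call: a waypoint 40 m north of and 2 m above the user,
        # with the camera looking due north (camera x = east, y = down, z = north).
        waypoint = np.array([0.0, 40.0, 2.0])                 # local ENU coordinates (m)
        R = np.array([[1.0, 0.0, 0.0],
                      [0.0, 0.0, -1.0],
                      [0.0, 1.0, 0.0]])
        t = np.zeros(3)
        print(project_icon(waypoint, R, t, fx=800.0, fy=800.0, cx=640.0, cy=360.0))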

  10. Towards Robot teaching based on Virtual and Augmented Reality Concepts

    NASA Astrophysics Data System (ADS)

    Ennakr, Said; Domingues, Christophe; Benchikh, Laredj; Otmane, Samir; Mallem, Malik

    2009-03-01

    A complex system is a system made up of a great number of entities in local and simultaneous interaction. Its design requires the collaboration of engineers of various complementary specialties, so it is necessary to invent new design methods. Indeed, industry currently loses much time between the moment a product model is designed and the moment it is serially produced on factory lines. This production is generally ensured by automated and, more often, robotized means. A delay is thus required to develop the automation and robot operations for a new product model. In this context we launched a study based on the principle of mechatronic design in Augmented Reality-Virtual Reality. This new approach will bring solutions to problems encountered in many application domains, but also to problems caused by the distance separating vehicle design offices from their production sites. It will also minimize the errors between the design model and the real prototype.

  11. Augmented reality three-dimensional display with light field fusion.

    PubMed

    Xie, Songlin; Wang, Peng; Sang, Xinzhu; Li, Chengyu

    2016-05-30

    A video see-through augmented reality three-dimensional display method is presented. The system, which is used for dense-viewpoint augmented reality presentation, fuses the light fields of the real scene and the virtual model naturally. Inherently benefiting from the rich information of the light field, depth sense and occlusion can be handled without any prior depth information about the real scene. A series of processes are proposed to optimize the augmented reality performance. Experimental results show that the reconstructed fused 3D light field on the autostereoscopic display is well presented. The virtual model is naturally integrated into the real scene with consistency between binocular parallax and monocular depth cues. PMID:27410076

  12. E-maintenance Scenarios Based on Augmented Reality Software Architecture

    NASA Astrophysics Data System (ADS)

    Benbelkacem, S.; Zenati-Henda, N.; Belhocine, M.

    2008-06-01

    This paper presents an augmented reality architecture for an e-maintenance application. In our case, the aim is not to develop a vision system based on the augmented reality concept, but to show the relationship between the different actors in the proposed architecture and to facilitate maintenance of the machine. This architecture allows different scenarios to be implemented, giving the technician the possibility to intervene on a broken-down device with the help of a distant expert. Each scenario is established according to machine parameters and the technician's competences. In our case, a hardware platform is designed to carry out e-maintenance scenarios. An example of an e-maintenance scenario is then presented.

  13. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit. PMID:17377223

  14. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks

    ERIC Educational Resources Information Center

    Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco

    2015-01-01

    The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…

  15. Computer-vision-based registration techniques for augmented reality

    NASA Astrophysics Data System (ADS)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world; for example, through the use of a see-through head-mounted display. A human user of such a system could still see and interact with the real world, but have valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify important objects and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head. Accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets placed on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.
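
    As an illustration of the fiducial-based registration step, the sketch below recovers a camera pose from four fiducial marks with known object coordinates using OpenCV's solvePnP. The fiducial coordinates, pixel locations and intrinsics are placeholders, and this is not the original Colorado School of Mines code, only a modern rendering of the underlying computation.

        import numpy as np
        import cv2

        # Known 3D coordinates of four fiducial targets on the work piece (metres).
        object_points = np.array([[0.0, 0.0, 0.0],
                                  [0.2, 0.0, 0.0],
                                  [0.2, 0.15, 0.0],
                                  [0.0, 0.15, 0.0]], dtype=np.float64)

        # Pixel locations of the same fiducials found in the head-mounted camera image.
        image_points = np.array([[320.0, 240.0],
                                 [420.0, 238.0],
                                 [421.0, 310.0],
                                 [318.0, 312.0]], dtype=np.float64)

        # Camera intrinsics (placeholder values; obtained by calibration in practice).
        K = np.array([[800.0, 0.0, 320.0],
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])
        dist = np.zeros(5)

        # Estimate the camera pose relative to the fiducial coordinate frame.
        ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
        if ok:
            R, _ = cv2.Rodrigues(rvec)   # rotation from object frame to camera frame
            print("camera position in object frame:", (-R.T @ tvec).ravel())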

  16. Augmented Reality in Education--Cases, Places and Potentials

    ERIC Educational Resources Information Center

    Bower, Matt; Howe, Cathie; McCredie, Nerida; Robinson, Austin; Grover, David

    2014-01-01

    Augmented Reality is poised to profoundly transform Education as we know it. The capacity to overlay rich media onto the real world for viewing through web-enabled devices such as phones and tablet devices means that information can be made available to students at the exact time and place of need. This has the potential to reduce cognitive…

  17. Current Status, Opportunities and Challenges of Augmented Reality in Education

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Lee, Silvia Wen-Yu; Chang, Hsin-Yi; Liang, Jyh-Chong

    2013-01-01

    Although augmented reality (AR) has gained much research attention in recent years, the term AR was given different meanings by varying researchers. In this article, we first provide an overview of definitions, taxonomies, and technologies of AR. We argue that viewing AR as a concept rather than a type of technology would be more productive for…

  18. Using Augmented Reality Tools to Enhance Children's Library Services

    ERIC Educational Resources Information Center

    Meredith, Tamara R.

    2015-01-01

    Augmented reality (AR) has been used and documented for a variety of commercial and educational purposes, and the proliferation of mobile devices has increased the average person's access to AR systems and tools. However, little research has been done in the area of using AR to supplement traditional library services, specifically for patrons aged…

  19. Augmented Reality Games: Using Technology on a Budget

    ERIC Educational Resources Information Center

    Annetta, Leonard; Burton, Erin Peters; Frazier, Wendy; Cheng, Rebecca; Chmiel, Margaret

    2012-01-01

    As smartphones become more ubiquitous among adolescents, there is increasing potential for these as a tool to engage students in science instruction through innovative learning environments such as augmented reality (AR). Aligned with the National Science Education Standards (NRC 1996) and integrating the three dimensions of "A Framework for K-12…

  20. Learning Physics through Play in an Augmented Reality Environment

    ERIC Educational Resources Information Center

    Enyedy, Noel; Danish, Joshua A.; Delacruz, Girlie; Kumar, Melissa

    2012-01-01

    The Learning Physics through Play Project (LPP) engaged 6-8-year old students (n = 43) in a series of scientific investigations of Newtonian force and motion including a series of augmented reality activities. We outline the two design principles behind the LPP curriculum: 1) the use of socio-dramatic, embodied play in the form of participatory…

  1. Sensors for Location-Based Augmented Reality the Example of Galileo and Egnos

    NASA Astrophysics Data System (ADS)

    Pagani, Alain; Henriques, José; Stricker, Didier

    2016-06-01

    Augmented Reality has long been approached from the point of view of Computer Vision and Image Analysis only. However, many more sensors can be used, in particular for location-based Augmented Reality scenarios. This paper reviews the various sensors that can be used for location-based Augmented Reality. It then presents and discusses several examples of the usage of Galileo and EGNOS in conjunction with Augmented Reality.

  2. Augmented reality based real-time subcutaneous vein imaging system.

    PubMed

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-07-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed. PMID:27446690
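
    A minimal sketch of the stereo step described above: matched vein points from the two near-infrared cameras are triangulated into 3D by linear triangulation under the calibrated epipolar geometry. The projection matrices, baseline and point correspondences below are invented for the example and are not taken from the paper.

        import numpy as np
        import cv2

        # Assumed calibration: identical intrinsics, 6 cm horizontal stereo baseline.
        K = np.array([[700.0, 0.0, 320.0],
                      [0.0, 700.0, 240.0],
                      [0.0, 0.0, 1.0]])
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # left camera at origin
        P2 = K @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])]) # right camera

        # Matched vein centreline points (pixels) in the left and right images (2xN).
        pts_left = np.array([[300.0, 310.0, 322.0],
                             [200.0, 205.0, 212.0]])
        pts_right = np.array([[260.0, 270.0, 281.0],
                              [200.0, 205.0, 212.0]])

        # Linear triangulation of the matched vein points into metric 3D coordinates.
        pts_h = cv2.triangulatePoints(P1, P2, pts_left, pts_right)   # 4xN homogeneous
        veins_3d = (pts_h[:3] / pts_h[3]).T                           # Nx3 points (m)
        print(veins_3d)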

  3. Augmented reality based real-time subcutaneous vein imaging system

    PubMed Central

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-01-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed. PMID:27446690

  4. The Effect of an Augmented Reality Enhanced Mathematics Lesson on Student Achievement and Motivation

    ERIC Educational Resources Information Center

    Estapa, Anne; Nadolny, Larysa

    2015-01-01

    The purpose of the study was to assess student achievement and motivation during a high school augmented reality mathematics activity focused on dimensional analysis. Included in this article is a review of the literature on the use of augmented reality in mathematics and the combination of print with augmented reality, also known as interactive…

  5. An Analysis of Engagement in a Combination Indoor/Outdoor Augmented Reality Educational Game

    ERIC Educational Resources Information Center

    Folkestad, James; O'Shea, Patrick

    2011-01-01

    This paper describes the results of a qualitative analysis of video captured during a dual indoor/outdoor Augmented Reality experience. Augmented Reality is the layering of virtual information on top of the physical world. This Augmented Reality experience asked students to interact with the San Diego Museum of Art and the Botanical Gardens in San…

  6. Use of augmented reality in aircraft maintenance operations

    NASA Astrophysics Data System (ADS)

    De Marchi, L.; Ceruti, A.; Testoni, N.; Marzani, A.; Liverani, A.

    2014-03-01

    This paper illustrates a Human-Machine Interface based on Augmented Reality (AR) conceived to provide maintenance operators with the results of an impact detection methodology. In particular, the implemented tool dynamically interacts with a head-portable visualization device, allowing the inspector to see the estimated impact position on the structure. The impact detection methodology combines the signals collected by a network of piezosensors bonded on the structure to be monitored. A signal processing algorithm is then applied to compensate the acquired guided waves for dispersion. The compensated waveforms yield a robust estimation of the guided waves' difference in distance of propagation (DDOP), used to feed hyperbolic algorithms for impact location determination. The output of the impact methodology is passed to an AR visualization technology that is meant to support the inspector during the on-field inspection/diagnosis as well as the maintenance operations. The inspector, in fact, can interactively see the impact data in real time directly on the surface of the structure. Here the proposed approach is tested on the engine cowling of a Cessna 150 general aviation airplane. Preliminary results confirm the feasibility of the method and its exploitability in maintenance practice.
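
    The hyperbolic localization idea mentioned above can be sketched as a small least-squares problem: given measured differences in distance of propagation (DDOP) between sensor pairs, find the impact position whose predicted differences match them best. The sensor layout, noise level and solver choice below are assumptions for illustration, not the authors' implementation.

        import numpy as np
        from scipy.optimize import least_squares

        # Assumed piezo-sensor coordinates on the panel (metres).
        sensors = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.4], [0.0, 0.4]])
        pairs = [(i, j) for i in range(len(sensors)) for j in range(i + 1, len(sensors))]

        def ddop_residuals(xy, ddop_measured):
            # Residual between predicted and measured DDOP for every sensor pair,
            # given a candidate impact location xy.
            d = np.linalg.norm(sensors - xy, axis=1)
            predicted = np.array([d[i] - d[j] for i, j in pairs])
            return predicted - ddop_measured

        # Synthetic measurement: an impact at (0.35, 0.12) with a little noise added.
        true_xy = np.array([0.35, 0.12])
        d_true = np.linalg.norm(sensors - true_xy, axis=1)
        ddop = np.array([d_true[i] - d_true[j] for i, j in pairs])
        ddop += np.random.default_rng(1).normal(scale=1e-3, size=len(pairs))

        # Hyperbolic localization as a nonlinear least-squares fit of the impact point.
        result = least_squares(ddop_residuals, x0=np.array([0.25, 0.2]), args=(ddop,))
        print("estimated impact position:", result.x)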

  7. Augmented-reality-based object modeling in Internet robotics

    NASA Astrophysics Data System (ADS)

    Dalton, Barney; Friz, Harald; Taylor, Kenneth

    1998-12-01

    This paper introduces an Augmented Reality interface that can be used in supervisory control of a robot over the Internet. The operator's client interface requires only a modest computer to run the Java control applet. Operators can completely specify the remote manipulator's path, using an Augmented Reality stick cursor superimposed on multiple monoscopic images of the workspace. The same cursor is used to identify objects in the workspace, allowing interactive modeling of the workspace environment. Operators place predefined wireframe models of objects on an image of the workspace and use the cursor to align them with corresponding objects in the image.

  8. Spatial augmented reality based high accuracy human face projection

    NASA Astrophysics Data System (ADS)

    Li, Dong; Xie, Jinghui; Li, Yufeng; Weng, Dongdong; Liu, Yue

    2015-08-01

    This paper discusses the imaging principles and the technical difficulties of spatial augmented reality based human face projection. A novel geometry correction method is proposed to realize fast, high-accuracy face model projection. Using a depth camera to reconstruct the projected object, the relative position from the rendered model to the projector can be obtained and the initial projection image is generated. The projected image is then distorted using Bezier interpolation to guarantee that the projected texture matches the object surface. The proposed method follows a simple process flow and can achieve high-quality perceptual registration of virtual and real objects. In addition, this method performs well even when the reconstructed model is not exactly the same as the rendered virtual model, which extends its application area in spatial augmented reality based human face projection.
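
    The geometry-correction idea, warping the rendered image so that the projected texture lands on the target surface, can be illustrated with a bicubic Bezier patch that maps normalized image coordinates to corrected projector coordinates. The control grid below is invented for the example; in the paper the correction is derived from the depth-camera reconstruction.

        import numpy as np

        def bernstein3(t):
            # Cubic Bernstein basis evaluated at t in [0, 1].
            return np.array([(1 - t) ** 3,
                             3 * t * (1 - t) ** 2,
                             3 * t ** 2 * (1 - t),
                             t ** 3])

        def bezier_warp(u, v, control):
            # control: 4x4x2 grid of projector-space control points encoding the
            # geometric correction; (u, v) are normalized image coordinates.
            return np.einsum('i,j,ijk->k', bernstein3(v), bernstein3(u), control)

        # Illustrative control grid: a regular grid with one corner pulled slightly,
        # standing in for a correction fitted against the reconstructed surface.
        gx, gy = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4))
        control = np.stack([gx, gy], axis=-1)
        control[3, 3] += np.array([0.05, -0.03])

        print(bezier_warp(0.9, 0.9, control))   # corrected position for pixel (0.9, 0.9)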

  9. a Generic Augmented Reality Telescope for Heritage Valorization

    NASA Astrophysics Data System (ADS)

    Chendeb, S.; Ridene, T.; Leroy, L.

    2013-08-01

    Heritage valorisation is one of the greatest challenges countries face in preserving their own identity within the globalization process. One of the scientific areas that allows this valorisation to be more attractive and engaging is augmented reality. In this paper, we present an innovative augmented reality telescope used by tourists to explore a panoramic view with an optional zooming facility, thereby allowing accurate access to heritage information. The telescope we produced is generic, ergonomic, extensible, and modular by nature. It is designed to be conveniently set up anywhere in the world. We prove the practical use of our system by testing it right in the heart of Paris within a specific use case.

  10. Stereoscopic augmented reality with pseudo-realistic global illumination effects

    NASA Astrophysics Data System (ADS)

    de Sorbier, Francois; Saito, Hideo

    2014-03-01

    Recently, augmented reality has become very popular and has appeared in our daily life through gaming, guiding systems and mobile phone applications. However, inserting objects in such a way that their appearance seems natural is still an issue, especially in an unknown environment. This paper presents a framework that demonstrates the capabilities of Kinect for convincing augmented reality in an unknown environment. Rather than pre-computing a reconstruction of the scene as proposed by most previous methods, we propose a dynamic capture of the scene that allows adapting to live changes of the environment. Our approach, based on the update of an environment map, can also detect the position of the light sources. Combining information from the environment map, the light sources and the camera tracking, we can display virtual objects using stereoscopic devices with global illumination effects such as diffuse and mirror reflections, refractions and shadows in real time.

  11. Fourier holographic display for augmented reality using holographic optical element

    NASA Astrophysics Data System (ADS)

    Li, Gang; Lee, Dukho; Jeong, Youngmo; Lee, Byoungho

    2016-03-01

    A method for realizing a three-dimensional see-through augmented reality in a Fourier holographic display is proposed. A holographic optical element (HOE) with the function of a Fourier lens is adopted in the system. The Fourier hologram configuration causes the real scene located behind the lens to be distorted. In the proposed method, since the HOE is transparent and functions as a lens only under the Bragg-matched condition, there is no distortion when people observe the real scene through the lens HOE (LHOE). Furthermore, two optical characteristics of the recording material are measured to confirm the feasibility of using the LHOE in the proposed see-through augmented reality holographic display. The results are verified experimentally.

  12. Use of display technologies for augmented reality enhancement

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2016-06-01

    Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book, or taped to a machine, to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the workspace to 3D displays and head-mounted gear. This paper will review where some of these tools are today and some of the pros and cons being considered for the future worker environment.

  13. Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges

    NASA Astrophysics Data System (ADS)

    Cherukuru, N. W.; Calhoun, R.

    2016-06-01

    Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find applications in aviation safety and atmospheric data visualization as well as in weather education and public outreach. As a proof of concept, we used lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.

  14. An augmented reality framework for soft tissue surgery.

    PubMed

    Mountney, Peter; Fallert, Johannes; Nicolau, Stephane; Soler, Luc; Mewes, Philip W

    2014-01-01

    Augmented reality for soft tissue laparoscopic surgery is a growing topic of interest in the medical community and has potential application in intra-operative planning and image guidance. Delivery of such systems to the operating room remains complex with theoretical challenges related to tissue deformation and the practical limitations of imaging equipment. Current research in this area generally only solves part of the registration pipeline or relies on fiducials, manual model alignment or assumes that tissue is static. This paper proposes a novel augmented reality framework for intra-operative planning: the approach co-registers pre-operative CT with stereo laparoscopic images using cone beam CT and fluoroscopy as bridging modalities. It does not require fiducials or manual alignment and compensates for tissue deformation from insufflation and respiration while allowing the laparoscope to be navigated. The paper's theoretical and practical contributions are validated using simulated, phantom, ex vivo, in vivo and non medical data. PMID:25333146

  15. Augmented reality assisted surgery: a urologic training tool.

    PubMed

    Dickey, Ryan M; Srikishen, Neel; Lipshultz, Larry I; Spiess, Philippe E; Carrion, Rafael E; Hakky, Tariq S

    2016-01-01

    Augmented reality is widely used in aeronautics and is a developing concept within surgery. In this pilot study, we developed an application for use on Google Glass® optical head-mounted display to train urology residents in how to place an inflatable penile prosthesis. We use the phrase Augmented Reality Assisted Surgery to describe this novel application of augmented reality in the setting of surgery. The application demonstrates the steps of the surgical procedure of inflatable penile prosthesis placement. It also contains software that allows for detection of interest points using a camera feed from the optical head-mounted display to enable faculty to interact with residents during placement of the penile prosthesis. Urology trainees and faculty who volunteered to take part in the study were given time to experience the technology in the operative or perioperative setting and asked to complete a feedback survey. From 30 total participants using a 10-point scale, educational usefulness was rated 8.6, ease of navigation was rated 7.6, likelihood to use was rated 7.4, and distraction in operating room was rated 4.9. When stratified between trainees and faculty, trainees found the technology more educationally useful, and less distracting. Overall, 81% of the participants want this technology in their residency program, and 93% see this technology in the operating room in the future. Further development of this technology is warranted before full release, and further studies are necessary to better characterize the effectiveness of Augmented Reality Assisted Surgery in urologic surgical training. PMID:26620455

  16. Augmented reality assisted surgery: a urologic training tool

    PubMed Central

    Dickey, Ryan M; Srikishen, Neel; Lipshultz, Larry I; Spiess, Philippe E; Carrion, Rafael E; Hakky, Tariq S

    2016-01-01

    Augmented reality is widely used in aeronautics and is a developing concept within surgery. In this pilot study, we developed an application for use on Google Glass® optical head-mounted display to train urology residents in how to place an inflatable penile prosthesis. We use the phrase Augmented Reality Assisted Surgery to describe this novel application of augmented reality in the setting of surgery. The application demonstrates the steps of the surgical procedure of inflatable penile prosthesis placement. It also contains software that allows for detection of interest points using a camera feed from the optical head-mounted display to enable faculty to interact with residents during placement of the penile prosthesis. Urology trainees and faculty who volunteered to take part in the study were given time to experience the technology in the operative or perioperative setting and asked to complete a feedback survey. From 30 total participants using a 10-point scale, educational usefulness was rated 8.6, ease of navigation was rated 7.6, likelihood to use was rated 7.4, and distraction in operating room was rated 4.9. When stratified between trainees and faculty, trainees found the technology more educationally useful, and less distracting. Overall, 81% of the participants want this technology in their residency program, and 93% see this technology in the operating room in the future. Further development of this technology is warranted before full release, and further studies are necessary to better characterize the effectiveness of Augmented Reality Assisted Surgery in urologic surgical training. PMID:26620455

  17. Flexible augmented reality architecture applied to environmental management

    NASA Astrophysics Data System (ADS)

    Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo

    2003-05-01

    Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real-time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent, modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacities of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and the multimedia geo-referenced database.

  18. Contextualized Interdisciplinary Learning in Mainstream Schools Using Augmented Reality-Based Technology: A Dream or Reality?

    ERIC Educational Resources Information Center

    Ong, Alex

    2010-01-01

    The use of augmented reality (AR) tools, where virtual objects such as tables and graphs can be displayed and be interacted with in real scenes created from imaging devices, in mainstream school curriculum is uncommon, as they are potentially costly and sometimes bulky. Thus, such learning tools are mainly applied in tertiary institutions, such as…

  19. Augmented Reality Environments in Learning, Communicational and Professional Contexts in Higher Education

    ERIC Educational Resources Information Center

    Martín Gutiérrez, Jorge; Meneses Fernández, María Dolores

    2014-01-01

    This paper explores educational and professional uses of augmented learning environment concerned with issues of training and entertainment. We analyze the state-of-art research of some scenarios based on augmented reality. Some examples for the purpose of education and simulation are described. These applications show that augmented reality can…

  20. Fast Markerless Tracking for Augmented Reality in Planar Environment

    NASA Astrophysics Data System (ADS)

    Basori, Ahmad Hoirul; Afif, Fadhil Noer; Almazyad, Abdulaziz S.; AbuJabal, Hamza Ali S.; Rehman, Amjad; Alkawaz, Mohammed Hazim

    2015-12-01

    Markerless tracking for augmented reality should not only be accurate but also fast enough to provide seamless synchronization between real and virtual beings. Currently reported methods show that vision-based tracking is accurate but requires high computational power. This paper proposes a real-time hybrid method for tracking unknown environments in markerless augmented reality. The proposed method combines a vision-based approach with accelerometer and gyroscope sensors acting as a camera pose predictor. To align the augmentation relative to camera motion, tracking is performed by substituting feature-based camera estimation with a combination of inertial sensors and a complementary filter to provide a more dynamic response. The proposed method managed to track unknown environments with faster processing time compared to available feature-based approaches. Moreover, the proposed method can sustain its estimation in situations where feature-based tracking loses its track. The collaborative sensor tracking performed the task at about 22.97 FPS, up to five times faster than the feature-based tracking method used for comparison. Therefore, the proposed method can be used to track unknown environments without depending on the number of features in the scene, while requiring lower computational cost.
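
    A minimal sketch of the sensor-fusion idea the abstract relies on: a complementary filter that trusts the integrated gyroscope rate at short time scales and the accelerometer-derived absolute angle at long time scales. The gain, rates and noise levels below are assumptions for illustration, not values from the paper.

        import numpy as np

        def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
            # High-pass the integrated gyro (good short-term accuracy) and low-pass
            # the accelerometer angle (corrects long-term drift).
            return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

        # Illustrative run: camera tilting at 0.2 rad/s, biased gyro, noisy accelerometer.
        rng = np.random.default_rng(0)
        dt, est, true_angle = 0.01, 0.0, 0.0
        for _ in range(500):                                   # 5 seconds at 100 Hz
            true_angle += 0.2 * dt
            gyro = 0.2 + 0.01                                  # gyro reading with constant bias
            accel_angle = true_angle + rng.normal(scale=0.05)  # noisy absolute tilt
            est = complementary_filter(est, gyro, accel_angle, dt)
        print("true %.3f rad, fused estimate %.3f rad" % (true_angle, est))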

  1. Live texturing of augmented reality characters from colored drawings.

    PubMed

    Magnenat, Stéphane; Ngo, Dat Tien; Zünd, Fabio; Ryffel, Mattia; Noris, Gioacchino; Rothlin, Gerhard; Marra, Alessia; Nitti, Maurizio; Fua, Pascal; Gross, Markus; Sumner, Robert W

    2015-11-01

    Coloring books capture the imagination of children and provide them with one of their earliest opportunities for creative expression. However, given the proliferation and popularity of digital devices, real-world activities like coloring can seem unexciting, and children become less engaged in them. Augmented reality holds unique potential to impact this situation by providing a bridge between real-world activities and digital enhancements. In this paper, we present an augmented reality coloring book App in which children color characters in a printed coloring book and inspect their work using a mobile device. The drawing is detected and tracked, and the video stream is augmented with an animated 3-D version of the character that is textured according to the child's coloring. This is possible thanks to several novel technical contributions. We present a texturing process that applies the captured texture from a 2-D colored drawing to both the visible and occluded regions of a 3-D character in real time. We develop a deformable surface tracking method designed for colored drawings that uses a new outlier rejection algorithm for real-time tracking and surface deformation recovery. We present a content creation pipeline to efficiently create the 2-D and 3-D content. And, finally, we validate our work with two user studies that examine the quality of our texturing algorithm and the overall App experience. PMID:26340776

  2. FlyAR: augmented reality supported micro aerial vehicle navigation.

    PubMed

    Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard

    2014-04-01

    Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context automatic flight path planning and autonomous flying is often applied but so far cannot fully replace the human in the loop, supervising the flight on-site to assure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle’s position between 2D map positions and the physical environment, and the complicated depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user’s view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints supporting the user in understanding the spatial relationship of virtual waypoints in the physical world and investigate the effect of these visualization techniques on the spatial understanding. PMID:24650983

  3. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  4. Augmented reality technology and application in aerodrome building

    NASA Astrophysics Data System (ADS)

    Guo, Yan; Du, Qingyun

    2007-06-01

    This paper discusses the function and meaning of AR in an aerodrome construction project. In the initial stages of aerodrome building, it applies advanced technology including 3S (RS, GIS and GPS) techniques, augmented reality, virtual reality and digital image processing. Virtual imagery or other information created by the computer is superimposed onto the surveying district at which the observer is looking. As the observer moves through the district, the virtual information changes correspondingly, as if it really existed in the real environment. The observer can see the scene of the aerodrome by putting on a see-through HMD (head-mounted display). If structural information about the aerodrome is available in a database, AR can supply X-ray-like views of the building, such as pipelines, wiring and framework inside walls.

  5. Registration correction in hybrid tracking for augmented reality

    NASA Astrophysics Data System (ADS)

    Li, Xinyu; Chen, Dongyi

    2007-12-01

    One of the most important problems faced when building Augmented Reality (AR) systems is registration error. In mobile and outdoor AR systems, dynamic errors are the largest source of registration error. Hybrid tracking is a key technique for combating the effects of dynamic errors. In this paper we present a multi-stage closed-loop correction algorithm for reducing dynamic registration errors in AR systems. The control law in this algorithm is based on 2D visual servoing, or image-based camera control, which steers a virtual camera to reduce registration error. The experimental results illustrate the effectiveness of this algorithm.
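
    The abstract refers to a control law based on 2D visual servoing. As a hedged illustration only, the sketch below implements the textbook image-based visual servoing law v = -λ L⁺ (s - s*) with the standard point-feature interaction matrix; it is not the paper's multi-stage closed-loop algorithm, and the feature positions and depths are invented.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Classical 2-D point-feature interaction (image Jacobian) matrix."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,        -(1.0 + x * x), y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y * y,  -x * y,         -x],
    ])

def ibvs_velocity(features, desired, depths, lam=0.5):
    """Camera velocity screw (vx, vy, vz, wx, wy, wz) that drives the observed
    image features toward their desired positions."""
    L = np.vstack([interaction_matrix(x, y, Z) for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    return -lam * np.linalg.pinv(L) @ error

# Hypothetical example: two tracked points slightly off their desired positions
v = ibvs_velocity(features=[(0.12, 0.05), (-0.08, 0.10)],
                  desired=[(0.10, 0.05), (-0.10, 0.10)],
                  depths=[2.0, 2.5])
print(v)
```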

  6. Context-aware Augmented Reality in laparoscopic surgery.

    PubMed

    Katić, Darko; Wekerle, Anna-Laura; Görtler, Jochen; Spengler, Patrick; Bodenstedt, Sebastian; Röhl, Sebastian; Suwelack, Stefan; Kenngott, Hannes Götz; Wagner, Martin; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Augmented Reality is a promising paradigm for intraoperative assistance. Yet, apart from technical issues, a major obstacle to its clinical application is the man-machine interaction. Visualization of unnecessary, obsolete or redundant information may cause confusion and distraction, reducing usefulness and acceptance of the assistance system. We propose a system capable of automatically filtering available information based on recognized phases in the operating room. Our system offers a specific selection of available visualizations which suit the surgeon's needs best. The system was implemented for use in laparoscopic liver and gallbladder surgery and evaluated in phantom experiments in conjunction with expert interviews. PMID:23541864

  7. Context-Aware Based Efficient Training System Using Augmented Reality and Gravity Sensor for Healthcare Services

    NASA Astrophysics Data System (ADS)

    Kim, Seoksoo; Jung, Sungmo; Song, Jae-Gu; Kang, Byong-Ho

    As augmented reality and gravity sensors attract growing interest, significant development is being made in related technology, allowing its application in a variety of areas with greater expectations. Applying context-awareness to augmented reality makes it possible to build useful programs. The training system suggested in this study helps a user understand an efficient training method using augmented reality and verify whether an exercise is being performed properly, based on the data collected by a gravity sensor. Therefore, this research aims to suggest an efficient training environment that enhances previous training methods by applying augmented reality and a gravity sensor.

  8. On Location Learning: Authentic Applied Science with Networked Augmented Realities

    NASA Astrophysics Data System (ADS)

    Rosenbaum, Eric; Klopfer, Eric; Perry, Judy

    2007-02-01

    The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is played across a university campus where players take on the roles of doctors, medical technicians, and public health experts to contain a disease outbreak. Players can interact with virtual characters and employ virtual diagnostic tests and medicines. They are challenged to identify the source and prevent the spread of an infectious disease that can spread among real and/or virtual characters according to an underlying model. In this paper, we report on data from three high school classes who played the game. We investigate students' perception of the authenticity of the game in terms of their personal embodiment in the game, their experience playing different roles, and their understanding of the dynamic model underlying the game.

  9. Augmented reality needle guidance improves facet joint injection training

    NASA Astrophysics Data System (ADS)

    Ungi, Tamas; Yeo, Caitlin T.; U-Thainual, Paweena; McGraw, Robert C.; Fichtinger, Gabor

    2011-03-01

    PURPOSE: The purpose of this study was to determine if medical trainees would benefit from augmented reality image overlay and laser guidance in learning how to set the correct orientation of a needle for percutaneous facet joint injection. METHODS: A total of 28 medical students were randomized into two groups: (1) The Overlay group received a training session of four insertions with image and laser guidance followed by two insertions with laser overlay only; (2) The Control group was trained by carrying out six freehand insertions. After the training session, needle trajectories of two facet joint injections without any guidance were recorded by an electromagnetic tracker and were analyzed. Number of successful needle placements, distance covered by needle tip inside the phantom and procedural time were measured to evaluate performance. RESULTS: Number of successful placements was significantly higher in the Overlay group compared to the Control group (85.7% vs. 57.1%, p = 0.038). Procedure time and distance covered inside phantom have both been found to be less in the Overlay group, although not significantly. CONCLUSION: Training with augmented reality image overlay and laser guidance improves the accuracy of facet joint injections in medical students learning image-guided facet joint needle placement.

  10. Computer-aided liver surgery planning: an augmented reality approach

    NASA Astrophysics Data System (ADS)

    Bornik, Alexander; Beichel, Reinhard; Reitinger, Bernhard; Gotschuli, Georg; Sorantin, Erich; Leberl, Franz W.; Sonka, Milan

    2003-05-01

    Surgical resection of liver tumors requires a detailed three-dimensional understanding of a complex arrangement of vasculature, liver segments and tumors inside the liver. In most cases, surgeons need to develop this understanding by looking at sequences of axial images from modalities like X-ray computed tomography. A system for liver surgery planning is reported that enables physicians to visualize and refine segmented input liver data sets, as well as to simulate and evaluate different resection plans. The system supports surgeons in finding the optimal treatment strategy for each patient and eases the data preparation process. The use of augmented reality contributes to a user-friendly design and simplifies complex interaction with 3D objects. The main function blocks developed so far are: a basic augmented reality environment, user interface, rendering, surface reconstruction from segmented volume data sets, surface manipulation and a quantitative measurement toolkit. The flexible design allows functionality to be added via plug-ins. First practical evaluation steps have shown good acceptance. Evaluation of the system is ongoing, and future feedback from surgeons will be collected and used for design refinements.

  11. archAR: an archaeological augmented reality experience

    NASA Astrophysics Data System (ADS)

    Wiley, Bridgette; Schulze, Jürgen P.

    2015-03-01

    We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.

  12. Verification of Image Based Augmented Reality for Urban Visualization

    NASA Astrophysics Data System (ADS)

    Fuse, T.; Nishikawa, S.; Fukunishi, Y.

    2012-07-01

    Recently, visualization of urban scenes with various kinds of information has attracted attention. Virtual reality has been widely used to convey urban scenes, but because it requires comprehensive and detailed three-dimensional models, manual modelling takes a great deal of time and effort. An alternative is to superimpose various data on the scene the user is currently viewing instead of building comprehensive models, an approach known as augmented reality (AR). Simultaneous localization and mapping (SLAM) with simple video cameras has been attempted for AR. This method estimates the exterior orientation parameters of the camera and the three-dimensional reconstruction of feature points simultaneously, but it has so far been applied only to small indoor spaces. This paper investigates the applicability of a popular SLAM method to wide outdoor spaces and improves the stability of the method. In the outdoor application, the number of successfully tracked feature points is greatly reduced compared with indoor environments. Based on the experimental results, simple markers and GPS are introduced as auxiliary information: the markers improve the stability of the optimization, and GPS gives real scale to the AR space. Additionally, the feature point tracking method is modified by weighting the amplitude of displacement and depth. The effects of the markers and GPS are confirmed, and some limitations of the method are identified. As a result, more impressive visualization can be accomplished.

  13. Engineering applications of virtual reality

    NASA Astrophysics Data System (ADS)

    Smith, James R.; Grimes, Robert V.; Plant, Tony A.

    1996-04-01

    This paper addresses some of the practical applications, advantages and difficulties associated with engineering applications of virtual reality. The paper tracks actual investigative work in progress on this subject at the BNR research lab in RTP, NC. This work attempts to demonstrate the actual value added to the engineering process by using existing 3D CAD data for interactive information navigation and evaluation of design concepts and products. Specifically, the work includes translation of Parametric Technology's Pro/ENGINEER models into a virtual world to evaluate potential attributes such as multiple concept exploration and product installation assessment. Other work discussed in this paper includes extensive evaluation of two new tools, VRML and SGI's/Template Graphics' WebSpace, for navigation through Pro/ENGINEER models with links to supporting technical documentation and data. The benefits of using these tools for 3D interactive navigation and exploration throughout three key phases of the physical design process are discussed in depth. The three phases are Design Concept Development, Product Design Evaluation and Product Design Networking. The predicted values added include reduced time to `concept ready', reduced prototype iterations, increased `design readiness' and shorter manufacturing introduction cycles.

  14. Utilization of the Space Vision System as an Augmented Reality System For Mission Operations

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles

    2003-01-01

    Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to

  15. A Mobile Service Oriented Multiple Object Tracking Augmented Reality Architecture for Education and Learning Experiences

    ERIC Educational Resources Information Center

    Rattanarungrot, Sasithorn; White, Martin; Newbury, Paul

    2014-01-01

    This paper describes the design of our service-oriented architecture to support mobile multiple object tracking augmented reality applications applied to education and learning scenarios. The architecture is composed of a mobile multiple object tracking augmented reality client, a web service framework, and dynamic content providers. Tracking of…

  16. Augmenting a Child's Reality: Using Educational Tablet Technology

    ERIC Educational Resources Information Center

    Tanner, Patricia; Karas, Carly; Schofield, Damian

    2014-01-01

    This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…

  17. Augmented Reality Trends in Education: A Systematic Review of Research and Applications

    ERIC Educational Resources Information Center

    Bacca, Jorge; Baldiris, Silvia; Fabregat, Ramon; Graf, Sabine; Kinshuk

    2014-01-01

    In recent years, there has been an increasing interest in applying Augmented Reality (AR) to create unique educational settings. So far, however, there is a lack of review studies with focus on investigating factors such as: the uses, advantages, limitations, effectiveness, challenges and features of augmented reality in educational settings.…

  18. Mobile Augmented Reality in Supporting Peer Assessment: An Implementation in a Fundamental Design Course

    ERIC Educational Resources Information Center

    Lan, Chung-Hsien; Chao, Stefan; Kinshuk; Chao, Kuo-Hung

    2013-01-01

    This study presents a conceptual framework for supporting mobile peer assessment by incorporating augmented reality technology to eliminate limitations in reviewing and assessing. According to the characteristics of mobile technology and augmented reality, students' work can be shown in various ways depending on location and situation. This…

  19. Social Augmented Reality: Enhancing Context-Dependent Communication and Informal Learning at Work

    ERIC Educational Resources Information Center

    Pejoska, Jana; Bauters, Merja; Purma, Jukka; Leinonen, Teemu

    2016-01-01

    Our design proposal of social augmented reality (SoAR) grows from the observed difficulties of practical applications of augmented reality (AR) in workplace learning. In our research we investigated construction workers doing physical work in the field and analyzed the data using qualitative methods in various workshops. The challenges related to…

  20. Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders

    PubMed Central

    Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe

    2015-01-01

    Augmented Reality is a new technological system that introduces virtual content into the real world within the same representation and, in real time, enhances the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool, due to its adaptability to patient needs and therapeutic purposes and its interactivity. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that delineate Augmented Reality as a new technique useful for psychology. PMID:26339283

  2. An augmented reality assistance platform for eye laser surgery.

    PubMed

    Ee Ping Ong; Lee, Jimmy Addison; Jun Cheng; Beng Hai Lee; Guozhen Xu; Laude, Augustinus; Teoh, Stephen; Tock Han Lim; Wong, Damon W K; Jiang Liu

    2015-08-01

    This paper presents a novel augmented reality assistance platform for eye laser surgery. The aims of the proposed system are to assist eye doctors in pre-surgical planning and to provide guidance and protection during laser surgery. We developed algorithms to automatically register multi-modal images, detect macula and optic disc regions, and demarcate these as areas protected from laser surgery. The doctor is then able to plan the laser treatment before surgery using the registered images and segmented regions. Thereafter, during live surgery, the system automatically registers and tracks the slit lamp video frames on the registered retina images, issues an appropriate warning when the laser is near protected areas, and disables the laser function when it points into the protected areas. The proposed system prototype can help doctors speed up laser surgery with confidence, without fear of unintentionally firing the laser into the protected areas. PMID:26737252
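
    The protection logic can be sketched as a simple geometric interlock: the laser is disabled whenever its tracked aim point falls inside, or within a margin of, a protected region. This is only a schematic of the idea under the assumption that the macula and optic disc are approximated by circles in registered-image coordinates; the class, function, and values below are hypothetical, not the authors' implementation.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class ProtectedRegion:
    cx: float       # region centre in registered-image coordinates (pixels)
    cy: float
    radius: float

def laser_allowed(aim_x, aim_y, regions, safety_margin=10.0):
    """Return False when the tracked laser aim point falls inside (or within a
    safety margin of) any protected region, e.g. the macula or optic disc."""
    for r in regions:
        if hypot(aim_x - r.cx, aim_y - r.cy) <= r.radius + safety_margin:
            return False
    return True

# Hypothetical regions segmented from the registered retina image
regions = [ProtectedRegion(512, 384, 60),   # macula
           ProtectedRegion(700, 380, 55)]   # optic disc
print(laser_allowed(300, 200, regions))     # True  -> laser enabled
print(laser_allowed(520, 390, regions))     # False -> laser disabled
```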

  3. Real-time tomographic holography for augmented reality

    PubMed Central

    Galeotti, John M.; Siegel, Mel; Stetten, George

    2011-01-01

    The concept and instantiation of Real-Time Tomographic Holography (RTTH) for augmented reality is presented. RTTH enables natural hand-eye coordination to guide invasive medical procedures without requiring tracking or a head-mounted device. It places a real-time virtual image of an object's cross-section into its actual location, without noticeable viewpoint dependence (e.g. parallax error). The virtual image is viewed through a flat narrow-band Holographic Optical Element with optical power that generates an in-situ virtual image (within 1 m of the HOE) from a small SLM display without obscuring a direct view of the physical world. Rigidly fixed upon a medical ultrasound probe, an RTTH device could show the scan in its actual location inside the patient, even as the probe was moved relative to the patient. PMID:20634827

  4. Computer vision and augmented reality in gastrointestinal endoscopy.

    PubMed

    Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M

    2015-08-01

    Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy-which relies on the integration of high-definition video data with pathologic correlates-requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175

  5. Toward Standard Usability Questionnaires for Handheld Augmented Reality.

    PubMed

    Santos, Marc Ericson C; Polvi, Jarkko; Taketomi, Takafumi; Yamamoto, Goshiro; Sandor, Christian; Kato, Hirokazu

    2015-01-01

    Usability evaluations are important to improving handheld augmented reality (HAR) systems. However, no standard questionnaire considers the perceptual and ergonomic issues found in HAR. The authors performed a systematic literature review to enumerate these issues. Based on these issues, they created a HAR usability scale that consists of comprehensibility and manipulability scales. These scales measure general system usability, ease of understanding the information presented, and ease of handling the device. The questionnaires' validity and reliability were evaluated in four experiments, and the results show that the questionnaires consistently correlate with other subjective and objective measures of usability. The questionnaires also have good reliability based on Cronbach's alpha. Researchers and professionals can directly use these questionnaires to evaluate their own HAR applications or modify them with the insights presented in this article. PMID:26416363
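
    For reference, the reliability statistic cited above can be computed directly from questionnaire responses. The sketch below implements the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); the response matrix is invented for illustration and is unrelated to the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of questionnaire scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items
responses = [[4, 5, 4],
             [3, 4, 3],
             [5, 5, 4],
             [2, 3, 2]]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```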

  6. Multithreaded hybrid feature tracking for markerless augmented reality.

    PubMed

    Lee, Taehee; Höllerer, Tobias

    2009-01-01

    We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction. PMID:19282544
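
    A minimal version of the optical-flow stage can be put together with OpenCV, as sketched below: detect corners once, then track them frame to frame with pyramidal Lucas-Kanade flow. The paper's multithreaded pipeline, invariant-feature detection, and hand-based interaction are omitted, and the camera index and parameter values are assumptions.

```python
import cv2

cap = cv2.VideoCapture(0)                      # any camera or video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300, qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok or points is None or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pyramidal Lucas-Kanade optical flow: track last frame's points into this frame
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    points = new_points[status.flatten() == 1].reshape(-1, 1, 2)
    prev_gray = gray
    for x, y in points.reshape(-1, 2):
        cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)
    cv2.imshow("tracked features", frame)
    if cv2.waitKey(1) == 27:                   # Esc quits
        break
```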

  7. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobile devices still takes a long time and provides lower accuracy than inspection by humans using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s, with better decisions on the evaluated samples than untrained users can achieve. PMID:26561461

  8. Simulating low-cost cameras for augmented reality compositing.

    PubMed

    Klein, Georg; Murray, David W

    2010-01-01

    Video see-through Augmented Reality adds computer graphics to the real world in real time by overlaying graphics onto a live video feed. To achieve a realistic integration of the virtual and real imagery, the rendered images should have a similar appearance and quality to those produced by the video camera. This paper describes a compositing method which models the artifacts produced by a small low-cost camera, and adds these effects to an ideal pinhole image produced by conventional rendering methods. We attempt to model and simulate each step of the imaging process, including distortions, chromatic aberrations, blur, Bayer masking, noise, sharpening, and color-space compression, all while requiring only an RGBA image and an estimate of camera velocity as inputs. PMID:20224133
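
    A toy version of this degradation pipeline is sketched below using only NumPy: a crude lateral chromatic aberration, a few box-blur passes, and additive sensor noise. It covers only a subset of the artifacts the paper models (no distortion, Bayer masking, sharpening, or color-space compression), and all parameter values are invented.

```python
import numpy as np

def simulate_cheap_camera(ideal_rgb, blur_passes=3, noise_std=6.0, ca_shift=1):
    """Degrade an ideal pinhole rendering with a few low-cost-camera artifacts."""
    img = ideal_rgb.astype(np.float32)

    # Lateral chromatic aberration: shift the red and blue channels apart slightly
    img[..., 0] = np.roll(img[..., 0], ca_shift, axis=1)
    img[..., 2] = np.roll(img[..., 2], -ca_shift, axis=1)

    # Approximate blur via repeated box filtering along both image axes
    for _ in range(blur_passes):
        img = (np.roll(img, 1, axis=0) + img + np.roll(img, -1, axis=0)) / 3.0
        img = (np.roll(img, 1, axis=1) + img + np.roll(img, -1, axis=1)) / 3.0

    # Additive sensor noise
    img += np.random.normal(0.0, noise_std, img.shape)
    return np.clip(img, 0, 255).astype(np.uint8)

rendered = np.full((120, 160, 3), 200, dtype=np.uint8)   # stand-in for a rendered frame
degraded = simulate_cheap_camera(rendered)
print(degraded.shape, degraded.dtype)
```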

  10. Tangible Interaction in Learning Astronomy through Augmented Reality Book-Based Educational Tool

    NASA Astrophysics Data System (ADS)

    Sin, Aw Kien; Badioze Zaman, Halimah

    Live Solar System (LSS) is an Augmented Reality book-based educational tool. Augmented Reality (AR) has its own potential in the education field, because it can provide seamless interaction between real and virtual objects. LSS applies the Tangible Augmented Reality approach in designing its user interface and interaction. Tangible Augmented Reality is an interface that combines the Tangible User Interface and the Augmented Reality Interface; the two naturally complement each other. This paper highlights the tangible interaction in LSS. LSS adopts the 'cube' as the common physical input device, so it does not use traditional computer input devices such as the mouse or keyboard. To give users a better exploration experience, the Visual Information Seeking Mantra principle was applied in the design of LSS. Hence, LSS gives users an effective, interactive and intuitive horizontal-surface learning environment.

  11. Training for planning tumour resection: augmented reality and human factors.

    PubMed

    Abhari, Kamyar; Baxter, John S H; Chen, Elvis C S; Khan, Ali R; Peters, Terry M; de Ribaupierre, Sandrine; Eagleson, Roy

    2015-06-01

    Planning surgical interventions is a complex task, demanding a high degree of perceptual, cognitive, and sensorimotor skills to reduce intra- and post-operative complications. This process requires spatial reasoning to coordinate between the preoperatively acquired medical images and patient reference frames. In the case of neurosurgical interventions, traditional approaches to planning tend to focus on providing a means for visualizing medical images, but rarely support transformation between different spatial reference frames. Thus, surgeons often rely on their previous experience and intuition as their sole guide when performing mental transformations. In the case of junior residents, this may lead to longer operation times or an increased chance of error under additional cognitive demands. In this paper, we introduce a mixed augmented-/virtual-reality system to facilitate training for planning a common neurosurgical procedure, brain tumour resection. The proposed system is designed and evaluated with human factors explicitly in mind, alleviating the difficulty of mental transformation. Our results indicate that, compared to conventional planning environments, the proposed system greatly improves nonclinicians' performance, independent of the sensorimotor task performed. Furthermore, the use of the proposed system by clinicians resulted in a significant reduction in the time to perform clinically relevant tasks. These results demonstrate the role of mixed-reality systems in assisting residents to develop the spatial reasoning skills needed for planning brain tumour resection, improving patient outcomes. PMID:25546854

  12. Invisible waves and hidden realms: augmented reality and experimental art

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia

    2012-03-01

    Augmented reality is a way of both altering the visible and revealing the invisible. It offers new opportunities for artistic exploration through virtual interventions in real space. In this paper, the author describes the implementation of two art installations using different AR technologies, one using optical marker tracking on mobile devices and one integrating stereoscopic projections into the physical environment. The first artwork, De Ondas y Abejas (The Waves and the Bees), is based on the widely publicized (but unproven) hypothesis of a link between cellphone radiation and the phenomenon of bee colony collapse disorder. Using an Android tablet, viewers search out small fiducial markers in the shape of electromagnetic waves hidden throughout the gallery, which reveal swarms of bees scattered on the floor. The piece also creates a generative soundscape based on electromagnetic fields. The second artwork, Urban Fauna, is a series of animations in which features of the urban landscape become plants and animals. Surveillance cameras become flocks of birds, while miniature cellphone towers, lampposts, and telephone poles grow like small seedlings in time-lapse animation. The animations are presented as small stereoscopic projections integrated into the physical space of the gallery. These two pieces explore the relationship between nature and technology through the visualization of invisible forces and hidden alternate realities.

  13. Applied Operations Research: Augmented Reality in an Industrial Environment

    NASA Technical Reports Server (NTRS)

    Cole, Stuart K.

    2015-01-01

    Augmented reality is the application of computer-generated data or graphics onto a real-world view. Its use provides the operator with additional information or heightened situational awareness. While advancements have been made in the automation and diagnostics of high value critical equipment (HVCE) to improve readiness, reliability and maintenance, the need to assist and support Operations and Maintenance staff persists. AR can improve the human-machine interface, where computer capabilities maximize the human experience and analysis capabilities. NASA operates multiple facilities with complex ground-based HVCE in support of national aeronautics and space exploration, and the need exists to improve operational support and close a gap in capability sustainment as key, experienced staff rotate work assignments and reach the end of their terms of service. Initiating an AR capability to augment and improve human abilities and training experience in the industrial environment requires planning and the establishment of a goal and objectives for the systems and specific applications. This paper explores the use of AR to support Operations staff in the real-time operation of HVCE and its maintenance. The results include identification of a specific goal and objectives, as well as challenges related to availability and computer system infrastructure.

  14. 3D Tracking Based Augmented Reality for Cultural Heritage Data Management

    NASA Astrophysics Data System (ADS)

    Battini, C.; Landi, G.

    2015-02-01

    The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations; augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation, its conservation and during the restoration processes. The application of digital-imaging solutions for various feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks, allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving and complex three-dimensional models can be interactively visualised and explored on applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that will allow interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.

  15. Augmented reality and stereo vision for remote scene characterization

    NASA Astrophysics Data System (ADS)

    Lawson, Shaun W.; Pretlove, John R. G.

    1999-11-01

    In this paper we present our progress in the research and development of an augmented reality (AR) system for the remote inspection of hazardous environments. It specifically addresses one particular application with which we are involved: improving the inspection of underground sewer pipes using robotic vehicles and 3D graphical overlays coupled with stereoscopic visual data. Traditional sewer inspection using a human operator and CCTV systems is a mature technology, though the task itself is difficult, subjective and prone to error. The work described here proposes not to replace the expert human inspector, but to enhance and increase the information that is available to him and to augment that information with other previously stored data. We describe our current system components, which comprise a robotic stereo head device, a simulated sewer crawling vehicle and our AR system. We then discuss the lengthy calibration procedures that are necessary to align any graphical overlay information with live video data. Some experiments in determining alignment errors under head motion and some investigations into the use of a calibrated virtual cursor are then described.

  16. Augmented reality to enhance an active telepresence system

    NASA Astrophysics Data System (ADS)

    Wheeler, Alison; Pretlove, John R. G.; Parker, Graham A.

    1996-12-01

    Tasks carried out remotely via a telerobotic system are typically complex, occur in hazardous environments and require fine control of the robot's movements. Telepresence systems provide the teleoperator with a feeling of being physically present at the remote site. Stereoscopic video has been successfully applied to telepresence vision systems to increase the operator's perception of depth in the remote scene and this sense of presence can be further enhanced using computer generated stereo graphics to augment the visual information presented to the operator. The Mechatronic Systems and Robotics Research Group have over seven years developed a number of high performance active stereo vision systems culminating in the latest, a four degree-of-freedom stereohead. This carries two miniature color cameras and is controlled in real time by the motion of the operator's head, who views the stereoscopic video images on an immersive head mounted display or stereo monitor. The stereohead is mounted on a mobile robot, the movement of which is controlled by a joystick interface. This paper describes the active telepresence system and the development of a prototype augmented reality (AR) application to enhance the operator's sense of presence at the remote site. The initial enhancements are a virtual map and compass to aid navigation in degraded visual conditions and a virtual cursor that provides a means for the operator to interact with the remote environment. The results of preliminary experiments using the initial enhancements are presented.

  17. Registration using natural features for augmented reality systems.

    PubMed

    Yuan, M L; Ong, S K; Nee, A Y C

    2006-01-01

    Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix will be used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration for either indoor and outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method to estimate the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process. Some indoor and outdoor experiments have
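
    For a planar scene, the idea of transferring a few user-specified points via geometry estimated from tracked natural features can be approximated with a RANSAC-fitted homography, as in the sketch below. This is a simplified stand-in for the paper's projective-reconstruction and Levenberg-Marquardt refinement, and the synthetic matches and anchor points are fabricated for illustration.

```python
import cv2
import numpy as np

def transfer_registration_points(pts_ref, pts_cur, anchors_ref):
    """Transfer user-specified registration points from a reference frame to the
    current frame using a homography fitted to tracked natural-feature matches.

    pts_ref, pts_cur : (N, 2) matched feature positions in the two frames
    anchors_ref      : (4, 2) the four user-specified points in the reference frame
    """
    H, _inliers = cv2.findHomography(pts_ref, pts_cur, cv2.RANSAC, 3.0)
    anchors = anchors_ref.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(anchors, H).reshape(-1, 2)

# Hypothetical data: feature matches generated by a synthetic planar motion
rng = np.random.default_rng(0)
ref = rng.uniform(0, 640, size=(30, 2)).astype(np.float32)
H_true = np.array([[1.02, 0.01, 5.0], [-0.01, 0.99, -3.0], [0.0, 0.0, 1.0]])
cur = cv2.perspectiveTransform(ref.reshape(-1, 1, 2), H_true).reshape(-1, 2)
anchors = np.array([[100, 100], [400, 100], [400, 300], [100, 300]], dtype=np.float32)
print(transfer_registration_points(ref, cur, anchors))
```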

  18. D3D augmented reality imaging system: proof of concept in mammography

    PubMed Central

    Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Purpose The purpose of this article is to present images from simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called “depth 3-dimensional (D3D) augmented reality”. Materials and methods A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data was used and no statistical analysis was performed. Results The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, 3D cursor, and joystick fly-through. Conclusion The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice. PMID:27563261

  19. Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?

    PubMed Central

    Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.

    2007-01-01

    Background Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects while using video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators, scored on a 5-point Likert scale. The participants were allotted to 3 groups depending on their experience: experts, intermediates and novices. Significant differences were calculated with the paired t-test. Results There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 vs. 2.18, P < 0.000) and the suturing task (mean 4.15 vs. 1.85, P < 0.000). The ProMIS is regarded as having better haptic feedback (mean 3.92 vs. 1.92, P < 0.000) and as being more useful for training surgical residents (mean 4.51 vs. 2.94, P < 0.000). Conclusions In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356

  20. An Augmented-Reality Edge Enhancement Application for Google Glass

    PubMed Central

    Hwang, Alex D.; Peli, Eli

    2014-01-01

    Purpose Google Glass provides a platform that can be easily extended to include a vision enhancement tool. We have implemented an augmented vision system on Glass, which overlays enhanced edge information over the wearer's real-world view, to provide contrast-improved central vision to Glass wearers. The enhanced central vision can be naturally integrated with scanning. Methods Google Glass's camera lens distortions were corrected by using image warping. Since the camera and virtual display are horizontally separated by 16 mm, and the camera aiming and virtual display projection angles are off by 10°, the warped camera image had to go through a series of 3D transformations to minimize parallax errors before the final projection to the Glass's see-through virtual display. All image processing was implemented to achieve near real-time performance. The impact of the contrast enhancements was measured for three normal-vision subjects, with and without a diffuser film to simulate vision loss. Results For all three subjects, significantly improved contrast sensitivity was achieved when the subjects used the edge enhancements with a diffuser film. The performance boost is limited by the Glass camera's performance. The authors assume this accounts for why performance improvements were observed only with the diffuser filter condition (simulating low vision). Conclusions Improvements were measured with simulated visual impairments. With the benefit of see-through augmented reality edge enhancement, a natural visual scanning process is possible, suggesting that the device may provide better visual function in a cosmetically and ergonomically attractive format for patients with macular degeneration. PMID:24978871
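
    The distortion-correction and edge-enhancement steps can be sketched with standard OpenCV calls, as below. The intrinsic matrix, distortion coefficients, and synthetic frame are placeholders rather than the authors' calibration, and the parallax-compensating 3D transforms and the projection to the see-through display are omitted.

```python
import cv2
import numpy as np

# Hypothetical intrinsics and distortion coefficients; in practice these come
# from a one-time chessboard calibration of the Glass camera.
K = np.array([[1050.0, 0.0, 640.0],
              [0.0, 1050.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])      # k1, k2, p1, p2, k3

# Stand-in for a captured camera frame (a real system would grab live video)
frame = np.zeros((720, 1280, 3), np.uint8)
cv2.rectangle(frame, (400, 200), (880, 520), (180, 180, 180), -1)

undistorted = cv2.undistort(frame, K, dist)        # remove lens distortion

# Edge enhancement: extract edges and blend them back over the rectified frame
gray = cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)
edges = cv2.Laplacian(gray, cv2.CV_8U, ksize=3)
overlay = cv2.addWeighted(undistorted, 1.0, cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR), 0.8, 0)
```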

  1. Study of application of virtual object projection based on augmented reality technology

    NASA Astrophysics Data System (ADS)

    Liu, San-jun; Zhang, Yan-feng

    2013-03-01

    Augmented reality technology has been a hot research topic in recent years. Augmented reality is a technology that merges virtual objects generated by a computer, or other information, into the real world as perceived by the user. It is a supplement to the real world, rather than a complete replacement of it. Display technology and tracking/registration technology are key technologies in augmented reality systems, and they are the focus of this study. This article briefly describes both the display technology and the tracking/registration technology.

  2. Intraoperative surgical photoacoustic microscopy (IS-PAM) using augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Changho; Han, Seunghoon; Kim, Sehui; Jeon, Mansik; Kim, Jeehyun; Kim, Chulhong

    2014-03-01

    We have developed an intraoperative surgical photoacoustic microscopy (IS-PAM) system by integrating optical-resolution photoacoustic microscopy (OR-PAM) with a conventional surgical microscope. Because the OR-PAM and the microscope share a common optical path, we can acquire PAM and microscope images at the same time. Furthermore, by utilizing a mini-sized beam projector, 2D PAM images are back-projected onto the microscope view plane as augmented reality. Thus, both the conventional microscopic image and the 2D cross-sectional PAM image are displayed on the same plane through an eyepiece lens of the microscope. In our method, no additional image display is required to show the PAM image, which potentially offers significant convenience to surgeons, who do not need to shift their gaze during surgery. To demonstrate the performance of our IS-PAM system, we first successfully monitored needle intervention in phantoms. Moreover, we successfully guided needle insertion into mouse skin in vivo by simultaneously visualizing the surrounding blood vessels from the PAM images and the magnified skin surface from the conventional microscopic images.

  3. Augmented Reality-Based Navigation System for Wrist Arthroscopy: Feasibility

    PubMed Central

    Zemirline, Ahmed; Agnus, Vincent; Soler, Luc; Mathoulin, Christophe L.; Liverneaux, Philippe A.; Obdeijn, Miryam

    2013-01-01

    Purpose In video surgery, and more specifically in arthroscopy, one of the major problems is positioning the camera and instruments within the anatomic environment. The concept of computer-guided video surgery has already been used in ear, nose, and throat (ENT), gynecology, and even in hip arthroscopy. These systems, however, rely on optical or mechanical sensors, which turn out to be restricting and cumbersome. The aim of our study was to develop and evaluate the accuracy of a navigation system based on electromagnetic sensors in video surgery. Methods We used an electromagnetic localization device (Aurora, Northern Digital Inc., Ontario, Canada) to track the movements in space of both the camera and the instruments. We have developed a dedicated application in the Python language, using the VTK library for the graphic display and the OpenCV library for camera calibration. Results A prototype has been designed and evaluated for wrist arthroscopy. It allows display of the theoretical position of instruments onto the arthroscopic view with useful accuracy. Discussion The augmented reality view represents valuable assistance when surgeons want to position the arthroscope or locate their instruments. It makes the maneuver more intuitive, increases comfort, saves time, and enhances concentration. PMID:24436832

  4. MetaTree: augmented reality narrative explorations of urban forests

    NASA Astrophysics Data System (ADS)

    West, Ruth; Margolis, Todd; O'Neil-Dunne, Jarlath; Mendelowitz, Eitan

    2012-03-01

    As cities world-wide adopt and implement reforestation initiatives to plant millions of trees in urban areas, they are engaging in what is essentially a massive ecological and social experiment. Existing air-borne, space-borne, and fieldbased imaging and inventory mechanisms fail to provide key information on urban tree ecology that is crucial to informing management, policy, and supporting citizen initiatives for the planting and stewardship of trees. The shortcomings of the current approaches include: spatial and temporal resolution, poor vantage point, cost constraints and biological metric limitations. Collectively, this limits their effectiveness as real-time inventory and monitoring tools. Novel methods for imaging and monitoring the status of these emerging urban forests and encouraging their ongoing stewardship by the public are required to ensure their success. This art-science collaboration proposes to re-envision citizens' relationship with urban spaces by foregrounding urban trees in relation to local architectural features and simultaneously creating new methods for urban forest monitoring. We explore creating a shift from overhead imaging or field-based tree survey data acquisition methods to continuous, ongoing monitoring by citizen scientists as part of a mobile augmented reality experience. We consider the possibilities of this experience as a medium for interacting with and visualizing urban forestry data and for creating cultural engagement with urban ecology.

  5. Development of a mobile borehole investigation software using augmented reality

    NASA Astrophysics Data System (ADS)

    Son, J.; Lee, S.; Oh, M.; Yun, D. E.; Kim, S.; Park, H. D.

    2015-12-01

    Augmented reality (AR) is one of the fastest-developing technologies in the smartphone and IT areas. While various applications have been developed using AR, few geological applications take advantage of it. In this study, a smartphone application to manage boreholes using AR has been developed. The application consists of three major modules: an AR module, a map module and a data management module. The AR module calculates the orientation of the device and uses it to display nearby boreholes distributed in three dimensions. This module shows the boreholes in a transparent layer on a live camera screen so the user can find them and understand the overall characteristics of the underground geology. The map module displays the boreholes on a 2D map to show their distribution and the location of the user. The data management module uses the SQLite library, which is well suited to mobile platforms, and Binary XML is adopted to allow additional customized data to be stored. The application is able to provide underground information in an intuitive and refined form and to reduce the time and equipment required for geological field investigations.
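
    The AR module's placement of borehole markers can be illustrated with a small geometric sketch: compute the distance and bearing from the user's GPS fix to a borehole, then map the bearing into a horizontal screen position given the device azimuth and camera field of view. The coordinates, field of view, and screen width below are invented, and the real application presumably also handles elevation and sensor fusion.

```python
from math import radians, degrees, sin, cos, atan2, sqrt

def bearing_and_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from the user to a borehole."""
    R = 6371000.0
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    dist = 2 * R * atan2(sqrt(a), sqrt(1 - a))
    brg = degrees(atan2(sin(dlon) * cos(p2),
                        cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlon))) % 360
    return dist, brg

def screen_x(bearing, device_azimuth, fov_deg=60.0, screen_w=1080):
    """Horizontal screen position of a borehole overlay, or None if outside the camera view."""
    offset = (bearing - device_azimuth + 540) % 360 - 180    # signed angle in (-180, 180]
    if abs(offset) > fov_deg / 2:
        return None
    return int((offset / fov_deg + 0.5) * screen_w)

dist, brg = bearing_and_distance(37.45, 126.95, 37.46, 126.96)
print(round(dist), screen_x(brg, device_azimuth=30.0))
```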

  6. L-split marker for augmented reality in aircraft assembly

    NASA Astrophysics Data System (ADS)

    Han, Pengfei; Zhao, Gang

    2016-04-01

    In order to improve the performance of the conventional square markers widely used by marker-based augmented reality systems in aircraft assembly environments, an L-split marker is proposed. Every marker consists of four separate L-shaped parts, each of which contains partial information about the marker. Geometric features of the L-shape, which are more discriminative than the symmetrical square shape adopted by conventional markers, are used to detect the proposed markers effectively in camera images. The marker is split into four separate parts in order to improve robustness to occlusion and curvature to some extent. The registration process can be successfully completed as long as three parts are detected (up to about 80% of the area can be occluded). Moreover, when the marker is attached to a nonplanar surface, its curvature can be roughly analyzed from the normal direction of every part, which can be obtained because each part's six corners have been explicitly determined in the preceding detection step. Based on the marker design, new detection and recognition algorithms are proposed and detailed. The experimental results show that the marker and the algorithms are effective.
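
    Once at least three of the four L-shaped parts are detected and their corners matched to the known marker layout, the camera pose can be recovered with a standard perspective-n-point solve. The Python sketch below illustrates only that generic PnP step using OpenCV's solvePnP; the corner coordinates and intrinsics are invented placeholders, not the paper's algorithm or data.

        import cv2
        import numpy as np

        # Known 3D corner positions on the planar marker, in millimetres (assumed).
        object_corners = np.array([
            [0, 0, 0], [40, 0, 0], [40, 10, 0],
            [0, 80, 0], [10, 80, 0], [10, 40, 0],
        ], dtype=np.float32)

        # Matching corner detections in the image, in pixels (hypothetical).
        image_corners = np.array([
            [312, 240], [388, 244], [387, 263],
            [310, 392], [329, 391], [327, 316],
        ], dtype=np.float32)

        K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
        dist = np.zeros(5)

        ok, rvec, tvec = cv2.solvePnP(object_corners, image_corners, K, dist,
                                      flags=cv2.SOLVEPNP_ITERATIVE)
        if ok:
            R, _ = cv2.Rodrigues(rvec)  # marker-to-camera rotation
            print("rotation:\n", R, "\ntranslation (mm):", tvec.ravel())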

  7. A spatially augmented reality sketching interface for architectural daylighting design.

    PubMed

    Sheng, Yu; Yapo, Theodore C; Young, Christopher; Cutler, Barbara

    2011-01-01

    We present an application of interactive global illumination and spatially augmented reality to architectural daylight modeling that allows designers to explore alternative designs and new technologies for improving the sustainability of their buildings. Images of a model in the real world, captured by a camera above the scene, are processed to construct a virtual 3D model. To achieve interactive rendering rates, we use a hybrid rendering technique, leveraging radiosity to simulate the interreflectance between diffuse patches, and shadow volumes to generate per-pixel direct illumination. The rendered images are then projected onto the real model by four calibrated projectors to help users study the daylighting illumination. The virtual heliodon is a physical design environment in which multiple designers, a designer and a client, or a teacher and students can gather to experience animated visualizations of the natural illumination within a proposed design by controlling the time of day, season, and climate. Furthermore, participants may interactively redesign the geometry and materials of the space by manipulating physical design elements and see the updated lighting simulation. PMID:21071786
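
    For readers unfamiliar with the diffuse-interreflection half of the hybrid renderer mentioned above, the toy Python sketch below iterates the classic radiosity relation B = E + rho * F B for a handful of patches; the emission, reflectance and form-factor values are invented for illustration and are unrelated to the paper's scenes.

        import numpy as np

        E = np.array([1.0, 0.0, 0.0, 0.0])    # emitted radiosity (only patch 0 emits)
        rho = np.array([0.0, 0.7, 0.5, 0.3])  # diffuse reflectances
        F = np.array([                        # form factors F[i, j] (rows sum to <= 1)
            [0.0, 0.4, 0.3, 0.3],
            [0.4, 0.0, 0.3, 0.3],
            [0.3, 0.3, 0.0, 0.4],
            [0.3, 0.3, 0.4, 0.0],
        ])

        B = E.copy()
        for _ in range(100):                  # fixed-point iteration until convergence
            B_new = E + rho * (F @ B)
            if np.max(np.abs(B_new - B)) < 1e-6:
                break
            B = B_new
        print("patch radiosities:", np.round(B, 4))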

  8. Shape recognition and pose estimation for mobile Augmented Reality.

    PubMed

    Hagbi, Nate; Bergig, Oriel; El-Sana, Jihad; Billinghurst, Mark

    2011-10-01

    Nestor is a real-time recognition and camera pose estimation system for planar shapes. The system allows shapes that carry contextual meanings for humans to be used as Augmented Reality (AR) tracking targets. The user can teach the system new shapes in real time. New shapes can be shown to the system frontally, or they can be automatically rectified according to previously learned shapes. Shapes can be automatically assigned virtual content by classification according to a shape class library. Nestor performs shape recognition by analyzing contour structures and generating projective-invariant signatures from their concavities. The concavities are further used to extract features for pose estimation and tracking. Pose refinement is carried out by minimizing the reprojection error between sample points on each image contour and its library counterpart. Sample points are matched by evolving an active contour in real time. Our experiments show that the system provides stable and accurate registration, and runs at interactive frame rates on a Nokia N95 mobile phone. PMID:21041876
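
    The pose-refinement step described above minimizes the reprojection error between contour sample points and their library counterparts. The Python sketch below shows that general idea with a least-squares solve over a six-parameter pose; the point data, intrinsics and initial guess are placeholders, and the sketch is not the Nestor implementation.

        import numpy as np
        import cv2
        from scipy.optimize import least_squares

        K = np.array([[700, 0, 320], [0, 700, 240], [0, 0, 1]], dtype=np.float64)
        model_pts = np.array([[0, 0, 0], [50, 0, 0], [50, 30, 0],
                              [0, 30, 0], [25, 15, 0]], dtype=np.float64)  # planar shape (mm)
        image_pts = np.array([[300, 200], [410, 206], [408, 272],
                              [298, 266], [354, 236]], dtype=np.float64)   # matched samples (px)

        def residual(p):
            """Reprojection residuals for a pose parameterised as (rvec, tvec)."""
            rvec = p[:3].reshape(3, 1)
            tvec = p[3:].reshape(3, 1)
            proj, _ = cv2.projectPoints(model_pts, rvec, tvec, K, None)
            return (proj.reshape(-1, 2) - image_pts).ravel()

        p0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 400.0])  # rough initial pose guess
        sol = least_squares(residual, p0)
        print("refined pose parameters:", np.round(sol.x, 3))
        print("residual RMS (px):", np.sqrt(np.mean(sol.fun ** 2)))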

  9. Augmented reality enabling intelligence exploitation at the edge

    NASA Astrophysics Data System (ADS)

    Kase, Sue E.; Roy, Heather; Bowman, Elizabeth K.; Patton, Debra

    2015-05-01

    Today's Warfighters need to make quick decisions while interacting in densely populated environments comprised of friendly, hostile, and neutral host nation locals. However, there is a gap in the real-time processing of big data streams for edge intelligence. We introduce a big data processing pipeline called ARTEA that ingests, monitors, and performs a variety of analytics including noise reduction, pattern identification, and trend and event detection in the context of an area of operations (AOR). Results of the analytics are presented to the Soldier via an augmented reality (AR) device Google Glass (Glass). Non-intrusive AR devices such as Glass can visually communicate contextually relevant alerts to the Soldier based on the current mission objectives, time, location, and observed or sensed activities. This real-time processing and AR presentation approach to knowledge discovery flattens the intelligence hierarchy enabling the edge Soldier to act as a vital and active participant in the analysis process. We report preliminary observations testing ARTEA and Glass in a document exploitation and person of interest scenario simulating edge Soldier participation in the intelligence process in disconnected deployment conditions.

  10. Concealed target detection using augmented reality with SIRE radar

    NASA Astrophysics Data System (ADS)

    Saponaro, Philip; Kambhamettu, Chandra; Ranney, Kenneth; Sullivan, Anders

    2013-05-01

    The Synchronous Impulse Reconstruction (SIRE) forward-looking radar, developed by the U.S. Army Research Laboratory (ARL), can detect concealed targets using ultra-wideband synthetic aperture technology. The SIRE radar has been mounted on a Ford Expedition and combined with other sensors, including a pan/tilt/zoom camera, to test its capabilities of concealed target detection in a realistic environment. Augmented Reality (AR) can be used to combine the SIRE radar image with the live camera stream into one view, which provides the user with information that is quicker to assess and easier to understand than each viewed separately. In this paper we present an AR system which utilizes a global positioning system (GPS) and inertial measurement unit (IMU) to overlay a SIRE radar image onto a live video stream. We describe a method for transforming 3D world points in the UTM coordinate system onto the video stream by calibrating for the intrinsic parameters of the camera. This calibration is performed offline to save computation time and achieve real-time performance. Since the intrinsic parameters are affected by the zoom of the camera, we calibrate at eleven different zooms and interpolate between them. We show the results of a real-time transformation of the SAR imagery onto the video stream. Finally, we quantify both the 2D error and 3D residue associated with our transformation and show that the amount of error is reasonable for our application.
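
    The record above describes two concrete steps: interpolating camera intrinsics between discretely calibrated zoom settings, and projecting 3D world points onto the video frame. The Python sketch below illustrates both under a simple pinhole model; the zoom/focal-length table and the test point are hypothetical stand-ins, not ARL's calibration data.

        import numpy as np

        zoom_levels = np.array([1.0, 2.0, 4.0, 8.0])          # calibrated zooms (assumed)
        focal_px = np.array([900.0, 1750.0, 3400.0, 6600.0])  # focal length at each zoom

        def intrinsics(zoom, cx=640.0, cy=360.0):
            """Interpolate the focal length for an intermediate zoom setting."""
            f = np.interp(zoom, zoom_levels, focal_px)
            return np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]])

        def project(K, point_cam):
            """Pinhole projection of a camera-frame point (metres) to pixel coordinates."""
            uvw = K @ point_cam
            return uvw[:2] / uvw[2]

        # Hypothetical radar detection 3 m right, 1 m below and 40 m ahead of the camera.
        print(project(intrinsics(zoom=3.0), np.array([3.0, 1.0, 40.0])))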

  11. Realistic real-time outdoor rendering in augmented reality.

    PubMed

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, as the sizeable number of publications in computer graphics attests. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are restricted to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, the sky colour is generated with respect to the position of the sun. Secondly, a shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps), is applied. Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, addressing a long-standing shortcoming of realistic AR systems. PMID:25268480
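
    The first phase above ties the generated sky colour to the sun's position for a given location, date and time. As a rough, self-contained illustration of computing solar elevation (a standard declination and hour-angle approximation, accurate to roughly a degree, with solar time assumed equal to clock time), not the paper's model:

        import math

        def solar_elevation(lat_deg, day_of_year, solar_hour):
            """Approximate solar elevation angle in degrees."""
            decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
            hour_angle = 15.0 * (solar_hour - 12.0)  # degrees from solar noon
            lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
            return math.degrees(math.asin(
                math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)))

        # Hypothetical check: mid-June, 15:00 solar time, latitude 3.1 deg N.
        print(round(solar_elevation(3.1, 172, 15.0), 1))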

  12. Spatial augmented reality on industrial CNC-machines

    NASA Astrophysics Data System (ADS)

    Olwal, Alex; Gustafsson, Jonny; Lindfors, Christoffer

    2008-02-01

    In this work we present how Augmented Reality (AR) can be used to create an intimate integration of process data with the workspace of an industrial CNC (computer numerical control) machine. AR allows us to combine interactive computer graphics with real objects in a physical environment - in this case, the workspace of an industrial lathe. ASTOR is an autostereoscopic optical see-through spatial AR system, which provides real-time 3D visual feedback without the need for user-worn equipment, such as head-mounted displays or sensors for tracking. The use of a transparent holographic optical element, overlaid onto the safety glass, allows the system to simultaneously provide bright imagery and clear visibility of the tool and workpiece. The system makes it possible to enhance visibility of occluded tools as well as to visualize real-time data from the process in the 3D space. The graphics are geometrically registered with the workspace and provide an intuitive representation of the process, amplifying the user's understanding and simplifying machine operation.

  13. Augmented Reality Image Guidance in Minimally Invasive Prostatectomy

    NASA Astrophysics Data System (ADS)

    Cohen, Daniel; Mayer, Erik; Chen, Dongbin; Anstee, Ann; Vale, Justin; Yang, Guang-Zhong; Darzi, Ara; Edwards, Philip 'Eddie'

    This paper presents our work aimed at providing augmented reality (AR) guidance of robot-assisted laparoscopic surgery (RALP) using the da Vinci system. There is a good clinical case for guidance due to the significant rate of complications and steep learning curve for this procedure. Patients who were due to undergo robotic prostatectomy for organ-confined prostate cancer underwent preoperative 3T MRI scans of the pelvis. These were segmented and reconstructed to form 3D images of pelvic anatomy. The reconstructed image was successfully overlaid onto screenshots of the recorded surgery post-procedure. Surgeons who perform minimally-invasive prostatectomy took part in a user-needs analysis to determine the potential benefits of an image guidance system after viewing the overlaid images. All surgeons stated that the development would be useful at key stages of the surgery and could help to improve the learning curve of the procedure and improve functional and oncological outcomes. Establishing the clinical need in this way is a vital early step in development of an AR guidance system. We have also identified relevant anatomy from preoperative MRI. Further work will be aimed at automated registration to account for tissue deformation during the procedure, using a combination of transrectal ultrasound and stereoendoscopic video.

  14. Realistic Real-Time Outdoor Rendering in Augmented Reality

    PubMed Central

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, as the sizeable number of publications in computer graphics attests. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are restricted to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, the sky colour is generated with respect to the position of the sun. Secondly, a shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps), is applied. Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, addressing a long-standing shortcoming of realistic AR systems. PMID:25268480

  15. Integrating a Mobile Augmented Reality Activity to Contextualize Student Learning of a Socioscientific Issue

    ERIC Educational Resources Information Center

    Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao

    2013-01-01

    …virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…

  16. An indoor augmented reality mobile application for simulation of building evacuation

    NASA Astrophysics Data System (ADS)

    Sharma, Sharad; Jerripothula, Shanmukha

    2015-03-01

    Augmented Reality enables people to remain connected with the physical environment they are in, and invites them to look at the world from new and alternative perspectives. There has been increasing interest in emergency evacuation applications for mobile devices. Nearly all smartphones these days are Wi-Fi and GPS enabled. In this paper, we propose a novel emergency evacuation system that will help people to safely evacuate a building in case of an emergency situation. It will further enhance knowledge and understanding of where the exits are in the building and of safe evacuation procedures. We have applied mobile augmented reality (mobile AR) to create an application with the Unity 3D gaming engine. We show how the mobile AR application is able to display a 3D model of the building and an animation of people evacuating, using markers and a web camera. The system gives a visual representation of a building in 3D space, allowing people to see where the exits are through the use of a smartphone or tablet. Pilot studies conducted with the system showed partial success and demonstrated the effectiveness of the application for emergency evacuation. Our computer vision methods give good results when the markers are close to the camera, but accuracy decreases when the markers are far away from the camera.

  17. Past and future of wearable augmented reality displays and their applications

    NASA Astrophysics Data System (ADS)

    Hua, Hong

    2014-10-01

    A wearable augmented reality (AR) display enables computer-generated imagery to be overlaid on a person's real-world view, and it has long been portrayed as a transformative technology to redefine the way we perceive and interact with digital information. In this paper, I will provide an overview of my research group's past development efforts in augmented reality display technologies and applications and discuss key technical challenges and opportunities for future developments.

  18. Augmented Reality Cues and Elderly Driver Hazard Perception

    PubMed Central

    Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew

    2013-01-01

    Objective Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods Twenty elderly (Mean= 73 years, SD= 5 years), licensed drivers with a range of cognitive abilities measured by a speed of processing (SOP) composite participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six, straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios, and the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining safe distance behind a lead vehicle. Results AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair ability to maintain safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037

  19. Augmented reality using ultra-wideband radar imagery

    NASA Astrophysics Data System (ADS)

    Nguyen, Lam; Koenig, Francois; Sherbondy, Kelly

    2011-06-01

    The U.S. Army Research Laboratory (ARL) has been investigating the utility of ultra-wideband (UWB) synthetic aperture radar (SAR) technology for detecting concealed targets in various applications. We have designed and built a vehicle-based, low-frequency UWB SAR radar for proof-of-concept demonstration in detecting obstacles for autonomous navigation, detecting concealed targets (mines, etc.), and mapping internal building structures to locate enemy activity. Although the low-frequency UWB radar technology offers valuable information to complement other technologies due to its penetration capability, it is very difficult to comprehend the radar imagery and correlate the detection list from the radar with the objects in the real world. Using augmented reality (AR) technology, we can superimpose the information from the radar onto the video image of the real world in real-time. Using this, Soldiers would view the environment and the superimposed graphics (SAR imagery, detection locations, digital map, etc.) via a standard display or a head-mounted display. The superimposed information would be constantly changed and adjusted for every perspective and movement of the user. ARL has been collaborating with ITT Industries to implement an AR system that integrates the video data captured from the real world and the information from the UWB radar. ARL conducted an experiment and demonstrated the real-time geo-registration of the two independent data streams. The integration of the AR sub-system into the radar system is underway. This paper presents the integration of the AR and SAR systems. It shows results that include the real-time embedding of the SAR imagery and other information into the video data stream.

  20. Augmented reality guidance system for peripheral nerve blocks

    NASA Astrophysics Data System (ADS)

    Wedlake, Chris; Moore, John; Rachinsky, Maxim; Bainbridge, Daniel; Wiles, Andrew D.; Peters, Terry M.

    2010-02-01

    Peripheral nerve block treatments are ubiquitous in hospitals and pain clinics worldwide. State-of-the-art techniques use ultrasound (US) guidance and/or electrical stimulation to verify needle tip location. However, problems such as needle-US beam alignment, poor echogenicity of block needles and US beam thickness can make it difficult for the anesthetist to know the exact needle tip location. Inaccurate therapy delivery raises obvious safety and efficacy issues. We have developed and evaluated a needle guidance system that makes use of a magnetic tracking system (MTS) to provide an augmented reality (AR) guidance platform to accurately localize the needle tip as well as its projected trajectory. Five anesthetists and five novices performed simulated nerve block deliveries in a polyvinyl alcohol phantom to compare needle guidance under US alone to US placed in our AR environment. Our phantom study demonstrated a decrease in targeting attempts, a decrease in contact with critical structures, and an increase in accuracy, with an RMS error of 0.68 mm compared to 1.34 mm under US guidance alone. Currently, the MTS uses 18- and 21-gauge hypodermic needles with a 5-degree-of-freedom sensor located at the needle tip. These needles can only be sterilized using an ethylene oxide process. In the interest of providing clinicians with a simple and efficient guidance system, we also evaluated attaching the sensor at the needle hub as a simple clip-on device. To do this, we simultaneously performed a needle bending study to assess the reliability of a hub-based sensor.
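
    The accuracy figures quoted above are RMS distances between the delivered needle-tip positions and the intended target. A small Python sketch of that computation with invented coordinates (millimetres), purely to make the metric concrete:

        import numpy as np

        target = np.array([10.0, 25.0, 40.0])
        tips = np.array([[10.6, 24.5, 40.3],   # hypothetical tracked tip positions
                         [ 9.4, 25.8, 39.6],
                         [10.2, 24.9, 40.9]])

        # Root-mean-square distance of the delivered tip positions from the target.
        rms = np.sqrt(np.mean(np.sum((tips - target) ** 2, axis=1)))
        print(f"RMS targeting error: {rms:.2f} mm")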

  1. Sharing skills: using augmented reality for human-robot collaboration

    NASA Astrophysics Data System (ADS)

    Giesler, Bjorn; Steinhaus, Peter; Walther, Marcus; Dillmann, Ruediger

    2004-05-01

    Both stationary 'industrial' and autonomous mobile robots nowadays pervade many workplaces, but human-friendly interaction with them is still very much an experimental subject. One of the reasons for this is that computer and robotic systems are very bad at performing certain tasks well and robustly. A prime example is classification of sensor readings: Which part of a 3D depth image is the cup, which the saucer, which the table? These are tasks that humans excel at. To alleviate this problem, we propose a team approach, wherein the robot records sensor data and uses an Augmented-Reality (AR) system to present the data to the user directly in the 3D environment. The user can then perform classification decisions directly on the data by pointing, gestures and speech commands. After the classification has been performed by the user, the robot takes the classified data and matches it to its environment model. As a demonstration of this approach, we present an initial system for creating objects on-the-fly in the environment model. A rotating laser scanner is used to capture a 3D snapshot of the environment. This snapshot is presented to the user as an overlay over his view of the scene. The user classifies unknown objects by pointing at them. The system segments the snapshot according to the user's indications and presents the results of the segmentation back to the user, who can then inspect, correct and enhance them interactively. After a satisfactory result has been reached, the laser scanner can take more snapshots from other angles and use the previous segmentation hints to construct a 3D model of the object.
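
    The segmentation-from-pointing step described above can be pictured as region growing in the laser point cloud around the user-indicated point. The Python sketch below shows only that generic idea with a KD-tree neighbourhood search; the synthetic point cloud, seed and radius are illustrative assumptions, not the authors' method or data.

        import numpy as np
        from scipy.spatial import cKDTree

        def grow_from_seed(points, seed_index, radius=0.05):
            """Indices of points connected to the seed through gaps smaller than `radius`."""
            tree = cKDTree(points)
            selected, frontier = {seed_index}, [seed_index]
            while frontier:
                idx = frontier.pop()
                for nb in tree.query_ball_point(points[idx], radius):
                    if nb not in selected:
                        selected.add(nb)
                        frontier.append(nb)
            return sorted(selected)

        rng = np.random.default_rng(0)
        cup = rng.normal([0.3, 0.1, 0.8], 0.01, size=(200, 3))         # toy "cup" cluster (m)
        background = rng.normal([0.0, 0.0, 0.0], 0.05, size=(800, 3))  # toy background points
        cloud = np.vstack([cup, background])

        # Seeding on the first cup point should recover roughly the 200 cup points.
        print(len(grow_from_seed(cloud, seed_index=0)))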

  2. Augmented reality graphic interface for upstream dam inspection

    NASA Astrophysics Data System (ADS)

    Cote, Jean; Lavallee, Jean

    1995-12-01

    This paper presents a 3D graphic interface for the inspection of cracks along a dam. The monitoring of concrete dams is restricted by the accessibility of the various parts of the structure. Since the upstream face of a dam is not usually exposed, as in our case at Hydro-Quebec, even a systematic or ad hoc inspection becomes extremely complex. Piloting a ROV (Remotely Operated Vehicle) underwater is like driving in a snowstorm: the view from the camera is similar to the visibility a driver would have in a snowstorm. Sensor fusion has to be performed by the operator since each sensor is specialized for its task. Even with a 2D positioning system or sonar scan, approaching the inspection area is very tedious. A new 3D interface has been developed using augmented reality, since the position and orientation of the vehicle are known. The point of view of the observer can easily be changed during a manipulation of the ROV. A shared-memory-based server can access the position data of the ROV and update the graphics in real time. The graphic environment can also be used to drive the ROV along computer-generated trajectories. A video card will be added to the Silicon Graphics workstation to display the view of the camera fixed to the ROV. This visual feedback will only be available when the ROV is close enough to the dam. The images will be calibrated since the position of the camera is known. The operator interface also includes a set of stereoscopic cameras, hydrophonic (sound) feedback and imaging tools for measuring cracks.

  3. Augmented reality application utility for aviation maintenance work instruction

    NASA Astrophysics Data System (ADS)

    Pourcho, John Bryan

    Current aviation maintenance work instructions do not display information effectively enough to prevent costly errors and safety concerns. Aircraft are complex assemblies of highly interrelated components that confound troubleshooting and can make the maintenance procedure difficult (Drury & Gramopadhye, 2001). The sophisticated nature of aircraft maintenance necessitates a revolutionized training intervention for aviation maintenance technicians (United States General Accounting Office, 2003). Quite simply, the paper-based job task cards fall short of offering rapid access to technical data and the system or component visualization necessary for working on complex integrated aircraft systems. Possible solutions to this problem include upgraded standards for paper-based task cards and the use of integrated 3D product definition on various mobile platforms (Ropp, Thomas, Lee, Broyles, Lewin, Andreychek, & Nicol, 2013). Previous studies have shown that incorporation of 3D graphics in work instructions allows the user to more efficiently and accurately interpret maintenance information (Jackson & Batstone, 2008). For aircraft maintenance workers, the use of mobile 3D model-based task cards could make current paper task card standards obsolete with their ability to deliver relevant, synchronized information to and from the hangar. Unlike previous versions of 3D model-based definition task cards and the paper task cards currently used in the maintenance industry, mobile 3D model-based definition task cards have the potential to be more mobile and accessible. Utilizing augmented reality applications to seamlessly deliver 3D product definition on mobile devices could increase efficiency and accuracy and reduce the mental workload for technicians when performing maintenance tasks (Macchiarella, 2004). This proposal will serve as a literature review of the aviation maintenance industry, the spatial ability of maintenance technicians, and benefits of…

  4. Augmented reality in healthcare education: an integrative review

    PubMed Central

    Zhu, Egui; Hadadgar, Arash; Masiello, Italo

    2014-01-01

    Background. The effective development of healthcare competencies poses great educational challenges. A possible approach to provide learning opportunities is the use of augmented reality (AR), where virtual learning experiences can be embedded in a real physical context. The aim of this study was to provide a comprehensive overview of the current state of the art in terms of user acceptance, the AR applications developed and the effect of AR on the development of competencies in healthcare. Methods. We conducted an integrative review. Integrative reviews are the broadest type of research review method, allowing for the inclusion of various research designs to more fully understand a phenomenon of concern. Our review included multi-disciplinary research publications in English reported until 2012. Results. 2529 research papers were found from ERIC, CINAHL, Medline, PubMed, Web of Science and Springer-link. Three qualitative, 20 quantitative and 2 mixed studies were included. Using a thematic analysis, we describe three aspects related to the research, the technology and the education. This study showed that AR was applied to a wide range of topics in healthcare education. Furthermore, acceptance of AR as a learning technology was reported among the learners, as was its potential for improving different types of competencies. Discussion. AR is still considered a novelty in the literature. Most of the studies reported early prototypes. The designed AR applications also lacked an explicit pedagogical theoretical framework. Finally, the learning strategies adopted were of the traditional ‘see one, do one and teach one’ style and did not integrate clinical competencies to ensure patient safety. PMID:25071992

  5. Integrating Augmented Reality in Higher Education: A Multidisciplinary Study of Student Perceptions

    ERIC Educational Resources Information Center

    Delello, Julie A.; McWhorter, Rochell R.; Camp, Kerri M.

    2015-01-01

    Augmented reality (AR) is an emerging technology that blends physical objects with virtual reality. Through the integration of digital and print media, the "on and offline" worlds are merged, radically shifting student-computer interaction in the classroom. This research examined the results of a multiple case study on the…

  6. Constructing Liminal Blends in a Collaborative Augmented-Reality Learning Environment

    ERIC Educational Resources Information Center

    Enyedy, Noel; Danish, Joshua A.; DeLiema, David

    2015-01-01

    In vision-based augmented-reality (AR) environments, users view the physical world through a video feed or device that "augments" the display with a graphical or informational overlay. Our goal in this manuscript is to ask "how" and "why" these new technologies create opportunities for learning. We suggest that AR is…

  7. Sensorized Garment Augmented 3D Pervasive Virtual Reality System

    NASA Astrophysics Data System (ADS)

    Gulrez, Tauseef; Tognetti, Alessandro; de Rossi, Danilo

    Virtual reality (VR) technology has matured to a point where humans can navigate in virtual scenes; however, providing them with a comfortable, fully immersive role in VR remains a challenge. Currently available sensing solutions do not provide ease of deployment, particularly in the seated position due to sensor placement restrictions over the body, and optic sensing requires a restricted indoor environment to track body movements. Here we present a 52-sensor-laden garment interfaced with VR, which offers both portability and unencumbered user movement in a VR environment. This chapter addresses the systems engineering aspects of our pervasive computing solution of the interactive sensorized 3D VR and presents the initial results and future research directions. Participants navigated in a virtual art gallery using natural body movements that were detected by their wearable sensor shirt, and the signals were then mapped to electrical control signals responsible for VR scene navigation. The initial results are positive, and offer many opportunities for use in computationally intelligent man-machine multimedia control.

  8. Towards Determination of Visual Requirements for Augmented Reality Displays and Virtual Environments for the Airport Tower

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2006-01-01

    The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14 deg, 28 deg, and 47 deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.

  9. Towards Determination of Visual Requirements for Augmented Reality Displays and Virtual Environments for the Airport Tower

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2006-01-01

    The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14 deg, 28 deg, and 47 deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.

  10. Emerging technology in surgical education: combining real-time augmented reality and wearable computing devices.

    PubMed

    Ponce, Brent A; Menendez, Mariano E; Oladeji, Lasun O; Fryberger, Charles T; Dantuluri, Phani K

    2014-11-01

    The authors describe the first surgical case adopting the combination of real-time augmented reality and wearable computing devices such as Google Glass (Google Inc, Mountain View, California). A 66-year-old man presented to their institution for a total shoulder replacement after 5 years of progressive right shoulder pain and decreased range of motion. Throughout the surgical procedure, Google Glass was integrated with the Virtual Interactive Presence and Augmented Reality system (University of Alabama at Birmingham, Birmingham, Alabama), enabling the local surgeon to interact with the remote surgeon within the local surgical field. Surgery was well tolerated by the patient and early surgical results were encouraging, with an improvement of shoulder pain and greater range of motion. The combination of real-time augmented reality and wearable computing devices such as Google Glass holds much promise in the field of surgery. PMID:25361359

  11. Design and implementation of wearable outdoor augmented reality system for Yuanmingyuan Garden

    NASA Astrophysics Data System (ADS)

    Lei, Jinchao; Liu, Yue; Wang, Yongtian

    2004-03-01

    The design and implementation of a wearable computer, which is used in the outdoor AR (Augmented Reality) system for Yuanmingyuan Garden, is discussed in this paper. The performance and portability of the wearable computer system are the key factors for the proposed outdoor AR system. The design principles and schemes that must be considered during prototype development are presented. A prototype of the proposed wearable outdoor Augmented Reality system for Yuanmingyuan Garden has been developed, and its application scenario in the reconstruction of other historic remains is also shown.

  12. Augmented Reality for Teaching Endotracheal Intubation: MR Imaging to Create Anatomically Correct Models

    PubMed Central

    Kerner, Karen F.; Imielinska, Celina; Rolland, Jannick; Tang, Haiying

    2003-01-01

    Clinical procedures have traditionally been taught at the bedside, in the morgue and in the animal lab. Augmented Reality (AR) technology (the merging of virtual reality and real objects or patients) provides a new method for teaching clinical and surgical procedures. Improved patient safety is a major advantage. We describe a system which employs AR technology to teach endotracheal intubation, using the Visible Human datasets, as well as MR images from live patient volunteers. PMID:14728393

  13. Exploring the benefits of augmented reality documentation for maintenance and repair.

    PubMed

    Henderson, Steven; Feiner, Steven

    2011-10-01

    We explore the development of an experimental augmented reality application that provides benefits to professional mechanics performing maintenance and repair tasks in a field setting. We developed a prototype that supports military mechanics conducting routine maintenance tasks inside an armored vehicle turret, and evaluated it with a user study. Our prototype uses a tracked headworn display to augment a mechanic's natural view with text, labels, arrows, and animated sequences designed to facilitate task comprehension, localization, and execution. A within-subject controlled user study examined professional military mechanics using our system to complete 18 common tasks under field conditions. These tasks included installing and removing fasteners and indicator lights, and connecting cables, all within the cramped interior of an armored personnel carrier turret. An augmented reality condition was tested against two baseline conditions: the same headworn display providing untracked text and graphics and a fixed flat panel display representing an improved version of the laptop-based documentation currently employed in practice. The augmented reality condition allowed mechanics to locate tasks more quickly than when using either baseline, and in some instances, resulted in less overall head movement. A qualitative survey showed that mechanics found the augmented reality condition intuitive and satisfying for the tested sequence of tasks. PMID:21041888

  14. Virtual Reality Augmentation for Functional Assessment and Treatment of Stuttering

    ERIC Educational Resources Information Center

    Brundage, Shelley B.

    2007-01-01

    Stuttering characteristics, assessment, and treatment principles present challenges to assessment and treatment that can be addressed with virtual reality (VR) technology. This article describes how VR can be used to assist clinicians in meeting some of these challenges with adults who stutter. A review of current VR research at the Stuttering…

  15. Moving from virtual reality exposure-based therapy to augmented reality exposure-based therapy: a review.

    PubMed

    Baus, Oliver; Bouchard, Stéphane

    2014-01-01

    This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed "safely" to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user's experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat phobias other than small-animal phobias, such as social phobia. PMID:24624073

  16. Moving from Virtual Reality Exposure-Based Therapy to Augmented Reality Exposure-Based Therapy: A Review

    PubMed Central

    Baus, Oliver; Bouchard, Stéphane

    2014-01-01

    This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed “safely” to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user’s experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat phobias other than small-animal phobias, such as social phobia. PMID:24624073

  17. Research on gesture recognition of augmented reality maintenance guiding system based on improved SVM

    NASA Astrophysics Data System (ADS)

    Zhao, Shouwei; Zhang, Yong; Zhou, Bin; Ma, Dongxi

    2014-09-01

    Interaction is one of the key techniques of augmented reality (AR) maintenance guiding systems. Because of the complexity of the maintenance guiding system's image background and the high dimensionality of gesture characteristics, the whole process of gesture recognition can be divided into three stages: gesture segmentation, gesture feature modeling, and gesture classification. In the segmentation stage, to avoid the misrecognition of skin-like regions, a segmentation algorithm combining a background model and skin color is adopted to exclude such regions. In the feature modeling stage, a rich set of characteristic features is analyzed and acquired, such as structural characteristics, Hu invariant moments and Fourier descriptors. In the classification stage, a classifier based on the Support Vector Machine (SVM) is introduced into the augmented reality maintenance guiding process. The SVM is a learning method based on statistical learning theory; it has a solid theoretical foundation and excellent learning ability, has been applied to many problems in machine learning, and offers particular advantages in dealing with small samples and non-linear pattern recognition in high dimensions. Gesture recognition for the augmented reality maintenance guiding system is realized by the SVM after granulation of all the characteristic features. The experimental results of the simulation of number gesture recognition and its application in the augmented reality maintenance guiding system show that the real-time performance and robustness of gesture recognition in the AR maintenance guiding system can be greatly enhanced by the improved SVM.
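
    As a concrete, simplified picture of the feature-plus-SVM pipeline described above, the Python sketch below computes Hu invariant moments from binary gesture silhouettes and trains a support vector classifier (here via scikit-learn, an assumption for illustration); the two synthetic silhouettes stand in for segmented hand images, and a real system would add the structural and Fourier-descriptor features the record mentions.

        import cv2
        import numpy as np
        from sklearn.svm import SVC

        def hu_features(mask):
            """Log-scaled Hu invariant moments of a binary silhouette."""
            hu = cv2.HuMoments(cv2.moments(mask)).ravel()
            return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

        def square(size=64):           # toy stand-in for one gesture class
            img = np.zeros((size, size), np.uint8)
            img[16:48, 16:48] = 255
            return img

        def circle(size=64):           # toy stand-in for another gesture class
            img = np.zeros((size, size), np.uint8)
            cv2.circle(img, (32, 32), 16, 255, -1)
            return img

        X = np.vstack([hu_features(square()), hu_features(circle())])
        y = np.array([0, 1])

        clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
        print(clf.predict([hu_features(circle())]))  # expected: [1]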

  18. PRISMA-MAR: An Architecture Model for Data Visualization in Augmented Reality Mobile Devices

    ERIC Educational Resources Information Center

    Gomes Costa, Mauro Alexandre Folha; Serique Meiguins, Bianchi; Carneiro, Nikolas S.; Gonçalves Meiguins, Aruanda Simões

    2013-01-01

    This paper proposes an extension to mobile augmented reality (MAR) environments--the addition of data charts to the more usual text, image and video components. To this purpose, we have designed a client-server architecture including the main necessary modules and services to provide an Information Visualization MAR experience. The server side…

  19. Making the Invisible Visible in Science Museums through Augmented Reality Devices

    ERIC Educational Resources Information Center

    Yoon, Susan A.; Wang, Joyce

    2014-01-01

    Despite the potential of augmented reality (AR) in enabling students to construct new understanding, little is known about how the processes and interactions with the multimedia lead to increased learning. This study seeks to explore the affordances of an AR tool on learning that is focused on the science concept of magnets and magnetic fields.…

  20. Exploring the Effect of Materials Designed with Augmented Reality on Language Learners' Vocabulary Learning

    ERIC Educational Resources Information Center

    Solak, Ekrem; Cakir, Recep

    2015-01-01

    The purpose of this study was to determine the motivational level of the participants in a language classroom towards course materials designed in accordance with augmented reality technology and to identify the correlation between academic achievement and motivational level. 130 undergraduate students from a state-run university in Turkey…

  1. Modeling Augmented Reality Games with Preservice Elementary and Secondary Science Teachers

    ERIC Educational Resources Information Center

    Burton, Erin Peters; Frazier, Wendy; Annetta, Leonard; Lamb, Richard; Cheng, Rebecca; Chmiel, Margaret

    2011-01-01

    Cell phones are ever-present in daily life, yet vastly underused in the formal science classroom. The purpose of this study was to implement a novel learning tool on cell phones, Augmented Reality Games, and determine how the interaction influenced preservice teachers' content knowledge and self-efficacy of cell phone use in schools. Results show…

  2. Augmented Reality M-Learning to Enhance Nursing Skills Acquisition in the Clinical Skills Laboratory

    ERIC Educational Resources Information Center

    Garrett, Bernard M.; Jackson, Cathryn; Wilson, Brian

    2015-01-01

    Purpose: This paper aims to report on a pilot research project designed to explore if new mobile augmented reality (AR) technologies have the potential to enhance the learning of clinical skills in the lab. Design/methodology/approach: An exploratory action-research-based pilot study was undertaken to explore an initial proof-of-concept design in…

  3. Integrated Authoring Tool for Mobile Augmented Reality-Based E-Learning Applications

    ERIC Educational Resources Information Center

    Lobo, Marcos Fermin; Álvarez García, Víctor Manuel; del Puerto Paule Ruiz, María

    2013-01-01

    Learning management systems are increasingly being used to complement classroom teaching and learning and in some instances even replace traditional classroom settings with online educational tools. Mobile augmented reality is an innovative trend in e-learning that is creating new opportunities for teaching and learning. This article proposes a…

  4. Alien Contact!: Exploring Teacher Implementation of an Augmented Reality Curricular Unit

    ERIC Educational Resources Information Center

    Mitchell, Rebecca

    2011-01-01

    This paper reports on findings from a five-teacher, exploratory case study, critically observing their implementation of a technology-intensive, augmented reality (AR) mathematics curriculum unit, along with its paper-based control. The unit itself was intended to promote multiple proportional-reasoning strategies with urban, public, middle school…

  5. Using Augmented Reality in Early Art Education: A Case Study in Hong Kong Kindergarten

    ERIC Educational Resources Information Center

    Huang, Yujia; Li, Hui; Fong, Ricci

    2016-01-01

    Innovation in pedagogy by technology integration in kindergarten classroom has always been a challenge for most teachers. This design-based research aimed to explore the feasibility of using Augmented Reality (AR) technology in early art education with a focus on the gains and pains of this innovation. A case study was conducted in a typical…

  6. Augmented Reality for Teaching Science Vocabulary to Postsecondary Education Students with Intellectual Disabilities and Autism

    ERIC Educational Resources Information Center

    McMahon, Don D.; Cihak, David F.; Wright, Rachel E.; Bell, Sherry Mee

    2016-01-01

    The purpose of this study was to examine the use of an emerging technology called augmented reality to teach science vocabulary words to college students with intellectual disability and autism spectrum disorders. One student with autism and three students with an intellectual disability participated in a multiple probe across behaviors (i.e.,…

  7. Assessing the Effectiveness of Learning Solid Geometry by Using an Augmented Reality-Assisted Learning System

    ERIC Educational Resources Information Center

    Lin, Hao-Chiang Koong; Chen, Mei-Chi; Chang, Chih-Kai

    2015-01-01

    This study integrates augmented reality (AR) technology into teaching activities to design a learning system that assists junior high-school students in learning solid geometry. The following issues are addressed: (1) the relationship between achievements in mathematics and performance in spatial perception; (2) whether system-assisted learning…

  8. Integrating Augmented Reality Technology to Enhance Children's Learning in Marine Education

    ERIC Educational Resources Information Center

    Lu, Su-Ju; Liu, Ying-Chieh

    2015-01-01

    Marine education comprises rich and multifaceted issues. Raising general awareness of marine environments and issues demands the development of new learning materials. This study adapts concepts from digital game-based learning to design an innovative marine learning program integrating augmented reality (AR) technology for lower grade primary…

  9. What Teachers Need to Know about Augmented Reality Enhanced Learning Environments

    ERIC Educational Resources Information Center

    Wasko, Christopher

    2013-01-01

    Augmented reality (AR) enhanced learning environments have been designed to teach a variety of subjects by having learners act like professionals in the field as opposed to students in a classroom. The environments, grounded in constructivist and situated learning theories, place students in a meaningful, non-classroom environment and force them…

  10. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    ERIC Educational Resources Information Center

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-01-01

    Technologies continue to evolve, and new tools with educational purposes are being developed. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on TC and MRN images, dissections and drawings. For ARBOOK evaluation, a…

  11. Examining Young Children's Perception toward Augmented Reality-Infused Dramatic Play

    ERIC Educational Resources Information Center

    Han, Jeonghye; Jo, Miheon; Hyun, Eunja; So, Hyo-jeong

    2015-01-01

    Amid the increasing interest in applying augmented reality (AR) in educational settings, this study explores the design and enactment of an AR-infused robot system to enhance children's satisfaction and sensory engagement with dramatic play activities. In particular, we conducted an exploratory study to empirically examine children's perceptions…

  12. Comparing Virtual and Location-Based Augmented Reality Mobile Learning: Emotions and Learning Outcomes

    ERIC Educational Resources Information Center

    Harley, Jason M.; Poitras, Eric G.; Jarrell, Amanda; Duffy, Melissa C.; Lajoie, Susanne P.

    2016-01-01

    Research on the effectiveness of augmented reality (AR) on learning exists, but there is a paucity of empirical work that explores the role that positive emotions play in supporting learning in such settings. To address this gap, this study compared undergraduate students' emotions and learning outcomes during a guided historical tour using mobile…

  13. Apply an Augmented Reality in a Mobile Guidance to Increase Sense of Place for Heritage Places

    ERIC Educational Resources Information Center

    Chang, Yu-Lien; Hou, Huei-Tse; Pan, Chao-Yang; Sung, Yao-Ting; Chang, Kuo-En

    2015-01-01

    Based on the sense of place theory and the design principles of guidance and interpretation, this study developed an augmented reality mobile guidance system that used a historical geo-context-embedded visiting strategy. This tool for heritage guidance and educational activities enhanced visitor sense of place. This study consisted of 3 visitor…

  14. The Use of Augmented Reality Games in Education: A Review of the Literature

    ERIC Educational Resources Information Center

    Koutromanos, George; Sofos, Alivisos; Avraamidou, Lucy

    2015-01-01

    This paper provides a review of the literature about the use of augmented reality in education and specifically in the context of formal and informal environments. It examines the research that has been conducted up to date on the use of those games through mobile technology devices such as mobile phones and tablets, both in primary and secondary…

  15. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-01-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…

  16. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research

    ERIC Educational Resources Information Center

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2013-01-01

    Augmented reality (AR) is currently considered as having potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper firstly has identified two major approaches of utilizing AR technology in science education,…

  17. Wearable computer for mobile augmented-reality-based controlling of an intelligent robot

    NASA Astrophysics Data System (ADS)

    Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino

    2000-10-01

    An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans will be needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.

  18. Augmented reality image guidance for minimally invasive coronary artery bypass

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Rueckert, Daniel; Hawkes, David; Casula, Roberto; Hu, Mingxing; Pedro, Ose; Zhang, Dong Ping; Penney, Graeme; Bello, Fernando; Edwards, Philip

    2008-03-01

    We propose a novel system for image guidance in totally endoscopic coronary artery bypass (TECAB). A key requirement is the availability of 2D-3D registration techniques that can deal with non-rigid motion and deformation. Image guidance for TECAB is mainly required before the mechanical stabilization of the heart, thus the most dominant source of non-rigid deformation is the motion of the beating heart. To augment the images in the endoscope of the da Vinci robot, we have to find the transformation from the coordinate system of the preoperative imaging modality to the system of the endoscopic cameras. In a first step we build a 4D motion model of the beating heart. Intraoperatively we can use the ECG or video processing to determine the phase of the cardiac cycle. We can then take the heart surface from the motion model and register it to the stereo-endoscopic images of the da Vinci robot using 2D-3D registration methods. We are investigating robust feature tracking and intensity-based methods for this purpose. Images of the vessels available in the preoperative coordinate system can then be transformed to the camera system and projected into the calibrated endoscope view using two video mixers with chroma keying. It is hoped that the augmented view can improve the efficiency of TECAB surgery and reduce the conversion rate to more conventional procedures.
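
    A minimal sketch of two of the steps described above, assuming a 4D motion model stored as a list of surfaces with associated cardiac phases and a calibrated pinhole endoscope camera (the function names, data layout, and projection model are illustrative assumptions, not the authors' implementation):

    ```python
    import numpy as np

    def select_phase(model_phases, surfaces, current_phase):
        """Pick the motion-model surface whose phase is closest to the
        ECG- or video-derived cardiac phase (phases expressed in [0, 1))."""
        phases = np.asarray(model_phases, dtype=float)
        diffs = np.abs((phases - current_phase + 0.5) % 1.0 - 0.5)  # circular distance
        return surfaces[int(np.argmin(diffs))]

    def project_points(points_3d, K, R, t):
        """Project Nx3 preoperative points (e.g. vessel centrelines) into the
        calibrated endoscope image using a registered rigid transform (R, t)."""
        cam = R @ points_3d.T + t.reshape(3, 1)   # preoperative frame -> camera frame
        uv = K @ cam                              # pinhole projection
        return (uv[:2] / uv[2]).T                 # Nx2 pixel coordinates
    ```

    In the described system the projected vessel overlay would then be keyed into the stereo endoscope video; the 2D-3D registration that produces (R, t) under non-rigid cardiac motion is the part the paper identifies as the core challenge.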

  19. Augmented reality: don't we all wish we lived in one?

    SciTech Connect

    Hayes, Birchard P; Michel, Kelly D; Few, Douglas A; Gertman, David; Le Blanc, Katya

    2010-01-01

    From stereophonic, positional sound to high-definition imagery that is crisp and clean, high fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments creating a mixed reality where relevant data and information augments the real or actual experience in real-time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers a new and innovative design information verification inspection capability, evaluation accuracy, and information gathering capability for nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.

  20. Mobile augmented reality in support of building damage and safety assessment

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kerle, N.; Gerke, M.

    2016-02-01

    Rapid and accurate assessment of the state of buildings in the aftermath of a disaster event is critical for an effective and timely response. For rapid damage assessment of buildings, the utility of remote sensing (RS) technology has been widely researched, with focus on a range of platforms and sensors. However, RS-based approaches still have limitations in assessing structural integrity and the specific damage status of individual buildings. Structural integrity here refers to the ability of a building's structural members to hold the entire structure together. Consequently, ground-based assessment conducted by structural engineers and first responders is still required. This paper demonstrates the concept of mobile augmented reality (mAR) to improve performance of building damage and safety assessment in situ. Mobile AR provides a means to superimpose various types of reference or pre-disaster information (virtual data) on actual post-disaster building data (real buildings). To adopt mobile AR, this study defines a conceptual framework based on the level of complexity (LOC). The framework consists of four LOCs, and for each of these, the data types, required processing steps, AR implementation and use for damage assessment are described. Based on this conceptualization we demonstrate prototypes of mAR for both indoor and outdoor purposes. Finally, we conduct a user evaluation of the prototypes to validate the mAR approach for building damage and safety assessment.

  1. Mobile Augmented Reality in support of building damage and safety assessment

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kerle, N.; Gerke, M.

    2015-04-01

    Rapid and accurate assessment of the state of buildings in the aftermath of a disaster event is critical for an effective and timely response. For rapid damage assessment of buildings, the utility of remote sensing (RS) technology has been widely researched, with focus on a range of platforms and sensors. However, RS-based approaches still have limitations in assessing structural integrity and the specific damage status of individual buildings. Consequently, ground-based assessment conducted by structural engineers and first responders is still required. This paper demonstrates the concept of mobile Augmented Reality (mAR) to improve performance of building damage and safety assessment in situ. Mobile AR provides a means to superimpose various types of reference or pre-disaster information (virtual data) on actual post-disaster building data (real buildings). To adopt mobile AR, this study defines a conceptual framework based on Level of Complexity (LOC). The framework consists of four LOCs, and for each of these the data types, required processing steps, AR implementation, and use for damage assessment are described. Based on this conceptualization we demonstrate prototypes of mAR for both indoor and outdoor purposes. Finally, we conduct a user evaluation of the prototypes to validate the mAR approach for building damage and safety assessment.

  2. A method for real-time generation of augmented reality work instructions via expert movements

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Bhaskar; Winer, Eliot

    2015-03-01

    Augmented Reality (AR) offers tremendous potential for a wide range of fields including entertainment, medicine, and engineering. AR allows digital models to be integrated with a real scene (typically viewed through a video camera) to provide useful information in a variety of contexts. The difficulty in authoring and modifying scenes is one of the biggest obstacles to widespread adoption of AR. 3D models must be created, textured, oriented and positioned to create the complex overlays viewed by a user. This often requires using multiple software packages in addition to performing model format conversions. In this paper, a new authoring tool is presented which uses a novel method to capture product assembly steps performed by a user with a depth+RGB camera. Through a combination of computer vision and image processing techniques, each individual step is decomposed into objects and actions. The objects are matched to those in a predetermined geometry library and the actions turned into animated assembly steps. The subsequent instruction set is then generated with minimal user input. A proof of concept is presented to establish the method's viability.
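
    The abstract does not give the matching algorithm, so the following is only a hypothetical sketch of the object-matching step: a segmented part from the depth+RGB capture is compared against a predefined geometry library using a crude bounding-box descriptor (the descriptor, names, and library format are assumptions for illustration, not the paper's method):

    ```python
    import numpy as np

    def bbox_descriptor(points):
        """Sorted axis-aligned bounding-box extents of a segmented 3D point cloud."""
        pts = np.asarray(points, dtype=float)
        extents = pts.max(axis=0) - pts.min(axis=0)
        return np.sort(extents)

    def match_to_library(segment_points, library):
        """library maps part name -> reference descriptor (numpy array);
        returns the name of the closest library part."""
        d = bbox_descriptor(segment_points)
        return min(library, key=lambda name: np.linalg.norm(library[name] - d))
    ```

    A production tool would use far richer shape descriptors, but the overall structure (segment, describe, nearest-neighbour match against the library) would be the same.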

  3. Comparing "pick and place" task in spatial Augmented Reality versus non-immersive Virtual Reality for rehabilitation setting.

    PubMed

    Khademi, Maryam; Hondori, Hossein Mousavi; Dodakian, Lucy; Cramer, Steve; Lopes, Cristina V

    2013-01-01

    Introducing computer games to the rehabilitation market led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefit to patients and caregivers, it has inherent limitations, some of which might be solved by replacing it with Augmented Reality (AR). The task of pick-and-place, which is part of many activities of daily living (ADLs), is one of the functions most affected by stroke and one that patients most expect to recover. We developed an exercise consisting of moving an object between various points, following a flashing light that indicates the next target. The results show superior performance of subjects in the spatial AR versus the non-immersive VR setting. This could be due to the extraneous hand-eye coordination required in VR, which is eliminated in spatial AR. PMID:24110762

  4. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics are some of the most revolutionary. Our work aims at setting up new techniques for detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems making tumor resection or treatment easier with the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on the same patient, virtually before the intervention and for real during the surgical procedure thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide an efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to a better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality that provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal procedure without errors, which will be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  5. Sensor-Aware Recognition and Tracking for Wide-Area Augmented Reality on Mobile Phones.

    PubMed

    Chen, Jing; Cao, Ruochen; Wang, Yongtian

    2015-01-01

    Wide-area registration in outdoor environments on mobile phones is a challenging task in the mobile augmented reality field. We present a sensor-aware large-scale outdoor augmented reality system for recognition and tracking on mobile phones. GPS and gravity information is used to improve the VLAD performance for recognition. A sensor-aware VLAD algorithm, which adapts itself to scenes at different scales, is used to recognize complex scenes. Because vision-based registration algorithms are fragile and tend to drift, data from inertial sensors and vision are fused by an extended Kalman filter (EKF) to achieve considerable improvements in tracking stability and robustness. Experimental results show that our method greatly enhances the recognition rate and eliminates tracking jitter. PMID:26690439
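
    As a rough illustration of the gyro-predict / vision-correct pattern behind the EKF fusion mentioned above, here is a deliberately simplified single-axis Kalman filter sketch; the actual system fuses full 6-DoF pose, and the class, interface, and noise values below are assumptions, not the authors' code:

    ```python
    class SingleAxisFilter:
        """Toy 1-axis filter: predict with the gyro rate, correct with a
        vision-derived angle. q and r are illustrative noise levels."""

        def __init__(self, q=1e-4, r=1e-2):
            self.angle = 0.0   # estimated orientation angle (rad)
            self.p = 1.0       # estimate variance
            self.q = q         # process noise (gyro drift per step)
            self.r = r         # measurement noise (vision jitter)

        def predict(self, gyro_rate, dt):
            self.angle += gyro_rate * dt     # integrate the inertial rate
            self.p += self.q

        def correct(self, vision_angle):
            k = self.p / (self.p + self.r)                 # Kalman gain
            self.angle += k * (vision_angle - self.angle)  # blend in the vision fix
            self.p *= (1.0 - k)
            return self.angle
    ```

    The same predict/correct split is what lets the reported system keep tracking smoothly between (and despite) noisy vision-based registration updates.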

  6. Comparative Evaluation of Monocular Augmented-Reality Display for Surgical Microscopes

    PubMed Central

    Palma, Santiago Rodríguez; Becker, Brian C.; Lobes, Louis A.; Riviere, Cameron N.

    2012-01-01

    Medical augmented reality has undergone great development recently. However, there is a lack of studies to compare quantitatively the different display options available. This paper compares the effects of different graphical overlay systems in a simple micromanipulation task with “soft” visual servoing. We compared positioning accuracy in a real-time visually-guided task using Micron, an active handheld tremor-canceling microsurgical instrument, using three different displays: 2D screen, 3D screen, and microscope with monocular image injection. Tested with novices and an experienced vitreoretinal surgeon, display of virtual cues in the microscope via an augmented reality injection system significantly (p < 0.05) decreased 3D error compared to the 2D and 3D monitors when confounding factors such as magnification level were normalized. PMID:23366164

  7. Sensor-Aware Recognition and Tracking for Wide-Area Augmented Reality on Mobile Phones

    PubMed Central

    Chen, Jing; Cao, Ruochen; Wang, Yongtian

    2015-01-01

    Wide-area registration in outdoor environments on mobile phones is a challenging task in the mobile augmented reality field. We present a sensor-aware large-scale outdoor augmented reality system for recognition and tracking on mobile phones. GPS and gravity information is used to improve the VLAD performance for recognition. A sensor-aware VLAD algorithm, which adapts itself to scenes at different scales, is used to recognize complex scenes. Because vision-based registration algorithms are fragile and tend to drift, data from inertial sensors and vision are fused by an extended Kalman filter (EKF) to achieve considerable improvements in tracking stability and robustness. Experimental results show that our method greatly enhances the recognition rate and eliminates tracking jitter. PMID:26690439

  8. Avoiding Focus Shifts in Surgical Telementoring Using an Augmented Reality Transparent Display.

    PubMed

    Andersen, Daniel; Popescu, Voicu; Cabrera, Maria Eugenia; Shanghavi, Aditya; Gomez, Gerardo; Marley, Sherri; Mullis, Brian; Wachs, Juan

    2016-01-01

    Conventional surgical telementoring systems require the trainee to shift focus away from the operating field to a nearby monitor to receive mentor guidance. This paper presents the next generation of telementoring systems. Our system, STAR (System for Telementoring with Augmented Reality) avoids focus shifts by placing mentor annotations directly into the trainee's field of view using augmented reality transparent display technology. This prototype was tested with pre-medical and medical students. Experiments were conducted where participants were asked to identify precise operating field locations communicated to them using either STAR or a conventional telementoring system. STAR was shown to improve accuracy and to reduce focus shifts. The initial STAR prototype only provides an approximate transparent display effect, without visual continuity between the display and the surrounding area. The current version of our transparent display provides visual continuity by showing the geometry and color of the operating field from the trainee's viewpoint. PMID:27046545

  9. Augmented Reality in a Simulated Tower Environment: Effect of Field of View on Aircraft Detection

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Adelstein, Bernard D.; Reisman, Ronald J.; Schmidt-Ott, Joelle R.; Gips, Jonathan; Krozel, Jimmy; Cohen, Malcolm (Technical Monitor)

    2002-01-01

    An optical see-through, augmented reality display was used to study subjects' ability to detect aircraft maneuvering and landing at the Dallas Ft. Worth International airport in an ATC Tower simulation. Subjects monitored the traffic patterns as if from the airport's western control tower. Three binocular fields of view (14 deg, 28 deg and 47 deg) were studied in an independent-groups design to measure the degradation in detection performance associated with the visual field restrictions. In a second experiment the 14 deg and 28 deg fields were presented either with 46% binocular overlap or 100% overlap for separate groups. The near-asymptotic results of the first experiment suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve performance, and those of the second experiment suggest that partial binocular overlap is feasible for augmented reality displays such as may be used for ATC tower applications.

  10. Vision-based registration for augmented reality system using monocular and binocular vision

    NASA Astrophysics Data System (ADS)

    Vallerand, Steve; Kanbara, Masayuki; Yokoya, Naokazu

    2003-05-01

    In vision-based augmented reality systems, the relation between the real and virtual worlds needs to be estimated to perform the registration of virtual objects. This paper suggests a vision-based registration method for video see-through augmented reality systems using binocular cameras, which increases the quality of the registration performed using three points of a known marker. The originality of this work is the use of both monocular vision-based and stereoscopic vision-based techniques in order to complete the registration. Also, a method that performs a correction of the 2D positions of the marker points in the images is proposed. The correction improves the registration stability and accuracy of the system. The stability of the registration obtained with the proposed registration method, with and without the correction method, is compared to the registration obtained with standard stereoscopic registration.
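
    For orientation, the generic marker-based pose estimate that such monocular registration builds on can be sketched with OpenCV's PnP solver. The sketch below uses the four corners of a square marker rather than the paper's three-point marker, so it is an illustrative simplification under that assumption, not the proposed method:

    ```python
    import cv2
    import numpy as np

    def marker_pose(corners_px, marker_size, K, dist=None):
        """Estimate the marker pose in the camera frame.
        corners_px: 4x2 detected corner pixels (ordered), marker_size: edge length in metres,
        K: 3x3 camera intrinsics, dist: distortion coefficients (optional)."""
        s = marker_size / 2.0
        object_pts = np.array([[-s,  s, 0], [ s,  s, 0],
                               [ s, -s, 0], [-s, -s, 0]], dtype=np.float32)
        image_pts = np.asarray(corners_px, dtype=np.float32)
        dist = np.zeros((5, 1)) if dist is None else dist
        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
        return ok, rvec, tvec   # rotation (Rodrigues vector) and translation
    ```

    With binocular cameras, a second estimate from the other camera (or triangulated 3D marker points) can then be combined with this monocular result, which is the spirit of the hybrid approach the abstract describes.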

  11. Tracking and registration method based on vector operation for augmented reality system

    NASA Astrophysics Data System (ADS)

    Gao, Yanfei; Wang, Hengyou; Bian, Xiaoning

    2015-08-01

    Tracking and registration is a key issue for an augmented reality (AR) system. For marker-based AR systems, research focuses on detecting the real-time position and orientation of the camera. In this paper, we describe a method of tracking and registration using vector operations. Our method proves to be stable and accurate, and it has good real-time performance.

  12. How to reduce workload--augmented reality to ease the work of air traffic controllers.

    PubMed

    Hofmann, Thomas; König, Christina; Bruder, Ralph; Bergner, Jörg

    2012-01-01

    In the future, air traffic will rise, and the workload of the controllers will rise with it. In the BMWi research project, one of the tasks is how to ensure safe air traffic and a reasonable workload for the air traffic controllers. The goal of this project was to find ways to reduce the workload (and stress) of the controllers and allow safe air traffic, especially at large hub airports, by implementing augmented reality visualization and interaction. PMID:22316878

  13. Technical experience from clinical studies with INPRES and a concept for a miniature augmented reality system

    NASA Astrophysics Data System (ADS)

    Sudra, Gunther; Marmulla, Ruediger; Salb, Tobias; Gockel, Tilo; Eggers, Georg; Giesler, Bjoern; Ghanai, Sassan; Fritz, Dominik; Dillmann, Ruediger; Muehling, Joachim

    2005-04-01

    This paper presents a summary of our technical experience with the INPRES system, an augmented reality system based upon a tracked see-through head-mounted display. With INPRES, a complete augmented reality solution has been developed that has crucial advantages when compared with previous navigation systems. Using these techniques the surgeon does not need to turn his head from the patient to the computer monitor and vice versa. The system's purpose is to display virtual objects, e.g. cutting trajectories, tumours and risk areas from computer-based surgical planning systems, directly in the surgical site. The INPRES system was evaluated in several patient experiments in craniofacial surgery at the Department of Oral and Maxillofacial Surgery/University of Heidelberg. We discuss the technical advantages as well as the limitations of INPRES and, as a result, present two strategies. On the one hand, we will improve the existing and successful INPRES system with new hardware and a new calibration method to compensate for the stated disadvantages. On the other hand, we will focus on miniaturized augmented reality systems and present a new concept based on fibre optics. This new system should be easily adaptable to surgical instruments and capable of projecting small structures. It consists of a source of light, a miniature TFT display, a fibre optic cable and a tool grip. Compared to established projection systems, it has the capability of projecting into areas that are only accessible by a narrow path. No wide surgical exposure of the region is necessary for the use of augmented reality.

  14. Augmented reality three-dimensional object visualization and recognition with axially distributed sensing.

    PubMed

    Markman, Adam; Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-01-15

    An augmented reality (AR) smartglass display combines real-world scenes with digital information enabling the rapid growth of AR-based applications. We present an augmented reality-based approach for three-dimensional (3D) optical visualization and object recognition using axially distributed sensing (ADS). For object recognition, the 3D scene is reconstructed, and feature extraction is performed by calculating the histogram of oriented gradients (HOG) of a sliding window. A support vector machine (SVM) is then used for classification. Once an object has been identified, the 3D reconstructed scene with the detected object is optically displayed in the smartglasses allowing the user to see the object, remove partial occlusions of the object, and provide critical information about the object such as 3D coordinates, which are not possible with conventional AR devices. To the best of our knowledge, this is the first report on combining axially distributed sensing with 3D object visualization and recognition for applications to augmented reality. The proposed approach can have benefits for many applications, including medical, military, transportation, and manufacturing. PMID:26766698
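
    A minimal sketch of the recognition stage described above (HOG features over a sliding window, classified with an SVM); the axially distributed sensing reconstruction itself is omitted and a 2D reconstructed grayscale image is assumed as input, so this is illustrative only and not the authors' implementation:

    ```python
    from skimage.feature import hog
    from sklearn.svm import LinearSVC

    def hog_descriptor(window):
        """HOG feature vector for one image window."""
        return hog(window, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

    def sliding_windows(image, size=64, step=16):
        """Yield (origin, window) pairs over a 2D grayscale image."""
        h, w = image.shape
        for y in range(0, h - size + 1, step):
            for x in range(0, w - size + 1, step):
                yield (x, y), image[y:y + size, x:x + size]

    def train_classifier(pos_windows, neg_windows):
        """Fit a linear SVM on HOG descriptors of labelled training windows."""
        X = [hog_descriptor(w) for w in pos_windows] + [hog_descriptor(w) for w in neg_windows]
        y = [1] * len(pos_windows) + [0] * len(neg_windows)
        return LinearSVC().fit(X, y)

    def detect(image, clf):
        """Return origins of windows the SVM classifies as the target object."""
        return [(x, y) for (x, y), win in sliding_windows(image)
                if clf.predict([hog_descriptor(win)])[0] == 1]
    ```

    In the reported system the detected object is then re-rendered from the reconstructed 3D scene and displayed in the smartglasses together with metadata such as its 3D coordinates.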

  15. Does Augmented Reality Affect High School Students' Learning Outcomes in Chemistry?

    NASA Astrophysics Data System (ADS)

    Renner, Jonathan Christopher

    Some teens may prefer using a self-directed, constructivist, and technological approach to learning rather than traditional classroom instruction. If such a preference can be demonstrated, educators may adjust their teaching methodology. The guiding research question for this study focused on how augmented reality affects high school students' learning outcomes in chemistry, as measured by a pretest and posttest methodology, while ensuring that the individual outcomes were not the result of group collaboration. This study employed a quantitative, quasi-experimental study design that used a comparison and an experimental group. Inferential statistical analysis was employed. The study was conducted at a high school in southwest Colorado. Eighty-nine respondents returned completed and signed consent forms, and 78 participants completed the study. Results demonstrated that augmented reality instruction caused posttest scores to increase significantly, as compared to pretest scores, but it was not as effective as traditional classroom instruction. Scores did improve under both types of instruction; therefore, more research is needed in this area. The present study was the first quantitative experiment controlling for individual learning to validate augmented reality using mobile handheld digital devices that affected individual students' learning outcomes without group collaboration. This topic is important to the field of education as it may help educators understand how students learn, and it may also change the way students are taught.

  16. Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment

    PubMed Central

    López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.

    2013-01-01

    Purpose. This work presents the protocol carried out in the development and validation of an augmented reality system which was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that this system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used, the other one was the control experiment, and the system was not used. The type of operation selected for all cases was a cholecystectomy due to the low degree of complexity and complications before, during, and after the surgery. The technique used in the placement of trocars was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained of twenty-four patients (randomly assigned in each experiment) were involved in these experiments. The final results show an improvement in accuracy and variability of 33% and 63%, respectively, in comparison to traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293

  17. 3D interactive augmented reality-enhanced digital learning systems for mobile devices

    NASA Astrophysics Data System (ADS)

    Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie

    2013-03-01

    With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology. Enhanced UX can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems. However, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components: markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX in digital learning can be greatly improved with the adoption of the proposed IARL systems.
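
    The abstract does not specify the VOT algorithm; as a hedged illustration of what velocity-based tracking typically means, here is a constant-velocity predictor that could seed the search window for the next frame (the function name and interface are assumptions for illustration only):

    ```python
    import numpy as np

    def predict_next_position(prev_pos, curr_pos, dt_prev, dt_next):
        """Extrapolate a 2D target position assuming constant velocity between frames.
        prev_pos/curr_pos: (x, y) detections; dt_prev/dt_next: frame intervals in seconds."""
        velocity = (np.asarray(curr_pos, float) - np.asarray(prev_pos, float)) / dt_prev
        return np.asarray(curr_pos, float) + velocity * dt_next
    ```

    Restricting the pattern-recognition search to a window around the predicted position is one common way such trackers keep frame rates acceptable on mobile hardware.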

  18. Augmented reality and cone beam CT guidance for transoral robotic surgery.

    PubMed

    Liu, Wen P; Richmon, Jeremy D; Sorger, Jonathan M; Azizian, Mahdi; Taylor, Russell H

    2015-09-01

    In transoral robotic surgery preoperative image data do not reflect large deformations of the operative workspace from perioperative setup. To address this challenge, in this study we explore image guidance with cone beam computed tomographic angiography to guide the dissection of critical vascular landmarks and resection of base-of-tongue neoplasms with adequate margins for transoral robotic surgery. We identify critical vascular landmarks from perioperative c-arm imaging to augment the stereoscopic view of a da Vinci si robot in addition to incorporating visual feedback from relative tool positions. Experiments resecting base-of-tongue mock tumors were conducted on a series of ex vivo and in vivo animal models comparing the proposed workflow for video augmentation to standard non-augmented practice and alternative, fluoroscopy-based image guidance. Accurate identification of registered augmented critical anatomy during controlled arterial dissection and en bloc mock tumor resection was possible with the augmented reality system. The proposed image-guided robotic system also achieved improved resection ratios of mock tumor margins (1.00) when compared to control scenarios (0.0) and alternative methods of image guidance (0.58). The experimental results show the feasibility of the proposed workflow and advantages of cone beam computed tomography image guidance through video augmentation of the primary stereo endoscopy as compared to control and alternative navigation methods. PMID:26531203

  19. Impact of Soft Tissue Heterogeneity on Augmented Reality for Liver Surgery.

    PubMed

    Haouchine, Nazim; Cotin, Stephane; Peterlik, Igor; Dequidt, Jeremie; Lopez, Mario Sanz; Kerrien, Erwan; Berger, Marie-Odile

    2015-05-01

    This paper presents a method for real-time augmented reality of internal liver structures during minimally invasive hepatic surgery. Vessels and tumors computed from pre-operative CT scans can be overlaid onto the laparoscopic view for surgery guidance. Compared to current methods, our method is able to locate the in-depth positions of the tumors based on partial three-dimensional liver tissue motion using a real-time biomechanical model. This model makes it possible to properly handle the motion of internal structures even in the case of anisotropic or heterogeneous tissues, as is the case for the liver and many anatomical structures. Experiments conducted on a phantom liver measure the accuracy of the augmentation, while real-time augmentation on an in vivo human liver during real surgery shows the benefits of such an approach for minimally invasive surgery. PMID:26357206

  20. Augmented reality system for MR-guided interventions: phantom studies and first animal test

    NASA Astrophysics Data System (ADS)

    Vogt, Sebastian; Wacker, Frank; Khamene, Ali; Elgort, Daniel R.; Sielhorst, Tobias; Niemann, Heinrich; Duerk, Jeff; Lewin, Jonathan S.; Sauer, Frank

    2004-05-01

    We developed an augmented reality navigation system for MR-guided interventions. A head-mounted display provides, in real time, a stereoscopic video view of the patient, which is augmented with three-dimensional medical information to perform MR-guided needle placement procedures. Besides the MR image information, we augment the scene with 3D graphics representing a forward extension of the needle and the needle itself. During insertion, the needle can be observed virtually at its actual location in real time, supporting the interventional procedure in an efficient and intuitive way. In this paper we report on quantitative results of AR-guided needle placement procedures on gel phantoms with embedded targets of 12 mm and 6 mm diameter; we furthermore evaluate our first animal experiment involving needle insertion into deep-lying anatomical structures of a pig.

  1. Planning, simulation, and augmented reality for robotic cardiac procedures: The STARS system of the ChIR team.

    PubMed

    Coste-Manière, Eve; Adhami, Louaï; Mourgues, Fabien; Carpentier, Alain

    2003-04-01

    This paper presents STARS (Simulation and Transfer Architecture for Robotic Surgery), a versatile system that aims at enhancing minimally invasive robotic surgery through patient-dependent optimized planning, realistic simulation, safe supervision, and augmented reality. The underlying architecture of the proposed approach is presented, and then each component is detailed. An experimental validation is conducted on a dog for a coronary bypass intervention using the Da Vinci(TM) surgical system, focusing on planning, registration, and augmented reality trials. PMID:12838484

  2. In Vivo versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial.

    PubMed

    Botella, Cristina; Pérez-Ara, M Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María

    2016-01-01

    Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. "One-session treatment" guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants' expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants from in vivo exposure condition considered the treatment more useful for their problem whereas participants from Augmented Reality exposure considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and well accepted by the participants. PMID:26886423

  3. In Vivo versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial

    PubMed Central

    Botella, Cristina; Pérez-Ara, M. Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María

    2016-01-01

    Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. “One-session treatment” guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants’ expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants from in vivo exposure condition considered the treatment more useful for their problem whereas participants from Augmented Reality exposure considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and well accepted by the participants. PMID:26886423

  4. A Novel Augmented Reality Navigation System for Endoscopic Sinus and Skull Base Surgery: A Feasibility Study

    PubMed Central

    Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei

    2016-01-01

    Objective To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. Materials and Methods In this study we performed a head phantom and cadaver experiment to determine the display effect and accuracy of our navigational system. We compared cadaver head-based simulated operations, the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system to conventional navigation systems. Results The navigation system developed in this study has a novel display mode capable of fusing endoscopic images to three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which met the accepted standards of a navigation system used for nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. Conclusion The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigational system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigational system has potential to increase the quality of surgeries. In addition, the augmented reality navigational system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon’s skills and knowledge, not as a substitute. PMID:26757365

  5. Augmented Reality for the Assessment of Children's Spatial Memory in Real Settings

    PubMed Central

    Juan, M.-Carmen; Mendez-Lopez, Magdalena; Perez-Hernandez, Elena; Albiol-Perez, Sergio

    2014-01-01

    Short-term memory can be defined as the capacity for holding a small amount of information in mind in an active state for a short period of time. Although some instruments have been developed to study spatial short-term memory in real environments, there are no instruments that are specifically designed to assess visuospatial short-term memory in an attractive way to children. In this paper, we present the ARSM (Augmented Reality Spatial Memory) task, the first Augmented Reality task that involves a user's movement to assess spatial short-term memory in healthy children. The experimental procedure of the ARSM task was designed to assess the children's skill to retain visuospatial information. They were individually asked to remember the real place where augmented reality objects were located. The children (N = 76) were divided into two groups: preschool (5–6 year olds) and primary school (7–8 year olds). We found a significant improvement in ARSM task performance in the older group. The correlations between scores for the ARSM task and traditional procedures were significant. These traditional procedures were the Dot Matrix subtest for the assessment of visuospatial short-term memory of the computerized AWMA-2 battery and a parent's questionnaire about a child's everyday spatial memory. Hence, we suggest that the ARSM task has high verisimilitude with spatial short-term memory skills in real life. In addition, we evaluated the ARSM task's usability and perceived satisfaction. The study revealed that the younger children were more satisfied with the ARSM task. This novel instrument could be useful in detecting visuospatial short-term difficulties that affect specific developmental navigational disorders and/or school academic achievement. PMID:25438146

  6. Real-time self-calibration of a tracked augmented reality display

    NASA Astrophysics Data System (ADS)

    Baum, Zachary; Lasso, Andras; Ungi, Tamas; Fichtinger, Gabor

    2016-03-01

    PURPOSE: Augmented reality systems have been proposed for image-guided needle interventions but they have not become widely used in clinical practice due to restrictions such as limited portability, low display refresh rates, and tedious calibration procedures. We propose a handheld tablet-based self-calibrating image overlay system. METHODS: A modular handheld augmented reality viewbox was constructed from a tablet computer and a semi-transparent mirror. A consistent and precise self-calibration method, without the use of any temporary markers, was designed to achieve an accurate calibration of the system. Markers attached to the viewbox and patient are simultaneously tracked using an optical pose tracker to report the position of the patient with respect to a displayed image plane that is visualized in real-time. The software was built using the open-source 3D Slicer application platform's SlicerIGT extension and the PLUS toolkit. RESULTS: The accuracy of the image overlay with image-guided needle interventions yielded a mean absolute position error of 0.99 mm (95th percentile 1.93 mm) in-plane of the overlay and a mean absolute position error of 0.61 mm (95th percentile 1.19 mm) out-of-plane. This accuracy is clinically acceptable for tool guidance during various procedures, such as musculoskeletal injections. CONCLUSION: A self-calibration method was developed and evaluated for a tracked augmented reality display. The results show potential for the use of handheld image overlays in clinical studies with image-guided needle interventions.
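
    The calibration details are specific to the authors' SlicerIGT/PLUS implementation; the sketch below only illustrates the generic transform chain such a tracked overlay relies on, expressing the tracked patient pose in the display's image-plane frame (the matrix names and frame conventions are illustrative assumptions):

    ```python
    import numpy as np

    def invert_rigid(T):
        """Invert a rigid 4x4 homogeneous transform."""
        R, t = T[:3, :3], T[:3, 3]
        Tinv = np.eye(4)
        Tinv[:3, :3] = R.T
        Tinv[:3, 3] = -R.T @ t
        return Tinv

    def patient_to_image_plane(T_tracker_patient, T_tracker_viewbox, T_viewbox_image):
        """Compose the tracker's reports for the patient and viewbox markers with the
        (self-)calibrated viewbox-to-image-plane transform."""
        return T_viewbox_image @ invert_rigid(T_tracker_viewbox) @ T_tracker_patient
    ```

    The self-calibration described in the paper is what determines the viewbox-to-image-plane term without temporary markers; once known, the chain above can be evaluated every frame from the live tracker data.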

  7. Augmented reality for the assessment of children's spatial memory in real settings.

    PubMed

    Juan, M-Carmen; Mendez-Lopez, Magdalena; Perez-Hernandez, Elena; Albiol-Perez, Sergio

    2014-01-01

    Short-term memory can be defined as the capacity for holding a small amount of information in mind in an active state for a short period of time. Although some instruments have been developed to study spatial short-term memory in real environments, there are no instruments that are specifically designed to assess visuospatial short-term memory in an attractive way to children. In this paper, we present the ARSM (Augmented Reality Spatial Memory) task, the first Augmented Reality task that involves a user's movement to assess spatial short-term memory in healthy children. The experimental procedure of the ARSM task was designed to assess the children's skill to retain visuospatial information. They were individually asked to remember the real place where augmented reality objects were located. The children (N = 76) were divided into two groups: preschool (5-6 year olds) and primary school (7-8 year olds). We found a significant improvement in ARSM task performance in the older group. The correlations between scores for the ARSM task and traditional procedures were significant. These traditional procedures were the Dot Matrix subtest for the assessment of visuospatial short-term memory of the computerized AWMA-2 battery and a parent's questionnaire about a child's everyday spatial memory. Hence, we suggest that the ARSM task has high verisimilitude with spatial short-term memory skills in real life. In addition, we evaluated the ARSM task's usability and perceived satisfaction. The study revealed that the younger children were more satisfied with the ARSM task. This novel instrument could be useful in detecting visuospatial short-term difficulties that affect specific developmental navigational disorders and/or school academic achievement. PMID:25438146

  8. Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.

    PubMed

    Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson

    2016-04-01

    We proposed a switchable beam steering device with a cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts the incident circularly polarized light to the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading two to three CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we proposed two approaches to obtain the same diffraction angle for red, green, and blue LED-based full-color displays. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for emerging VR/AR displays. PMID:27137019

  9. Building a hybrid patient's model for augmented reality in surgery: a registration problem.

    PubMed

    Lavallée, S; Cinquin, P; Szeliski, R; Peria, O; Hamadeh, A; Champleboux, G; Troccaz, J

    1995-03-01

    In the field of Augmented Reality in Surgery, building a hybrid patient's model, i.e. merging all the data and systems available for a given application, is a difficult but crucial technical problem. The purpose is to merge all the data that constitute the patient model with the reality of the surgery, i.e. the surgical tools and feedback devices. In this paper, we first develop this concept, we show that this construction comes to a problem of registration between various sensor data, and we detail a general framework of registration. The state of the art in this domain is presented. Finally, we show results that we have obtained using a method which is based on the use of anatomical reference surfaces. We show that in many clinical cases, registration is only possible through the use of internal patient structures. PMID:7554833

  10. Using Augmented Reality to Elicit Pretend Play for Children with Autism.

    PubMed

    Zhen Bai; Blackwell, Alan F; Coulouris, George

    2015-05-01

    Children with autism spectrum condition (ASC) suffer from deficits or developmental delays in symbolic thinking. In particular, they are often found lacking in pretend play during early childhood. Researchers believe that they encounter difficulty in generating and maintaining mental representation of pretense coupled with the immediate reality. We have developed an interactive system that explores the potential of Augmented Reality (AR) technology to visually conceptualize the representation of pretense within an open-ended play environment. Results from an empirical study involving children with ASC aged 4 to 7 demonstrated a significant improvement of pretend play in terms of frequency, duration and relevance using the AR system in comparison to a non computer-assisted situation. We investigated individual differences, skill transfer, system usability and limitations of the proposed AR system. We discuss design guidelines for future AR systems for children with ASC and other pervasive developmental disorders. PMID:26357207

  11. Social Gaming and Learning Applications: A Driving Force for the Future of Virtual and Augmented Reality?

    NASA Astrophysics Data System (ADS)

    Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang

    Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent a widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.

  12. An Augmented Reality Nanomanipulator for Learning Nanophysics: The "NanoLearner" Platform

    NASA Astrophysics Data System (ADS)

    Marchi, Florence; Marliere, Sylvain; Florens, Jean Loup; Luciani, Annie; Chevrier, Joel

    The work focuses on the description and evaluation of an augmented reality nanomanipulator, the "NanoLearner" platform, used as an educational tool in practical nanophysics classes. Through virtual reality associated with multisensory renderings, students are immersed in the nanoworld, where they can interact in real time with a sample surface or an object, using senses such as hearing, sight and touch. The role of each sensory rendering in the understanding and control of the "approach-retract" interaction has been determined through statistical studies conducted during the practical sessions. Finally, we present two extensions of the use of this innovative tool: investigating nano effects in living organisms and giving the general public access to a natural understanding of nanophenomena.

  13. Enhancing a Multi-body Mechanism with Learning-Aided Cues in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Singh Sidhu, Manjit

    2013-06-01

    Augmented Reality (AR) is a potential area of research for education, covering issues such as tracking and calibration, and realistic rendering of virtual objects. The ability to augment the real world with virtual information has opened the possibility of using AR technology in areas such as education and training as well. In the domain of Computer Aided Learning (CAL), researchers have long been looking into enhancing the effectiveness of the teaching and learning process by providing cues that could assist learners to better comprehend the materials presented. Although a number of studies have looked into the effectiveness of learning-aided cues, none has addressed this issue for AR-based learning solutions. This paper discusses the design and model of AR-based software that uses visual cues to enhance the learning process, together with the perception results obtained for these cues.

  14. GyroWand: An Approach to IMU-Based Raycasting for Augmented Reality.

    PubMed

    Hincapié-Ramos, Juan David; Özacar, Kasim; Irani, Pourang P; Kitamura, Yoshifumi

    2016-01-01

    Optical see-through head-mounted displays enable augmented reality (AR) applications that display virtual objects overlaid on the real world. At the core of this new generation of devices are low-cost tracking technologies that allow us to interpret users' motion in the real world in relation to the virtual content for the purposes of navigation and interaction. The advantages of pervasive tracking come at the cost of limiting interaction possibilities, however. To address these challenges the authors introduce GyroWand, a raycasting technique for AR HMDs using inertial measurement unit (IMU) rotational data from a handheld controller. PMID:26960031
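
    As an illustration of IMU-based raycasting in general (not GyroWand's actual disambiguation logic), a selection ray can be obtained by rotating a forward vector with the controller's orientation quaternion; the frame conventions below are assumptions:

    ```python
    import numpy as np

    def quat_rotate(q, v):
        """Rotate vector v by the unit quaternion q = (w, x, y, z)."""
        w, x, y, z = q
        u = np.array([x, y, z], dtype=float)
        v = np.asarray(v, dtype=float)
        return 2.0 * np.dot(u, v) * u + (w * w - np.dot(u, u)) * v + 2.0 * w * np.cross(u, v)

    def controller_ray(origin, q_controller, forward=(0.0, 0.0, -1.0)):
        """Return the ray origin and unit direction in the AR world frame,
        given the handheld controller's IMU orientation quaternion."""
        direction = quat_rotate(q_controller, np.asarray(forward))
        return np.asarray(origin, dtype=float), direction / np.linalg.norm(direction)
    ```

    The hard part, which GyroWand addresses, is deciding how to anchor and reset this ray when the IMU frame drifts relative to the HMD's world frame; that logic is not reproduced here.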

  15. Conceiving a specific holographic combiner for an augmented reality HMD dedicated to surgical applications

    NASA Astrophysics Data System (ADS)

    Sittler, Gilles; Twardowski, Patrice; Fasquel, Jean-Baptiste; Fontaine, Joël

    2006-04-01

    In this paper, we present the design of a holographic combiner for an augmented reality Head Mounted Display (HMD) dedicated to surgical applications. The recording of this holographic component has been performed at the Laboratoire des Systemes Photoniques (LSP) in Strasbourg, France. We present two different approaches for the recording of such a component: one using plane waves, and the other using spherical waves. The setup for the first approach has been developed and built, so measurements of the diffraction efficiency can be shown. For the second recording approach, we have performed numerical simulations to find the recording setup that best fits our specifications.

  16. AUVA - Augmented Reality Empowers Visual Analytics to explore Medical Curriculum Data.

    PubMed

    Nifakos, Sokratis; Vaitsis, Christos; Zary, Nabil

    2015-01-01

    Medical curriculum data play a key role in the structure and organization of medical programs in universities around the world. The effective processing and use of these data may improve the educational environment of medical students. As a consequence, the new generation of health professionals would have improved skills compared with previous generations. This study introduces the process of enhancing curriculum data with augmented reality technology as a management and presentation tool. The final goal is to enrich the information presented by a visual analytics approach applied to medical curriculum data while keeping the complexity of understanding these data low. PMID:25991196

  17. Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach

    PubMed Central

    Tian, Yuan; Guan, Tao; Wang, Cheng

    2010-01-01

    To produce a realistic augmentation in Augmented Reality, the correct relative positions of real objects and virtual objects are very important. In this paper, we propose a novel real-time occlusion handling method based on an object tracking approach. Our method is divided into three steps: selection of the occluding object, object tracking and occlusion handling. The user selects the occluding object using an interactive segmentation method. The contour of the selected object is then tracked in the subsequent frames in real-time. In the occlusion handling step, all the pixels on the tracked object are redrawn on the unprocessed augmented image to produce a new synthesized image in which the relative position between the real and virtual object is correct. The proposed method has several advantages. First, it is robust and stable, since it remains effective when the camera is moved through large changes of viewing angles and volumes or when the object and the background have similar colors. Second, it is fast, since the real object can be tracked in real-time. Last, a smoothing technique provides seamless merging between the augmented and virtual object. Several experiments are provided to validate the performance of the proposed method. PMID:22319278
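
    The final compositing step described above can be sketched in a few lines: pixels inside the tracked object's mask are copied from the live camera frame over the augmented frame so that the real object correctly occludes the virtual content. A binary mask produced by the contour tracker is assumed; this is a generic illustration rather than the authors' implementation:

    ```python
    import numpy as np

    def handle_occlusion(camera_frame, augmented_frame, object_mask):
        """camera_frame / augmented_frame: HxWx3 images; object_mask: HxW boolean array
        marking the tracked occluding object. Returns the corrected composite."""
        out = augmented_frame.copy()
        out[object_mask] = camera_frame[object_mask]   # real object drawn over virtual content
        return out
    ```

    The smoothing mentioned in the abstract would additionally feather the mask boundary so the copied pixels blend seamlessly with the rendered overlay.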

  18. The Effects of Augmented Reality-based Otago Exercise on Balance, Gait, and Falls Efficacy of Elderly Women

    PubMed Central

    Yoo, Ha-na; Chung, EunJung; Lee, Byoung-Hee

    2013-01-01

    [Purpose] The purpose of this study was to determine the effects of augmented reality-based Otago exercise on balance, gait, and falls efficacy of elderly women. [Subjects] The subjects were 21 elderly women, who were randomly divided into two groups: an augmented reality-based Otago exercise group of 10 subjects and an Otago exercise group of 11 subjects. [Methods] All subjects were evaluated for balance (Berg Balance Scale, BBS), gait parameters (velocity, cadence, step length, and stride length), and falls efficacy. Within 12 weeks, Otago exercise for muscle strengthening and balance training was conducted three times, for a period of 60 minutes each, and subjects in the experimental group performed augmented reality-based Otago exercise. [Results] Following intervention, the augmented reality-based Otago exercise group showed significant increases in BBS, velocity, cadence, step length (right side), stride length (right side and left side) and falls efficacy. [Conclusion] The results of this study suggest the feasibility and suitability of this augmented reality-based Otago exercise for elderly women. PMID:24259856

  19. Using augmented reality as a clinical support tool to assist combat medics in the treatment of tension pneumothoraces.

    PubMed

    Wilson, Kenneth L; Doswell, Jayfus T; Fashola, Olatokunbo S; Debeatham, Wayne; Darko, Nii; Walker, Travelyan M; Danner, Omar K; Matthews, Leslie R; Weaver, William L

    2013-09-01

    The aim of this study was to extrapolate potential roles of augmented reality goggles as a clinical support tool assisting in the reduction of preventable causes of death on the battlefield. Our pilot study was designed to improve medic performance in accurately placing a large-bore catheter to release a tension pneumothorax (prehospital setting) while using augmented reality goggles. Thirty-four preclinical medical students recruited from Morehouse School of Medicine performed needle decompressions on human cadaver models after hearing a brief training lecture on tension pneumothorax management. Clinical vignettes identifying cadavers as having life-threatening tension pneumothoraces as a consequence of improvised explosive device attacks were used. The study group (n = 13) performed needle decompression using augmented reality goggles, whereas the control group (n = 21) relied solely on memory from the lecture. The two groups were compared according to their ability to accurately complete the steps required to decompress a tension pneumothorax. The medical students using augmented reality goggle support were able to treat the tension pneumothorax on the human cadaver models more accurately than the students relying on their memory (p < 0.008). Although the augmented reality group required more time to complete the needle decompression intervention (p = 0.0684), this difference did not reach statistical significance. PMID:24005547

  20. Optical augmented reality assisted navigation system for neurosurgery teaching and planning

    NASA Astrophysics Data System (ADS)

    Wu, Hui-Qun; Geng, Xing-Yun; Wang, Li; Zhang, Yuan-Peng; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng

    2013-07-01

    This paper proposes a convenient navigation system, based on augmented reality (AR), for neurosurgeons' preoperative planning and teaching; it maps three-dimensional reconstructed virtual anatomical structures onto a skull model. The system comprises two parts: a virtual reality system and a skull model scene. In our experiment, a 73-year-old right-handed man initially diagnosed with astrocytoma was selected as an example to verify the system. His imaging data from different modalities were registered, and the skull soft tissue, brain, internal vessels, and tumor were reconstructed. The reconstructed models were then overlaid on the real scene. Our findings showed that the reconstructed tissues were augmented into the real scene and that the registration results were in good alignment; the reconstructed brain tissue was well distributed within the skull cavity. A probe was used by a neurosurgeon to explore a surgical pathway that could reach the tumor directly without injuring important vessels. In this way, the learning cost for students and the cost of educating patients about surgical risks are reduced. Therefore, this system could be a selective protocol for image-guided surgery (IGS) and is promising for neurosurgeons' preoperative planning and teaching.

  1. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    NASA Astrophysics Data System (ADS)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

    We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
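
    The 'modality server' pattern described above can be sketched as a thin socket wrapper around a recognition package, so that clients depend only on a small network protocol rather than on the vendor API. The host, port, message format, and function names below are assumptions for illustration only; they are not the authors' implementation.

      import socket
      import threading

      def run_modality_server(recognize, host="127.0.0.1", port=9000):
          # 'recognize' encapsulates a commercial or research package (e.g. a speech
          # recognizer); the server exposes it behind a simple request/reply socket.
          srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
          srv.bind((host, port))
          srv.listen()
          while True:
              conn, _ = srv.accept()
              threading.Thread(target=_handle, args=(conn, recognize), daemon=True).start()

      def _handle(conn, recognize):
          with conn:
              request = conn.recv(65536)            # e.g. an audio chunk or a command
              reply = recognize(request)            # call into the encapsulated package
              conn.sendall(str(reply).encode())     # return a high-level, vendor-neutral reply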

  2. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

    Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. The system is realized with a digital projector, and a general back-projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. A corresponding calibration method is also designed to obtain the parameters of the projector. To validate the proposed back-projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projection indication accuracy of the system is verified with a subpixel pattern-projection technique. PMID:27410124
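
    The role of the calibration step can be illustrated by treating the projector as an inverse camera: given 3D points (here synthetic, standing in for points measured by external positioning equipment) and the projector pixels that illuminate them, the extrinsic parameters can be recovered with a standard PnP solver. The intrinsic matrix, point set, and OpenCV calls below are illustrative assumptions and do not reproduce the paper's general back-projection model or its optimization.

      import cv2
      import numpy as np

      # Assumed projector intrinsics (focal lengths and principal point, in pixels).
      K = np.array([[1400.0, 0.0, 640.0],
                    [0.0, 1400.0, 400.0],
                    [0.0, 0.0, 1.0]])

      rng = np.random.default_rng(0)
      pts_3d = rng.uniform([-200, -200, 800], [200, 200, 1200], size=(20, 3))  # mm, in front of the projector

      # Synthesize the "observed" projector pixels from a known ground-truth pose.
      rvec_true = np.array([0.05, -0.02, 0.01])
      tvec_true = np.array([10.0, -5.0, 30.0])
      pixels, _ = cv2.projectPoints(pts_3d, rvec_true, tvec_true, K, None)

      # Calibration of the extrinsic parameters from the 3D-2D correspondences.
      ok, rvec, tvec = cv2.solvePnP(pts_3d, pixels, K, None)
      print(ok, rvec.ravel(), tvec.ravel())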

  3. Endoscopic feature tracking for augmented-reality assisted prosthesis selection in mitral valve repair

    NASA Astrophysics Data System (ADS)

    Engelhardt, Sandy; Kolb, Silvio; De Simone, Raffaele; Karck, Matthias; Meinzer, Hans-Peter; Wolf, Ivo

    2016-03-01

    Mitral valve annuloplasty describes a surgical procedure in which an artificial prosthesis is sutured onto the anatomical structure of the mitral annulus to re-establish the valve's functionality. Choosing an appropriate commercially available ring size and shape is a difficult decision the surgeon has to make intraoperatively based on experience. In our augmented-reality framework, digitized ring models are superimposed onto endoscopic image streams without using any additional hardware. To place the ring model at the proper position within the endoscopic image plane, a pose estimation is performed that depends on the localization of sutures placed by the surgeon around the leaflet origins and punctured through the stiffer structure of the annulus. In this work, the tissue penetration points are tracked by the real-time-capable Lucas-Kanade optical flow algorithm. The accuracy and robustness of this tracking algorithm are investigated with respect to whether outliers influence the subsequent pose estimation. Our results suggest that optical flow is very stable for a variety of different endoscopic scenes and that tracking errors do not affect the position of the superimposed virtual objects in the scene, making this approach a viable candidate for augmented reality-enhanced decision support in annuloplasty.
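
    A minimal sketch of the tracking step, using OpenCV's pyramidal Lucas-Kanade optical flow. The frame and point variables, window size, and termination criteria below are illustrative assumptions rather than the authors' settings.

      import cv2
      import numpy as np

      def track_penetration_points(prev_gray, next_gray, points):
          # prev_gray, next_gray : consecutive endoscopic frames converted to grayscale
          # points               : N x 2 array of tissue-penetration points located in prev_gray
          pts = points.reshape(-1, 1, 2).astype(np.float32)
          new_pts, status, err = cv2.calcOpticalFlowPyrLK(
              prev_gray, next_gray, pts, None,
              winSize=(21, 21), maxLevel=3,
              criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
          # Keep only successfully tracked points; lost points could be re-detected or
          # treated as outliers before the subsequent pose estimation.
          good = status.ravel() == 1
          return new_pts.reshape(-1, 2)[good], good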

  4. A study of students' motivation using the augmented reality science textbook

    NASA Astrophysics Data System (ADS)

    Gopalan, Valarmathie; Zulkifli, Abdul Nasir; Bakar, Juliana Aida Abu

    2016-08-01

    Science plays a major role in helping Malaysia achieve developed-nation status by 2020. However, over the past few decades, Malaysia has faced a downward trend in the number of students pursuing careers and higher education in science-related fields. Since school is the first platform where students learn science, a new learning approach needs to be introduced to motivate them towards science learning. The aim of this study is to determine whether an augmented reality-enhanced science textbook contributes to the learning process of lower secondary school students in science. The study was carried out among a sample of 70 lower secondary school students. Pearson correlation and regression analyses were used to determine the effects of ease of use, engaging, enjoyment, and fun on students' motivation in using the augmented reality science textbook for science learning. The results provide empirical support for positive and statistically significant relationships between engaging, enjoyment, and fun and students' motivation for science learning. However, ease of use was not significant, although it was positively correlated with motivation.
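
    The analysis named above (Pearson correlations plus a regression of motivation on the constructs) can be sketched as follows. The data are random placeholders, not the study's measurements, and the variable names simply follow the abstract.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 70
      ease_of_use = rng.normal(4.0, 0.6, n)
      engaging    = rng.normal(4.2, 0.5, n)
      enjoyment   = rng.normal(4.1, 0.5, n)
      fun         = rng.normal(4.3, 0.4, n)
      motivation  = 0.3 * engaging + 0.3 * enjoyment + 0.3 * fun + rng.normal(0, 0.3, n)

      # Pearson correlation of each construct with motivation.
      for name, x in [("ease_of_use", ease_of_use), ("engaging", engaging),
                      ("enjoyment", enjoyment), ("fun", fun)]:
          r, p = stats.pearsonr(x, motivation)
          print(f"{name}: r = {r:.2f}, p = {p:.3f}")

      # Multiple regression (least squares): motivation ~ constructs + intercept.
      X = np.column_stack([ease_of_use, engaging, enjoyment, fun, np.ones(n)])
      coef, *_ = np.linalg.lstsq(X, motivation, rcond=None)
      print("regression coefficients:", coef)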

  5. Augmented-reality-guided biopsy of a tumor near the skull base: the surgeon's experience

    NASA Astrophysics Data System (ADS)

    Eggers, Georg; Sudra, Gunther; Ghanai, Sassan; Salb, Tobias; Dillmann, Ruediger; Marmulla, Ruediger; Hassfeld, Stefan

    2005-04-01

    INPRES, a system for augmented reality, has been developed in the collaborative research center "Information Technology in Medicine - Computer- and Sensor-Aided Surgery". The system is based on see-through glasses. In extensive preclinical testing the system has proven its functionality, and tests with volunteers based on MRI imaging were performed successfully. We report the surgeon's view of the first use of the system for AR-guided biopsy of a tumor near the skull base. Preoperative planning was performed based on CT image data. The information to be projected was the tumor volume, which was segmented from the image data. With the use of infrared cameras, the positions of the patient and the surgeon were tracked intraoperatively, and the information on the glasses' displays was updated accordingly. The system proved its functionality under OR conditions in patient care: augmented reality information could be visualized with sufficient accuracy for the surgical task. After intraoperative calibration by the surgeon, the biopsy was obtained successfully. The advantage of see-through glasses is their flexibility: a virtual stereoscopic image can be set up wherever and whenever desired. A biopsy at a delicate location could be performed without the need for wide exposure, which means additional safety and lower operation-related morbidity for the patient. The integration of the glasses' calibration procedure into the intraoperative workflow is of importance to the surgeon.

  6. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments

    NASA Astrophysics Data System (ADS)

    Portalés, Cristina; Lerma, José Luis; Navarro, Santiago

    2010-01-01

    Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and to interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to enhance human interaction and real-world navigation. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction, far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated into real (physical) urban worlds. The augmented environment presented herein requires a video see-through head-mounted display (HMD) for visualization, whereas the user's movement in the real world is tracked with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There are, however, some software issues and complexities, which are discussed in the paper.

  7. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large datasets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g., flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of different users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in the various visualization and interaction modes.

  8. Thrust Augmentation Measurements for a Pulse Detonation Engine Driven Ejector

    NASA Technical Reports Server (NTRS)

    Pal, S.; Santoro, Robert J.; Shehadeh, R.; Saretto, S.; Lee, S.-Y.

    2005-01-01

    Thrust augmentation results from an ongoing study of pulse detonation engine driven ejectors are presented and discussed. The experiments were conducted using a pulse detonation engine (PDE) setup with various ejector configurations. The PDE used in these experiments utilizes ethylene (C2H4) as the fuel and an equimolar mixture of oxygen and nitrogen as the oxidizer at an equivalence ratio of one. High-fidelity thrust measurements were made using an integrated spring-damper system. The baseline thrust of the PDE was first measured and agrees with experimental and modeling results found in the literature. Thrust augmentation measurements were then made for constant-diameter ejectors. The parameter space for the study included ejector length, the overlap distance between the PDE tube exit and the ejector tube inlet, and straight versus rounded ejector inlets. The relationship between the thrust augmentation results and various physical phenomena is described. To further understand the flow dynamics, shadowgraph images of the shock wave front exiting the PDE were also made. For the studied parameter space, the results showed a maximum augmentation of 40%. Further increases in augmentation are possible if the geometry of the ejector is tailored, a topic currently being studied by numerous groups in the field.
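
    As commonly defined, thrust augmentation is the relative gain of the ejector-equipped configuration over the baseline PDE thrust, so the 40% maximum reported above corresponds to 1.4 times the baseline thrust. The values in the short sketch below are placeholders, not measurements from the study.

      # augmentation = (T_with_ejector - T_baseline) / T_baseline
      T_baseline = 100.0        # baseline PDE thrust, arbitrary units (placeholder)
      T_with_ejector = 140.0    # thrust with the best ejector configuration (placeholder)
      augmentation = (T_with_ejector - T_baseline) / T_baseline
      print(f"thrust augmentation = {augmentation:.0%}")   # -> 40%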

  9. Inexpensive Monocular Pico-Projector-based Augmented Reality Display for Surgical Microscope

    PubMed Central

    Shi, Chen; Becker, Brian C.; Riviere, Cameron N.

    2013-01-01

    This paper describes an inexpensive pico-projector-based augmented reality (AR) display for a surgical microscope. The system is designed for use with Micron, an active handheld surgical tool that cancels the surgeon's hand tremor to improve microsurgical accuracy. Using the AR display, virtual cues can be injected into the microscope view to track the movement of the tip of Micron, show the desired position, and indicate the position error. Cues can be used to maintain high performance by helping the surgeon avoid drifting out of the workspace of the instrument. Also, boundary information, such as the view range of the cameras that record surgical procedures, can be displayed to indicate the operating area to the surgeon. Furthermore, numerical, textual, or graphical information can be displayed, showing such things as the tool-tip depth in the workspace and the on/off status of Micron's tremor-canceling function. PMID:25264542

  10. Electrically adjustable location of a projected image in augmented reality via a liquid-crystal lens.

    PubMed

    Chen, Hung-Shan; Wang, Yu-Jen; Chen, Po-Ju; Lin, Yi-Hsin

    2015-11-01

    An augmented reality (AR) system involving the electrically tunable location of a projected image is implemented using a liquid-crystal (LC) lens. The projected image is either real or virtual. By effectively doubling the LC lens power following light reflection, the position of a projected virtual image can be made to vary from 42 to 360 cm, while the tunable range for a projected real image is from 27 to 52 cm on the opposite side. The optical principle of the AR system is introduced and could be further developed for other tunable focusing lenses, even those with a lower lens power. The benefits of this study could be extended to head-mounted display systems for vision correction or vision compensation. We believe that tunable focusing LC optical elements are promising developments in the thriving field of AR applications. PMID:26561086
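
    The effect of a tunable lens power on the apparent image position can be illustrated with the thin-lens relation 1/s_o + 1/s_i = P (real-is-positive convention, distances in meters, power P in diopters); a negative s_i corresponds to a virtual image in front of the lens. The sketch below is a generic illustration with assumed numbers, not the paper's optical layout or its doubled-power reflective path.

      def image_distance(s_o_m, power_diopter):
          # 1/s_i = P - 1/s_o ; a negative result means a virtual image at |s_i| in front of the lens
          inv_si = power_diopter - 1.0 / s_o_m
          return float("inf") if inv_si == 0 else 1.0 / inv_si

      # Display placed 5 cm from the lens: small changes in total power sweep the virtual
      # image from tens of centimeters to several meters away.
      for P in (18.0, 19.0, 19.5, 19.9):
          print(f"P = {P:4.1f} D -> s_i = {image_distance(0.05, P) * 100:8.1f} cm")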