Science.gov

Sample records for augmented reality engineering

  1. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  2. Confronting an Augmented Reality

    ERIC Educational Resources Information Center

    Munnerley, Danny; Bacon, Matt; Wilson, Anna; Steele, James; Hedberg, John; Fitzgerald, Robert

    2012-01-01

How can educators make use of augmented reality technologies and practices to enhance learning, and why would we want to embrace such technologies anyway? How can an augmented reality help a learner confront, interpret and ultimately comprehend reality itself? In this article, we seek to initiate a discussion that focuses on these questions, and…

  3. Augmented reality: a review.

    PubMed

    Berryman, Donna R

    2012-01-01

Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital or computer-created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries. PMID:22559183

  4. Augmented Reality in astrophysics

    NASA Astrophysics Data System (ADS)

    Vogt, Frédéric P. A.; Shingles, Luke J.

    2013-09-01

    Augmented Reality consists of merging live images with virtual layers of information. The rapid growth in the popularity of smartphones and tablets over recent years has provided a large base of potential users of Augmented Reality technology, and virtual layers of information can now be attached to a wide variety of physical objects. In this article, we explore the potential of Augmented Reality for astrophysical research with two distinct experiments: (1) Augmented Posters and (2) Augmented Articles. We demonstrate that the emerging technology of Augmented Reality can already be used and implemented without expert knowledge using currently available apps. Our experiments highlight the potential of Augmented Reality to improve the communication of scientific results in the field of astrophysics. We also present feedback gathered from the Australian astrophysics community that reveals evidence of some interest in this technology by astronomers who experimented with Augmented Posters. In addition, we discuss possible future trends for Augmented Reality applications in astrophysics, and explore the current limitations associated with the technology. This Augmented Article, the first of its kind, is designed to allow the reader to directly experiment with this technology.

  5. Augmented reality system

    NASA Astrophysics Data System (ADS)

    Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng

    2010-08-01

In recent years, Augmented Reality (AR) [1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in Virtual Reality (VR) fields such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment and the arts. AR enhances the display output of a real environment with specific user-interactive functions or specific object recognition. It can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering and remote robot control. AR has many advantages over VR. The system developed here combines sensors, software and imaging algorithms to make users feel the scene is real, actual and existing. The imaging algorithms include a gray-level method, an image binarization method, and a white balance method, in order to achieve accurate image recognition and overcome the effects of lighting.
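The abstract names three imaging steps (gray-level conversion, binarization, white balance) but gives no code. As a rough illustration only, and not the paper's actual implementation, such a pipeline might look like this minimal NumPy sketch (the threshold, luminance weights, and gray-world balancing are all assumptions):

```python
import numpy as np

def to_gray(rgb):
    """Luminance-weighted gray-level conversion (ITU-R BT.601 weights)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(gray, threshold=128):
    """Fixed-threshold binarization: bright pixels -> 1, background -> 0."""
    return (gray > threshold).astype(np.uint8)

def white_balance(rgb):
    """Gray-world white balance: scale each channel toward the global mean."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    return np.clip(rgb * (means.mean() / means), 0, 255)

# A synthetic 2x2 "image": one bright pixel and three dark ones.
img = np.array([[[200.0, 210.0, 190.0], [10.0, 12.0, 8.0]],
                [[15.0, 9.0, 11.0], [20.0, 22.0, 18.0]]])
mask = binarize(to_gray(white_balance(img)))
```

A real system would adapt the threshold to scene lighting, which is precisely the problem the paper's white-balance step is meant to mitigate.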

  6. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  7. Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality

    NASA Astrophysics Data System (ADS)

    Wolfe, Christopher; Smith, J. David; Phillips, W. Greg; Graham, T. C. Nicholas

    Augmented reality systems often involve collaboration among groups of people. While there are numerous toolkits that aid the development of such augmented reality groupware systems (e.g., ARToolkit and Groupkit), there remains an enormous gap between the specification of an AR groupware application and its implementation. In this chapter, we present Fiia, a toolkit which simplifies the development of collaborative AR applications. Developers specify the structure of their applications using the Fiia modeling language, which abstracts details of networking and provides high-level support for specifying adapters between the physical and virtual world. The Fiia.Net runtime system then maps this conceptual model to a runtime implementation. We illustrate Fiia via Raptor, an augmented reality application used to help small groups collaboratively prototype video games.

  8. Augmented Reality Binoculars.

    PubMed

    Oskiper, Taragay; Sizintsev, Mikhail; Branzoi, Vlad; Samarasekera, Supun; Kumar, Rakesh

    2015-05-01

In this paper we present an augmented reality binocular system that allows long-range, high-precision augmentation of live telescopic imagery with aerial and terrain-based synthetic objects, vehicles, people and effects. The inserted objects must appear stable in the display and must not jitter or drift as the user pans around and examines the scene with the binoculars. The design of the system is based on two different cameras, with wide-field-of-view and narrow-field-of-view lenses, enclosed in a binocular-shaped shell. The wide field of view gives us context and enables us to recover the 3D location and orientation of the binoculars much more robustly, whereas the narrow field of view is used for the actual augmentation as well as to increase tracking precision. We present our navigation algorithm, which uses the two cameras in combination with an inertial measurement unit and a global positioning system in an extended Kalman filter and provides jitter-free, robust and real-time pose estimation for precise augmentation. We have demonstrated successful use of our system as part of an information-sharing example as well as in a live simulated training system for observer training, in which fixed- and rotary-wing aircraft, ground vehicles, and weapon effects are combined with real-world scenes. PMID:26357208
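The paper's navigation filter fuses two cameras, an IMU and GPS in an extended Kalman filter over full 6-DOF pose; those details are not in the abstract. As a much simplified, hypothetical sketch of the predict/update cycle such a filter runs, here is a 1-D position/velocity Kalman filter that predicts with an IMU acceleration and corrects with a GPS position fix (the state layout and all noise values are invented for illustration):

```python
import numpy as np

def ekf_step(x, P, accel, gps_pos, dt=0.05, q=0.1, r=4.0):
    """One predict/update cycle for a 1-D [position, velocity] state.
    Predict with an IMU acceleration, correct with a GPS position fix."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
    B = np.array([0.5 * dt**2, dt])            # acceleration input mapping
    x = F @ x + B * accel                      # state prediction
    P = F @ P @ F.T + q * np.eye(2)            # covariance prediction
    H = np.array([[1.0, 0.0]])                 # GPS observes position only
    S = H @ P @ H.T + r                        # innovation covariance
    K = P @ H.T / S                            # Kalman gain (2x1)
    x = x + (K * (gps_pos - x[0])).ravel()     # measurement update
    P = (np.eye(2) - K @ H) @ P
    return x, P

# With a stationary target at 10 m, the estimate should settle near 10.
x, P = np.zeros(2), np.eye(2)
for _ in range(200):
    x, P = ekf_step(x, P, accel=0.0, gps_pos=10.0)
```

The real system must additionally handle nonlinear orientation states (hence the *extended* Kalman filter) and visual landmark measurements, which this linear toy omits.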

  9. Augmented Virtual Reality Laboratory

    NASA Technical Reports Server (NTRS)

    Tully-Hanson, Benjamin

    2015-01-01

Until recently, real-time motion-tracking hardware has for the most part been too cost-prohibitive for research to take place regularly. With the release of the Microsoft Kinect in November 2010, researchers gained access to a device that, for a few hundred dollars, provides red-green-blue (RGB), depth, and skeleton data, and is capable of tracking multiple people in real time. For its original intended purpose, i.e. gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real-world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.

  10. Augmented reality for wafer prober

    NASA Astrophysics Data System (ADS)

    Gilgenkrantz, Pascal

    2011-03-01

The link between wafer manufacturing and wafer test is often weak: without a common information system, test engineers have to read the locations of test structures from reference documents and search for them on the wafer prober screen. The Mask Data Preparation team is ideally placed to fill this gap, given its relationship with both the design and manufacturing sides. With appropriate design extraction scripts and design conventions, mask engineers can provide exact wafer locations for all embedded test structures and avoid a painful camera search. Going a step further, it would be a great help to provide wafer probers with a "map" of what was built on the wafers. With this idea in mind, the mask design database can simply be handed to test engineers; but the real added value would come from a true integration of real-wafer camera views with the design database used for wafer manufacturing. As proven by several augmented reality applications, such as Google Maps' mixed Satellite/Map view, mixing a real-world view with its theoretical model is very useful for understanding reality. Such an interface can only be created by a wafer prober manufacturer, given the high integration of these machines with their control panels. But many existing software libraries could be used to plot the design view matching the camera view. The standard formats for mask design are usually GDSII and OASIS (SEMI P39 standard); multiple free and commercial viewers/editors/libraries for these formats are available.

  11. Augmented Reality Comes to Physics

    ERIC Educational Resources Information Center

    Buesing, Mark; Cook, Michael

    2013-01-01

    Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as…

  12. Augmented Reality Comes to Physics

    NASA Astrophysics Data System (ADS)

    Buesing, Mark; Cook, Michael

    2013-04-01

Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as Tagwhat and Star Chart (a must for astronomy class). The yellow line marking first downs in a televised football game and the enhanced puck that makes televised hockey easier to follow both use augmented reality to do the job.

  13. Augmented Reality Tower Technology Assessment

    NASA Technical Reports Server (NTRS)

    Reisman, Ronald J.; Brown, David M.

    2009-01-01

Augmented Reality technology may help improve Air Traffic Control Tower efficiency and safety during low-visibility conditions. This paper presents the assessments of five off-duty controllers who "shadow-controlled" with an augmented reality prototype in their own facility. Initial studies indicated unanimous agreement that this technology is potentially beneficial, though the prototype used in the study was not adequate for operational use. Some controllers agreed that augmented reality technology improved situational awareness; had the potential to benefit clearance, control, and coordination tasks and duties; and could be very useful for acquiring aircraft and weather information, particularly aircraft location, heading, and identification. The strongest objections to the prototype used in this study were directed at aircraft registration errors, unacceptable optical transparency, insufficient display performance in sunlight, inadequate representation of the static environment and insufficient symbology.

  14. Wireless Augmented Reality Prototype (WARP)

    NASA Technical Reports Server (NTRS)

    Devereaux, A. S.

    1999-01-01

Initiated in January, 1997, under NASA's Office of Life and Microgravity Sciences and Applications, the Wireless Augmented Reality Prototype (WARP) is a means to leverage recent advances in communications, displays, imaging sensors, biosensors, voice recognition and microelectronics to develop a hands-free, tetherless system capable of real-time personal display and control of computer system resources. Using WARP, an astronaut may efficiently operate and monitor any computer-controllable activity inside or outside the vehicle or station. The WARP concept is a lightweight, unobtrusive heads-up display with a wireless wearable control unit. Connectivity to the external system is achieved through a high-rate radio link from the WARP personal unit to a base station unit installed into any system PC. The radio link has been specially engineered to operate within the high-interference, high-multipath environment of a space shuttle or space station module. Through this virtual terminal, the astronaut will be able to view and manipulate imagery, text or video, using voice commands to control the terminal operations. WARP's hands-free access to computer-based instruction texts, diagrams and checklists replaces juggling manuals and clipboards, and tetherless computer system access allows free motion throughout a cabin while monitoring and operating equipment.

  15. A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education

    NASA Astrophysics Data System (ADS)

    Mejías Borrero, A.; Andújar Márquez, J. M.

    2012-10-01

Lab practices are an essential part of teaching in engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real world, which can generate specific problems in laboratory classes. Current proposals for virtual labs (VL) and remote labs (RL) neither cover the new needs properly nor contribute remarkable improvements to traditional labs, except that they favor distance training. Therefore, online teaching and learning in lab practices demand a further step beyond current VL and RL. This paper poses a new reality and new teaching/learning concepts in the field of lab practices in engineering. The developed augmented reality-based lab system (augmented remote lab, ARL) enables teachers and students to work remotely (Internet/intranet) in current CL, including virtual elements which interact with real ones. An educational experience was conducted to assess the developed ARL, with the participation of a group of 10 teachers and another group of 20 students. Both groups completed lab practices covering the contents of the subjects Digital Systems and Robotics and Industrial Automation, which belong to the second year of the new degree in Electronic Engineering (adapted to the European Space for Higher Education). The labs were carried out by means of three different approaches: CL, VL and ARL. After completion, both groups were asked to fill in questionnaires aimed at measuring the improvement contributed by ARL relative to CL and VL. Except in some specific questions, the opinions of teachers and students were rather similar and positive regarding the use and possibilities of ARL. Although the results are still preliminary and need further study, they suggest that ARL remarkably improves on the possibilities of current VL and RL.

  16. Introduction to augmented and virtual reality

    NASA Astrophysics Data System (ADS)

    Caudell, Thomas P.

    1995-12-01

This paper introduces the field of augmented reality as a prologue to the body of papers in the remainder of this session. I describe the use of head-mounted display technologies to improve the efficiency and quality of human workers in their performance of engineering design, manufacturing, construction, testing, and maintenance activities. This technology is used to `augment' the visual field of the wearer with information necessary for the performance of the current task. The enabling technology is head-up (see-through) display head sets (HUDsets) combined with head position sensing, real-world registration systems, and database access software. A primary difference between virtual reality (VR) and `augmented reality' (AR) is in the complexity of the perceived graphical objects. In AR systems, only simple wire frames, template outlines, designators, and text are displayed. An immediate result of this difference is that augmented reality systems can be driven by standard and inexpensive microprocessors. Many research issues must be addressed before this technology can be widely used, including tracking and registration, human 3D perception and reasoning, and human task performance issues.

  17. Augmented reality building operations tool

    DOEpatents

    Brackney, Larry J.

    2014-09-09

    A method (700) for providing an augmented reality operations tool to a mobile client (642) positioned in a building (604). The method (700) includes, with a server (660), receiving (720) from the client (642) an augmented reality request for building system equipment (612) managed by an energy management system (EMS) (620). The method (700) includes transmitting (740) a data request for the equipment (612) to the EMS (620) and receiving (750) building management data (634) for the equipment (612). The method (700) includes generating (760) an overlay (656) with an object created based on the building management data (634), which may be sensor data, diagnostic procedures, or the like. The overlay (656) is configured for concurrent display on a display screen (652) of the client (642) with a real-time image of the building equipment (612). The method (700) includes transmitting (770) the overlay (656) to the client (642).
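The claimed method is a request/query/overlay round trip between the mobile client, the server, and the EMS. A hypothetical sketch of that server-side flow, following the patent's step numbering in comments (the EMS stub, field names, and JSON shape are all invented for illustration):

```python
import json

def query_ems(equipment_id):
    """Stub standing in for the real energy management system (EMS)."""
    return {"equipment": equipment_id, "supply_temp_c": 18.5, "status": "ok"}

def handle_ar_request(equipment_id):
    """Server side: receive an AR request (720), query the EMS (740/750),
    build an overlay (760), and return it for transmission (770)."""
    data = query_ems(equipment_id)
    overlay = {
        "target": equipment_id,
        # Text labels the client draws over its live camera image.
        "labels": [f"{k}: {v}" for k, v in data.items() if k != "equipment"],
    }
    return json.dumps(overlay)

reply = json.loads(handle_ar_request("AHU-3"))
```

The client's remaining job, per the claim, is to composite these labels onto the real-time image of the equipment on its display.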

  18. Augmented reality using GPS

    NASA Astrophysics Data System (ADS)

    Kim, Juwan; Kim, Haedong; Jang, Byungtae; Kim, Jungsik; Kim, Donghyun

    1998-04-01

This paper describes a prototype system under development that uses GPS (Global Positioning System) as a tracker in order to combine real images with virtual geographical images in real time. To cover long distances, the system is built in a monitor-based configuration and divided into two parts. One is the real-scene acquisition system, which includes a vehicle, a wireless CCD camera, a GPS attitude-determination device and a wireless data-communication device. The other is the processing and visualization system, which includes a wireless data-communication device and a PC with a video overlay card and a 3D graphics accelerator. The pilot area of the current system is part of SERI (Systems Engineering Research Institute), the institute where we work. Virtual objects are generated from 3D modeling data of the main building, a planned new building, and other structures at SERI. The wireless CCD camera attached to a vehicle acquires the real scenes, and the GPS attitude-determination device produces the camera's position and orientation data. This information is then transmitted to the processing and visualization part over the air. In the processing and visualization system, virtual images are rendered using the received information and combined with the real scenes. Applications include an enhanced bird's-eye view and disaster rescue work, for example after an earthquake.

  19. Augmenting Your Own Reality: Student Authoring of Science-Based Augmented Reality Games

    ERIC Educational Resources Information Center

    Klopfer, Eric; Sheldon, Josh

    2010-01-01

    Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent…

  20. Webizing mobile augmented reality content

    NASA Astrophysics Data System (ADS)

    Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun

    2014-01-01

    This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.

  1. Augmented reality in medical education?

    PubMed

    Kamphuis, Carolien; Barsom, Esther; Schijven, Marlies; Christoph, Noor

    2014-09-01

    Learning in the medical domain is to a large extent workplace learning and involves mastery of complex skills that require performance up to professional standards in the work environment. Since training in this real-life context is not always possible for reasons of safety, costs, or didactics, alternative ways are needed to achieve clinical excellence. Educational technology and more specifically augmented reality (AR) has the potential to offer a highly realistic situated learning experience supportive of complex medical learning and transfer. AR is a technology that adds virtual content to the physical real world, thereby augmenting the perception of reality. Three examples of dedicated AR learning environments for the medical domain are described. Five types of research questions are identified that may guide empirical research into the effects of these learning environments. Up to now, empirical research mainly appears to focus on the development, usability and initial implementation of AR for learning. Limited review results reflect the motivational value of AR, its potential for training psychomotor skills and the capacity to visualize the invisible, possibly leading to enhanced conceptual understanding of complex causality. PMID:24464832

  2. Augmented Reality for Close Quarters Combat

    ScienceCinema

    None

    2014-06-23

Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  3. Enhancing Education through Mobile Augmented Reality

    ERIC Educational Resources Information Center

    Joan, D. R. Robert

    2015-01-01

In this article, the author discusses Mobile Augmented Reality and how it can enhance education. The aim of the present study was to give some general information about mobile augmented reality, which helps to boost education. The current study also describes the mobile networks used on the institution's campus as well…

  4. Augmented Reality for Close Quarters Combat

    SciTech Connect

    2013-09-20

Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.

  5. Augmented Reality and Mobile Art

    NASA Astrophysics Data System (ADS)

    Gwilt, Ian

The combined notions of augmented reality (AR) and mobile art are based on the amalgamation of a number of enabling technologies, including computer imaging, emergent display and tracking systems, and the increased computing power of hand-held devices such as Tablet PCs, smart phones and personal digital assistants (PDAs), which have been utilized in the making of works of art. There is much published research on the technical aspects of AR and on the ongoing work being undertaken to develop faster, more efficient AR systems [1][2]. In this text I concentrate on how AR and its associated typologies can be applied in the context of new media art practices, with particular reference to its application on hand-held or mobile devices.

  6. Augmented reality: past, present, future

    NASA Astrophysics Data System (ADS)

    Inzerillo, Laura

    2013-03-01

A great opportunity has made it possible to carry out cultural, historical, architectural and social research of great international interest: the realization of a museum whose main theme is the visit and discovery of a monument of great prestige, the monumental building known as the "Steri" in Palermo. The museum is divided into sub-themes, one of which above all has aroused such international interest that an application has been submitted to include the museum in the cultural heritage of UNESCO: the realization of a museum path through the cells of the Inquisition, which are located inside some buildings of the monumental complex. The project as a whole is approached from a total view spanning the various competences involved: history, chemistry, architecture, topography, drawing, representation, virtual communication and informatics. The resulting museum will be the sum of the results of all these disciplines. Methodology, implementation, fruition, the virtual museum, goals, 2D graphic restitution, effects on cultural heritage and the landscape environment, augmented reality, 2D and 3D surveying, hi-touch screens, photogrammetric survey, photographic survey, representation, 3D drawing and more are dealt with in this research.

  7. Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory

    ERIC Educational Resources Information Center

    Andujar, J. M.; Mejias, A.; Marquez, M. A.

    2011-01-01

Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and are unsuited to completely non-classroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…

  8. Location-Based Learning through Augmented Reality

    ERIC Educational Resources Information Center

    Chou, Te-Lien; Chanlin, Lih-Juan

    2014-01-01

    A context-aware and mixed-reality exploring tool cannot only effectively provide an information-rich environment to users, but also allows them to quickly utilize useful resources and enhance environment awareness. This study integrates Augmented Reality (AR) technology into smartphones to create a stimulating learning experience at a university…

  9. ARSC: Augmented Reality Student Card--An Augmented Reality Solution for the Education Field

    ERIC Educational Resources Information Center

    El Sayed, Neven A. M.; Zayed, Hala H.; Sharawy, Mohamed I.

    2011-01-01

Augmented Reality (AR) is the technology of adding virtual objects to real scenes, enabling the addition of missing information to real life. As a lack of resources is a problem that can be solved through AR, this paper presents and explains the usage of AR technology. We introduce the Augmented Reality Student Card (ARSC) as an application of…

  10. Key technologies of outdoor augmented reality GIS

    NASA Astrophysics Data System (ADS)

    Ren, Fu; Du, Qingyun; Wu, Xueling

    2008-12-01

Augmented Reality (AR) is a growing research area in virtual reality that generates a composite view for the user: a combination of the real scene viewed by the user and a virtual scene generated by the computer, which augments the scene with additional information. About 80 percent of the information in the real world is related to spatial location. The combination of Geographical Information Systems (GIS) and AR technologies would promote the development of outdoor AR systems, and would also open a new research direction for GIS. The key technologies of outdoor augmented reality GIS are discussed, including basic tracking methods, display devices, typical applications and registration processes. In the closed environments of indoor augmented reality, tracking position and head orientation, as well as presenting information, is much less problematic than the same task in an outdoor environment. The main application task of outdoor augmented reality GIS is the presentation of information to a user moving through an unknown region; the system helps by automatically detecting objects in sight of a person who needs information about them. The paper compares conventional solutions for 3D registration, discussing their algorithmic procedures and basic parameters to bring out their advantages and disadvantages under different conditions. The affine transformation approach draws on ideas from computer graphics and vision technology; its accuracy mainly depends on the precision and speed with which scene feature points are extracted from natural or artificial features.
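The affine-transformation registration mentioned in the abstract can be illustrated generically: given matched feature points between the camera view and the model, a 2-D affine transform can be fitted by least squares. This is a textbook sketch, not the algorithm from the paper (the point sets below are synthetic):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src points onto dst.
    Needs at least three non-collinear correspondences."""
    A = np.hstack([src, np.ones((len(src), 1))])   # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)    # solves A @ M ~= dst
    return M                                       # 3x2: linear part + translation row

# Synthetic correspondences: ground truth is scale by 2, translate by (3, 5).
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src * 2.0 + np.array([3.0, 5.0])
M = fit_affine(src, dst)
warped = np.hstack([src, np.ones((4, 1))]) @ M     # apply the fitted transform
```

In a real outdoor AR GIS, `src` would come from feature points extracted in the camera image and `dst` from the corresponding model coordinates, which is why the abstract ties registration accuracy to the precision and speed of feature extraction.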

  11. Personalized augmented reality for anatomy education.

    PubMed

    Ma, Meng; Fallavollita, Pascal; Seelbach, Ina; Von Der Heide, Anna Maria; Euler, Ekkehard; Waschke, Jens; Navab, Nassir

    2016-05-01

Anatomy education is a challenging but vital element in forming future medical professionals. In this work, a personalized and interactive augmented reality system is developed to facilitate education. This system behaves as a "magic mirror" which allows personalized in-situ visualization of anatomy on the user's body. Real-time volume visualization of a CT dataset creates the illusion that the user can look inside their body. The system comprises an RGB-D sensor as a real-time tracking device to detect the user moving in front of a display. In addition, the magic mirror system shows text information, medical images, and 3D models of organs that the user can interact with. Through the participation of 7 clinicians and 72 students, two user studies were designed to respectively assess the precision and acceptability of the magic mirror system for education. The results of the first study demonstrated that the average precision of the augmented reality overlay on the user's body was 0.96 cm, while the results of the second study indicate 86.1% approval for the educational value of the magic mirror, and 91.7% approval for the augmented reality capability of displaying organs in three dimensions. The usefulness of this unique type of personalized augmented reality technology has been demonstrated in this paper. PMID:26646315

  12. CARE: Creating Augmented Reality in Education

    ERIC Educational Resources Information Center

    Latif, Farzana

    2012-01-01

    This paper explores how Augmented Reality using mobile phones can enhance teaching and learning in education. It specifically examines its application in two cases, where it is identified that the agility of mobile devices and the ability to overlay context specific resources offers opportunities to enhance learning that would not otherwise exist.…

  13. Get Real: Augmented Reality for the Classroom

    ERIC Educational Resources Information Center

    Mitchell, Rebecca; DeBay, Dennis

    2012-01-01

    Kids love augmented reality (AR) simulations because they are like real-life video games. AR simulations allow students to learn content while collaborating face to face and interacting with a multimedia-enhanced version of the world around them. Although the technology may seem advanced, AR software makes it easy to develop content-based…

  14. Intelligent Augmented Reality Training for Motherboard Assembly

    ERIC Educational Resources Information Center

    Westerfield, Giles; Mitrovic, Antonija; Billinghurst, Mark

    2015-01-01

    We investigate the combination of Augmented Reality (AR) with Intelligent Tutoring Systems (ITS) to assist with training for manual assembly tasks. Our approach combines AR graphics with adaptive guidance from the ITS to provide a more effective learning experience. We have developed a modular software framework for intelligent AR training…

  15. Design Principles for Augmented Reality Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt

    2014-01-01

    Augmented reality is an emerging technology that utilizes mobile, context-aware devices (e.g., smartphones, tablets) that enable participants to interact with digital information embedded within the physical environment. This overview of design principles focuses on specific strategies that instructional designers can use to develop AR learning…

  16. Opportunistic tangible user interfaces for augmented reality.

    PubMed

    Henderson, Steven; Feiner, Steven

    2010-01-01

    Opportunistic Controls are a class of user interaction techniques that we have developed for augmented reality (AR) applications to support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. By leveraging characteristics of these affordances to provide passive haptics that ease gesture input, Opportunistic Controls simplify gesture recognition, and provide tangible feedback to the user. In this approach, 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons can be mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of two user studies. In the first, participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique. In the second, participants proposed and demonstrated user interfaces incorporating Opportunistic Controls for two domains, allowing us to gain additional insights into how user interfaces featuring Opportunistic Controls might be designed. PMID:19910657
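The core mapping from a tracked fingertip to an affordance-bound widget can be sketched as a nearest-anchor hit test (purely illustrative; the button names, positions, and activation radius are invented, and the real system uses optical tracking with appearance-based gesture recognition):

```python
# Hedged sketch: decide which "button" (a 3D widget anchored to a tactile
# feature) a tracked fingertip is touching, if any.

def hit_button(finger, buttons, radius=0.01):
    """buttons: {name: (x, y, z)} anchor positions in metres.
    Returns the nearest button within `radius` of the fingertip, or None."""
    best, best_d2 = None, radius * radius
    for name, (bx, by, bz) in buttons.items():
        d2 = (finger[0] - bx) ** 2 + (finger[1] - by) ** 2 + (finger[2] - bz) ** 2
        if d2 <= best_d2:
            best, best_d2 = name, d2
    return best
```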

  17. Spacecraft 3D Augmented Reality Mobile App

    NASA Technical Reports Server (NTRS)

    Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.

    2013-01-01

    The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.

  18. Augmented reality in surgical procedures

    NASA Astrophysics Data System (ADS)

    Samset, E.; Schmalstieg, D.; Vander Sloten, J.; Freudenthal, A.; Declerck, J.; Casciaro, S.; Rideng, Ø.; Gersak, B.

    2008-02-01

    Minimally invasive therapy (MIT) is one of the most important trends in modern medicine. It includes a wide range of therapies in videoscopic surgery and interventional radiology and is performed through small incisions. It reduces hospital stay-time by allowing faster recovery and offers substantially improved cost-effectiveness for the hospital and the society. However, the introduction of MIT has also led to new problems. The manipulation of structures within the body through small incisions reduces dexterity and tactile feedback. It requires a different approach than conventional surgical procedures, since eye-hand co-ordination is not based on direct vision, but more predominantly on image guidance via endoscopes or radiological imaging modalities. ARIS*ER is a multidisciplinary consortium developing a new generation of decision support tools for MIT by augmenting visual and sensorial feedback. We will present tools based on novel concepts in visualization, robotics and haptics providing tailored solutions for a range of clinical applications. Examples from radio-frequency ablation of liver-tumors, laparoscopic liver surgery and minimally invasive cardiac surgery will be presented. Demonstrators were developed with the aim to provide a seamless workflow for the clinical user conducting image-guided therapy.

  19. Augmented reality visualization for thoracoscopic spine surgery

    NASA Astrophysics Data System (ADS)

    Sauer, Frank; Vogt, Sebastian; Khamene, Ali; Heining, Sandro; Euler, Ekkehard; Schneberger, Marc; Zuerl, Konrad; Mutschler, Wolf

    2006-03-01

    We are developing an augmented reality (AR) image guidance system in which information derived from medical images is overlaid onto a video view of the patient. The centerpiece of the system is a head-mounted display custom fitted with two miniature color video cameras that capture the stereo view of the scene. Medical graphics is overlaid onto the video view and appears firmly anchored in the scene, without perceivable time lag or jitter. We have been testing the system for different clinical applications. In this paper we discuss minimally invasive thoracoscopic spine surgery as a promising new orthopedic application. In the standard approach, the thoracoscope - a rigid endoscope - provides visual feedback for the minimally invasive procedure of removing a damaged disc and fusing the two neighboring vertebrae. The navigation challenges are twofold. From a global perspective, the correct vertebrae on the spine have to be located with the inserted instruments. From a local perspective, the actual spine procedure has to be performed precisely. Visual feedback from the thoracoscope provides only limited support for both of these tasks. In the augmented reality approach, we give the surgeon additional anatomical context for the navigation. Before the surgery, we derive a model of the patient's anatomy from a CT scan, and during surgery we track the location of the surgical instruments in relation to patient and model. With this information, we can help the surgeon in both the global and local navigation, providing a global map and 3D information beyond the local 2D view of the thoracoscope. Augmented reality visualization is a particularly intuitive method of displaying this information to the surgeon. To adapt our augmented reality system to this application, we had to add an external optical tracking system, which works now in combination with our head-mounted tracking camera. The surgeon's feedback to the initial phantom experiments is very positive.
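Tracking instruments "in relation to patient and model" amounts to composing rigid transforms. A hedged pure-Python sketch of that chain (frame names and values are illustrative; the patient and CT-model frames are assumed already registered):

```python
# Hedged sketch: the instrument's pose in the model frame is
#   T_model<-instr = inverse(T_tracker<-patient) * T_tracker<-instr
# using 4x4 homogeneous matrices as nested lists.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(T):
    """Invert a rigid transform: rotation transposed, translation -R^T t."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0.0, 0.0, 0.0, 1.0]]
```

For example, with the patient frame translated 1 m along x and an instrument at (1, 0, 0.2) in tracker coordinates, the instrument sits at (0, 0, 0.2) in model coordinates.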

  20. Service connectivity architecture for mobile augmented reality

    NASA Astrophysics Data System (ADS)

    Turunen, Tuukka; Pyssysalo, Tino; Roening, Juha

    2001-06-01

    Mobile augmented reality can be utilized in a number of different services, and it provides a lot of added value compared to the interfaces used in mobile multimedia today. Intelligent service connectivity architecture is needed for the emerging commercial mobile augmented reality services, to guarantee mobility and interoperability on a global scale. Some of the key responsibilities of this architecture are to find suitable service providers, to manage the connection with and utilization of such providers, and to allow smooth switching between them whenever the user moves out of the service area of the service provider she is currently connected to. We have studied the potential support technologies for such architectures and propose a way to create an intelligent service connectivity architecture based on current and upcoming wireless networks, an Internet backbone, and mechanisms to manage service connectivity in the upper layers of the protocol stack. In this paper, we explain the key issues of service connectivity, describe the properties of our architecture, and analyze the functionality of an example system. Based on these, we consider our proposition a good solution to the quest for global interoperability in mobile augmented reality services.
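The provider-selection and handover responsibility described above can be sketched with a toy coverage model (circular service areas and invented names; a real architecture would weigh signal quality and service capabilities, not geometry alone):

```python
# Hedged sketch: stay with the current provider while inside its service
# area; otherwise hand over to the nearest provider that covers the user.

def select_provider(user, providers, current=None):
    """providers: {name: (x, y, radius)}. Returns a provider name or None."""
    def covers(p):
        x, y, r = providers[p]
        return (user[0] - x) ** 2 + (user[1] - y) ** 2 <= r * r
    if current in providers and covers(current):
        return current  # no handover needed
    in_range = [p for p in providers if covers(p)]
    if not in_range:
        return None  # out of all service areas
    return min(in_range, key=lambda p: (user[0] - providers[p][0]) ** 2 +
                                       (user[1] - providers[p][1]) ** 2)
```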

  1. Telescopic multi-resolution augmented reality

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Frenchi, Christopher; Szu, Harold

    2014-05-01

    To ensure a self-consistent scaling approximation, the underlying microscopic fluctuation components can naturally influence macroscopic means, which may give rise to emergent observable phenomena. In this paper, we describe a consistent macroscopic (cm-scale), mesoscopic (micron-scale), and microscopic (nano-scale) approach to introduce Telescopic Multi-Resolution (TMR) into current Augmented Reality (AR) visualization technology. We propose to couple TMR-AR by introducing an energy-matter interaction engine framework that is based on known Physics, Biology, and Chemistry principles. An immediate payoff of TMR-AR is a self-consistent approximation of the interaction between microscopic observables and their direct effect on the macroscopic system that is driven by real-world measurements. Such an interdisciplinary approach enables us to achieve not only multi-scale, telescopic visualization of real and virtual information but also to conduct thought experiments through AR. As a result of the consistency, this framework allows us to explore a large-dimensionality parameter space of measured and unmeasured regions. Towards this direction, we explore how to build learnable libraries of biological, physical, and chemical mechanisms. Fusing analytical sensors with TMR-AR libraries provides a robust framework to optimize testing and evaluation through data-driven or virtual synthetic simulations. Visualizing mechanisms of interactions requires identification of observable image features that can indicate the presence of information in multiple spatial and temporal scales of analog data. The AR methodology was originally developed to enhance pilot training as well as the `make believe' entertainment industry in a user-friendly digital environment. We believe TMR-AR can someday help us conduct thought experiments scientifically, to be pedagogically visualized in zoom-in-and-out, consistent, multi-scale approximations.

  2. Toward natural fiducials for augmented reality

    NASA Astrophysics Data System (ADS)

    Kitchin, Paul; Martinez, Kirk

    2005-03-01

    Augmented Reality (AR) requires a mapping between the camera(s) and the world, so that virtual objects can be correctly registered. Current AR applications either use pre-prepared fiducial markers or specialist equipment or impose significant constraints on lighting and background. Each of these approaches has significant drawbacks. Fiducial markers are susceptible to loss or damage, can be awkward to work with and may require significant effort to prepare an area for Augmented interaction. Use of such markers may also present an imposition to non-augmented observers, especially in environments such as museums or historical landmarks. Specialist equipment is expensive and not universally available. Lighting and background constraints are often impractical for real-world applications. This paper presents initial results in using the palm of the hand as a pseudo-fiducial marker in a natural real-world environment, through colour, feature and edge analysis. The eventual aim of this research is to enable fiducial marker cards to be dispensed with entirely in some situations in order to allow more natural interaction in Augmented environments. Examples of this would be allowing users to "hold" virtual 3D objects in the palm of their hand or use gestures to interact with virtual objects.
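The colour-analysis stage of detecting a palm can be sketched with a classic rule-based RGB skin classifier (one of several published heuristics; the paper's actual colour model may differ, and the thresholds here are the commonly quoted ones, not the authors'):

```python
# Hedged sketch: per-pixel skin detection with a rule-based RGB heuristic
# (uniform daylight illumination assumed). Illustrative only.

def is_skin(r, g, b):
    """True if an (r, g, b) pixel in 0..255 matches the skin rule."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)
```

A full pipeline would follow this per-pixel test with connected-component, feature, and edge analysis to confirm that the skin region is actually a palm.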

  3. NASA's Wireless Augmented Reality Prototype (WARP)

    NASA Astrophysics Data System (ADS)

    Agan, Martin; Voisinet, Leeann; Devereaux, Ann

    1998-01-01

    The objective of the Wireless Augmented Reality Prototype (WARP) effort is to develop and integrate advanced technologies for real-time personal display of information relevant to the health and safety of space station/shuttle personnel. The WARP effort will develop and demonstrate technologies that will ultimately be incorporated into operational Space Station systems and that have potential Earth applications, such as aircraft pilot alertness monitoring and various medical and consumer environments where augmented reality is required. To this end, a two-phase effort will be undertaken to rapidly develop a prototype (Phase I) and an advanced prototype (Phase II) to demonstrate the following key technology features that could be applied to astronaut intravehicular activity (IVA) and potentially extravehicular activity (EVA) as well: 1) mobile visualization, and 2) distributed information system access. Specifically, Phase I will integrate a low-power, miniature wireless communication link and a commercial biosensor with a head-mounted display. The Phase I design will emphasize the development of a relatively small, lightweight, and unobtrusive body-worn prototype system. Phase II will put increased effort on miniaturization, power consumption reduction, increased throughput, higher resolution, and "wire removal" of the subsystems developed in Phase I.

  4. The Local Games Lab ABQ: Homegrown Augmented Reality

    ERIC Educational Resources Information Center

    Holden, Christopher

    2014-01-01

    Experiments in the use of augmented reality games formerly required extensive material resources and expertise to implement above and beyond what might be possible within the usual educational contexts. Currently, the more common availability of hardware in these contexts and the existence of easy-to-use, general purpose augmented reality design…

  5. Augmenting a Child's Reality: Using Educational Tablet Technology

    ERIC Educational Resources Information Center

    Tanner, Patricia; Karas, Carly; Schofield, Damian

    2014-01-01

    This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…

  6. On Location Learning: Authentic Applied Science with Networked Augmented Realities

    ERIC Educational Resources Information Center

    Rosenbaum, Eric; Klopfer, Eric; Perry, Judy

    2007-01-01

    The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is…

  7. Temporal Coherence Strategies for Augmented Reality Labeling.

    PubMed

    Madsen, Jacob Boesen; Tatzgern, Markus; Madsen, Claus B; Schmalstieg, Dieter; Kalkofen, Denis

    2016-04-01

    Temporal coherence of annotations is an important factor in augmented reality user interfaces and for information visualization. In this paper, we empirically evaluate four different techniques for annotation. Based on these findings, we follow up with subjective evaluations in a second experiment. Results show that presenting annotations in object space or image space leads to a significant difference in task performance. Furthermore, there is a significant interaction between rendering space and update frequency of annotations. Participants improve significantly in locating annotations, when annotations are presented in object space, and view management update rate is limited. In a follow-up experiment, participants appear to be more satisfied with limited update rate in comparison to a continuous update rate of the view management system. PMID:26780810
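The "limited update rate" condition from the study can be sketched as a view manager that recomputes label placement only at fixed intervals, holding positions steady in between (the class, interval, and naive offset placement are invented for illustration):

```python
# Hedged sketch: throttle view-management updates so annotations stay
# temporally coherent between placement recomputations.

class ThrottledLabeler:
    def __init__(self, interval=0.5):
        self.interval = interval      # seconds between placement updates
        self.last_update = None
        self.placements = {}

    def update(self, t, anchors):
        """anchors: {label: (x, y)} current screen anchor per label.
        Returns the (possibly stale) placement for each label."""
        if self.last_update is None or t - self.last_update >= self.interval:
            # naive placement: offset each label 20 px above its anchor
            self.placements = {k: (x, y - 20) for k, (x, y) in anchors.items()}
            self.last_update = t
        return self.placements
```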

  8. Image-processing with augmented reality (AR)

    NASA Astrophysics Data System (ADS)

    Babaei, Hossein R.; Mohurutshe, Pagiel L.; Habibi Lashkari, Arash

    2013-03-01

    In this project, the aim is to create an image-based Android application built on real-time image detection and processing, a convenient way for users to obtain information about imagery right on the spot. Past studies have attempted image-based applications but have gone only as far as image finders, which work solely with images already stored in some form of database. The Android platform is spreading rapidly around the world and provides by far the most interactive and technically capable smartphone platform, which is why this study and research were based on it. Augmented Reality then allows the user to manipulate the data and add enhanced features (video, GPS tags) to the image taken.

  9. Reconfigurable hardware for an augmented reality application

    NASA Astrophysics Data System (ADS)

    Toledo Moreo, F. Javier; Martinez Alvarez, J. Javier; Garrigos Guerrero, F. Javier; Ferrandez Vicente, J. Manuel

    2005-06-01

    An FPGA-based approach is proposed to build an augmented reality system in order to aid people affected by a visual disorder known as tunnel vision. The aim is to increase the user's knowledge of his environment by superimposing on his own view useful information obtained with image processing. Two different alternatives have been explored to perform the required image processing: a specific-purpose algorithm to extract edge information, and a cellular neural network with a suitable template. Their implementation in reconfigurable hardware aims to exploit the performance and flexibility of modern FPGAs. This paper describes the hardware implementation of both the Canny algorithm and the cellular neural network, and the overall system architecture. Results of the implementations and examples of the system functionality are presented.
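The paper implements Canny edge detection in hardware; its gradient core can be sketched in software with a plain Sobel operator (a simplification of the full Canny pipeline, shown here only to make the operation concrete):

```python
# Hedged sketch: Sobel gradient magnitude, the first stage of Canny-style
# edge detection. img is a 2D list of grayscale values; border pixels
# are left at 0 for simplicity.

def sobel_magnitude(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2 * img[y][x+1] + img[y+1][x+1]
                - img[y-1][x-1] - 2 * img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2 * img[y+1][x] + img[y+1][x+1]
                - img[y-1][x-1] - 2 * img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

On an FPGA the same 3x3 neighbourhood computation is pipelined with line buffers, which is where the hardware speed-up comes from.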

  10. Visualizing Sea Level Rise with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Kintisch, E. S.

    2013-12-01

    Looking Glass is an application on the iPhone that visualizes in 3-D future scenarios of sea level rise, overlaid on live camera imagery in situ. Using a technology known as augmented reality, the app allows a layperson user to explore various scenarios of sea level rise using a visual interface. Then the user can see, in an immersive, dynamic way, how those scenarios would affect a real place. The first part of the experience activates users' cognitive, quantitative thinking process, teaching them how global sea level rise, tides and storm surge contribute to flooding; the second allows an emotional response to a striking visual depiction of possible future catastrophe. This project represents a partnership between a science journalist, MIT, and the Rhode Island School of Design, and the talk will touch on lessons this project provides on structuring and executing such multidisciplinary efforts on future design projects.

  11. Extended overview techniques for outdoor augmented reality.

    PubMed

    Veas, Eduardo; Grasset, Raphaël; Kruijff, Ernst; Schmalstieg, Dieter

    2012-04-01

    In this paper, we explore techniques that aim to improve site understanding for outdoor Augmented Reality (AR) applications. While the first person perspective in AR is a direct way of filtering and zooming on a portion of the data set, it severely narrows overview of the situation, particularly over large areas. We present two interactive techniques to overcome this problem: multi-view AR and variable perspective view. We describe in detail the conceptual, visualization and interaction aspects of these techniques and their evaluation through a comparative user study. The results we have obtained strengthen the validity of our approach and the applicability of our methods to a large range of application domains. PMID:22402683

  12. A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education

    ERIC Educational Resources Information Center

    Borrero, A. Mejias; Marquez, J. M. Andujar

    2012-01-01

    Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real…

  13. Augmented Reality as a Countermeasure for Sleep Deprivation.

    PubMed

    Baumeister, James; Dorrian, Jillian; Banks, Siobhan; Chatburn, Alex; Smith, Ross T; Carskadon, Mary A; Lushington, Kurt; Thomas, Bruce H

    2016-04-01

    Sleep deprivation is known to have serious deleterious effects on executive functioning and job performance. Augmented reality has an ability to place pertinent information at the fore, guiding visual focus and reducing instructional complexity. This paper presents a study to explore how spatial augmented reality instructions impact procedural task performance on sleep deprived users. The user study was conducted to examine performance on a procedural task at six time points over the course of a night of total sleep deprivation. Tasks were provided either by spatial augmented reality-based projections or on an adjacent monitor. The results indicate that participant errors significantly increased with the monitor condition when sleep deprived. The augmented reality condition exhibited a positive influence with participant errors and completion time having no significant increase when sleep deprived. The results of our study show that spatial augmented reality is an effective sleep deprivation countermeasure under laboratory conditions. PMID:26780802

  14. 3D augmented reality with integral imaging display

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-06-01

    In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.

  15. LCD masks for spatial augmented reality

    NASA Astrophysics Data System (ADS)

    Smithwick, Quinn Y. J.; Reetz, Daniel; Smoot, Lanny

    2014-03-01

    One aim of Spatial Augmented Reality is to visually integrate synthetic objects into real-world spaces amongst physical objects, viewable by many observers without 3D glasses, head-mounted displays or mobile screens. In common implementations, using beam-combiners, scrim projection, or transparent self-emissive displays, the synthetic object's and real-world scene's light combine additively. As a result, synthetic objects appear low-contrast and semitransparent against well-lit backgrounds, and do not cast shadows. These limitations prevent synthetic objects from appearing solid and visually integrated into the real-world space. We use a transparent LCD panel as a programmable dynamic mask. The LCD panel displaying the synthetic object's silhouette mask is colocated with the object's color image, both staying aligned for all points-of-view. The mask blocks the background providing occlusion, presents a black level for high-contrast images, blocks scene illumination thus casting true shadows, and prevents blow-by in projection scrim arrangements. We have several implementations of SAR with LCD masks: 1) beam-combiner with an LCD mask, 2) scrim projection with an LCD mask, and 3) transparent OLED display with an LCD mask. Large format (80" diagonal) and dual layer volumetric variations are also implemented.
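The difference the LCD mask makes can be sketched per pixel: with a plain additive combiner the background shines through the synthetic object, while the colocated mask blocks the background under the object's silhouette first (a toy model with invented values, ignoring the panel's real transmission characteristics):

```python
# Hedged sketch: additive combining vs. mask-then-add compositing
# for a single RGB pixel (channels in 0..255).

def additive(obj, bg):
    """Plain beam-combiner: object light adds to the background."""
    return tuple(min(255, o + b) for o, b in zip(obj, bg))

def masked(obj, bg, mask):
    """LCD mask first attenuates the background (mask=1.0 blocks fully),
    then the object's light is added."""
    return tuple(min(255, o + round((1.0 - mask) * b)) for o, b in zip(obj, bg))
```

A dark synthetic pixel (30, 30, 30) over a bright background (200, 200, 200) washes out additively but stays solid and high-contrast once masked.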

  16. Directing driver attention with augmented reality cues

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Gavin, Patrick; Lee, John D.; Dawson, Jeffrey D.; Vecera, Shaun; Rizzo, Matthew

    2013-01-01

    This simulator study evaluated the effects of augmented reality (AR) cues designed to direct the attention of experienced drivers to roadside hazards. Twenty-seven healthy middle-aged licensed drivers with a range of attention capacity participated in a 54 mile (1.5 hour) drive in an interactive fixed-base driving simulator. Each participant received AR cues to potential roadside hazards in six simulated straight (9 mile long) rural roadway segments. Drivers were evaluated on response time for detecting a potentially hazardous event, detection accuracy for target (hazard) and non-target objects, and headway with respect to the hazards. Results showed no negative outcomes associated with interference. AR cues did not impair perception of non-target objects, including for drivers with lower attentional capacity. Results showed near significant response time benefits for AR cued hazards. AR cueing increased response rate for detecting pedestrians and warning signs but not vehicles. AR system false alarms and misses did not impair driver responses to potential hazards. PMID:24436635

  17. Videometric head tracker for augmented reality applications

    NASA Astrophysics Data System (ADS)

    Janin, Adam L.; Zikan, Karel; Mizell, David; Banner, Mike; Sowizral, Henry A.

    1995-12-01

    For the past three years, we have been developing augmented reality technology for application to a variety of touch labor tasks in aircraft manufacturing and assembly. The system would be worn by factory workers to provide them with better-quality information for performing their tasks than was previously available. Using a see-through head-mounted display (HMD) whose optics are set at a focal length of about 18 in., the display and its associated head tracking system can be used to superimpose and stabilize graphics on the surface of a work piece. This technology would obviate many expensive marking systems now used in aerospace manufacturing. The most challenging technical issue with respect to factory applications of AR is head position and orientation tracking. It requires high accuracy, long-range tracking in a high-noise environment. The approach we have chosen uses a head-mounted miniature video camera. The user's wearable computer system utilizes the camera to find fiducial markings that have been placed on known coordinates on or near the work piece. The system then computes the user's position and orientation relative to the fiducial marks. It is referred to as a `videometric' head tracker. In this paper, we describe the steps we took and the results we obtained in the process of prototyping our videometric head tracker, beginning with analytical and simulation results, and continuing through the working prototypes.
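A planar toy version of the videometric idea can make the pose computation concrete (this is a hypothetical 2D reduction, not the system's algorithm, which solves the full 6-DOF problem): two fiducials with known world coordinates are observed in the camera frame, and aligning the segment between them recovers the rotation and translation.

```python
# Hedged 2D sketch: recover planar pose (theta, tx, ty) such that
# cam_point = R(theta) @ world_point + t, from two fiducial observations.
import math

def pose_from_two_fiducials(world, cam):
    """world, cam: [(x, y), (x, y)] for the same two fiducials."""
    wx, wy = world[1][0] - world[0][0], world[1][1] - world[0][1]
    cx, cy = cam[1][0] - cam[0][0], cam[1][1] - cam[0][1]
    theta = math.atan2(cy, cx) - math.atan2(wy, wx)
    c, s = math.cos(theta), math.sin(theta)
    tx = cam[0][0] - (c * world[0][0] - s * world[0][1])
    ty = cam[0][1] - (s * world[0][0] + c * world[0][1])
    return theta, tx, ty
```

In 3D with perspective projection, the equivalent step is a pose-from-n-points solve over all visible fiducials, which is why tracking accuracy hinges on fiducial placement and camera calibration.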

  18. Augmented Reality a Review on Technology and Applications

    NASA Astrophysics Data System (ADS)

    Petruse, Radu Emanuil; Bondrea, Ioan

    2014-11-01

    We present in this paper an overview of the concepts behind Augmented Reality and of potential industrial AR applications where the technology can be highly effective. We also present the basic technological requirements for an AR system.

  19. Augmented Reality System for E-maintenance Application

    NASA Astrophysics Data System (ADS)

    Benbelkacem, S.; Zenati-Henda, N.; Belhocine, M.; Malek, S.

    2009-03-01

    We present in this paper an Augmented Reality platform for an e-maintenance application. In our case, the aim is not to develop a vision system based on the augmented reality concept, but to show the relationship between the different actors in the proposed architecture and to facilitate maintenance of the machine. With this platform we identify all possible scenarios that allow the technician to intervene on a machine breakdown, calling on a distant expert if necessary. Each scenario depends on the machine parameters and the technician's competences. To implement a configuration of the Augmented Reality system, we chose a case study of a maintenance scenario for a machine breakdown. We then represent this scenario by an interaction model which allows establishing the Augmented Reality configuration.

  20. Graphical user interface concepts for tactical augmented reality

    NASA Astrophysics Data System (ADS)

    Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve

    2010-04-01

    Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program. Our approach to achieve the objectives of ULTRA-Vis, called iLeader, incorporates a full-color 40° field of view (FOV) see-thru holographic waveguide integrated with sensors for full position and head tracking to provide an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark up the 3D battle-space with symbologic identification of graphical control measures, friendly force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational awareness graphical representations are highly intuitive, nondisruptive, and always tactically relevant. We used best human-factors practices, system engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.

  1. Improving Robotic Operator Performance Using Augmented Reality

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles K.; Pace, John W.

    2007-01-01

    The Special Purpose Dexterous Manipulator (SPDM) is a two-armed robot that functions as an extension to the end effector of the Space Station Robotics Manipulator System (SSRMS), currently in use on the International Space Station (ISS). Crew training for the SPDM is accomplished using a robotic hardware simulator, which performs most of SPDM functions under normal static Earth gravitational forces. Both the simulator and SPDM are controlled from a standard robotic workstation using a laptop for the user interface and three monitors for camera views. Most operations anticipated for the SPDM involve the manipulation, insertion, and removal of any of several types of Orbital Replaceable Unit (ORU), modules which control various ISS functions. Alignment tolerances for insertion of the ORU into its receptacle are 0.25 inch and 0.5 degree from nominal values. The pre-insertion alignment task must be performed within these tolerances by using available video camera views of the intrinsic features of the ORU and receptacle, without special registration markings. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit periods of viewing, a successful ORU insertion operation may require an extended period of time. This study explored the feasibility of using augmented reality (AR) to assist SPDM operations. Geometric graphical symbols were overlaid on one of the workstation monitors to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment. Twelve skilled subjects performed eight ORU insertion tasks using the simulator with and without the AR symbols in a repeated measures experimental design. Results indicated that using the AR symbols reduced pre-insertion alignment error for all subjects and reduced the time to complete pre-insertion alignment for most subjects.

  2. Transparent 3D display for augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Byoungho; Hong, Jisoo

    2012-11-01

    Two types of transparent three-dimensional display systems applicable to augmented reality are demonstrated. One of them is a head-mounted-display-type implementation which applies the principle of virtual-mode integral imaging with a concave floating lens. Such a configuration has the advantage that the three-dimensional image can be displayed at a sufficiently far distance, resolving the accommodation conflict with the real-world scene. Incorporating a convex half mirror, which is partially transparent, instead of the concave floating lens makes it possible to implement a transparent three-dimensional display system. The other type is the projection-type implementation, which is more appropriate for general use than the head-mounted-display-type implementation. Its imaging principle is based on well-known reflection-type integral imaging. We realize the transparent display by imposing partial transparency on the concave mirror array used as the screen of reflection-type integral imaging. Two configurations, relying on incoherent and coherent light sources, are both possible. The incoherent configuration introduces a concave half-mirror array, whereas the coherent one adopts a holographic optical element that replicates the functionality of the lenslet array. Though the projection-type implementation is preferable to the head-mounted display in principle, the present state of spatial light modulator technology does not yet provide satisfactory visual quality for the displayed three-dimensional image. Hence we expect the head-mounted-display-type and projection-type implementations to reach the market in sequence.

  3. Augmented Reality, the Future of Contextual Mobile Learning

    ERIC Educational Resources Information Center

    Sungkur, Roopesh Kevin; Panchoo, Akshay; Bhoyroo, Nitisha Kirtee

    2016-01-01

    Purpose: This study aims to show the relevance of augmented reality (AR) in mobile learning for the 21st century. With AR, any real-world environment can be augmented by providing users with accurate digital overlays. AR is a promising technology that has the potential to encourage learners to explore learning materials from a totally new…

  4. Augmented reality aiding collimator exchange at the LHC

    NASA Astrophysics Data System (ADS)

    Martínez, Héctor; Fabry, Thomas; Laukkanen, Seppo; Mattila, Jouni; Tabourot, Laurent

    2014-11-01

    Novel Augmented Reality techniques have the potential to have a large positive impact on the way remote maintenance operations are carried out in hazardous areas, e.g. areas where radiation doses imply careful planning and optimization of maintenance operations. This paper describes an Augmented Reality strategy, system and implementation for aiding the remote collimator exchange in the LHC, currently the world's largest and highest-energy particle accelerator. The proposed system relies on marker detection and multi-modal augmentation in real time. A database system is used to ensure flexibility. The system has been tested in a mock-up facility, showing real-time performance and great potential for future use in the LHC. The technical and scientific difficulties identified during the development of the system, and the solutions proposed in this paper, may help the development of future Augmented Reality systems for remote handling in scientific facilities.

  5. Augmented reality for biomedical wellness sensor systems

    NASA Astrophysics Data System (ADS)

    Jenkins, Jeffrey; Szu, Harold

    2013-05-01

    Driven by the commercial movie and gaming industries, Augmented Reality (AR) technology has matured. By definition of AR, both artificial and real humans can be simultaneously present and realistically interact with one another. With the help of physics and physiology, we can build the AR tool together with real human day-night webcam inputs through simple interactions of heat transfer (getting hot), action and reaction (walking or falling), as well as physiology (sweating due to activity). Knowing the person's age, weight and the 3D coordinates of the joints in the body, we deduce the force, the torque, and the energy expenditure during real human movements and apply them to an AR human model. We propose this physics-physiology AR version, PP-AR, as a biomedical wellness (BMW) surveillance tool for seniors home alone (SHA). Its function is to record a senior's walking and hand movements inside a home environment. Besides the fringe benefit of enabling more visits from grandchildren through AR video games, the PP-AR surveillance tool may serve as a means to screen patients in the home for potential falls at points around the house. Moreover, we anticipate PP-AR may help analyze the behavior history of SHA, e.g. enhancing the Smartphone SHA Ubiquitous Care Program by discovering early symptoms of candidate Alzheimer-like midnight excursions, or Parkinson-like trembling when performing challenging muscular joint movements. Using a set of coordinates corresponding to a set of 3D positions representing human joint locations, we compute the Kinetic Energy (KE) generated by each body segment over time. The Work is then calculated and converted into calories. Using common graphics rendering pipelines, one could invoke AR technology to provide more information about patients to caretakers. Alerts to caretakers can be prompted by a patient's departure from their personal baseline, and the patient's time-ordered joint information can be loaded into a graphics viewer allowing for high
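
    The kinetic-energy bookkeeping described in this abstract can be sketched as follows; the segment mass, sampling interval, and joint track below are illustrative assumptions, not values from the paper.

```python
import math

KCAL_PER_JOULE = 1.0 / 4184.0  # conversion factor from joules to kilocalories

def segment_speed(p0, p1, dt):
    """Speed of a joint between two 3D samples taken dt seconds apart."""
    return math.dist(p0, p1) / dt

def kinetic_energy(mass_kg, speed):
    """Translational kinetic energy of a body segment, KE = 1/2 m v^2."""
    return 0.5 * mass_kg * speed ** 2

def work_kcal(positions, mass_kg, dt):
    """Approximate mechanical work over a trajectory of 3D joint positions
    by summing the kinetic energy of each inter-sample motion, in kcal."""
    joules = sum(
        kinetic_energy(mass_kg, segment_speed(a, b, dt))
        for a, b in zip(positions, positions[1:])
    )
    return joules * KCAL_PER_JOULE

# Hypothetical example: a 4 kg limb segment tracked at 10 Hz
track = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.1, 0.0)]
print(work_kcal(track, mass_kg=4.0, dt=0.1))
```

    A real system would sum this over every tracked segment and compare the running total against the patient's personal baseline to trigger alerts.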

  6. Gunner Goggles: Implementing Augmented Reality into Medical Education.

    PubMed

    Wang, Leo L; Wu, Hao-Hua; Bilici, Nadir; Tenney-Soeiro, Rebecca

    2016-01-01

    There is evidence that both smartphone and tablet integration into medical education has been lacking. At the same time, there is a niche for augmented reality (AR) to improve this process through the enhancement of textbook learning. Gunner Goggles is an attempt to enhance textbook learning in shelf exam preparatory review with augmented reality. Here we describe our initial prototype and detail the process by which augmented reality was implemented into our textbook through Layar. We describe the unique functionalities of our textbook pages upon augmented reality implementation, which include links, videos and 3D figures, and we surveyed 24 third-year medical students for their impressions of the technology. Upon demonstrating an initial prototype textbook chapter, 100% (24/24) of students felt that augmented reality improved the quality of our textbook chapter as a learning tool. Of these students, 92% (22/24) agreed that their shelf exam review was inadequate and 79% (19/24) felt that a completed Gunner Goggles product would have been a viable alternative to their shelf exam review. Thus, while students report interest in the integration of AR into medical education test prep, future investigation into how the use of AR can improve performance on exams is warranted. PMID:27046620

  8. Transforming Polar Research with Google Glass Augmented Reality (Invited)

    NASA Astrophysics Data System (ADS)

    Ruthkoski, T.

    2013-12-01

    Augmented reality is a new technology with the potential to accelerate the advancement of science, particularly in geophysical research. Augmented reality is defined as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. When paired with advanced computing techniques on cloud resources, augmented reality has the potential to improve data collection techniques, visualizations, as well as in-situ analysis for many areas of research. Google is currently a pioneer of augmented reality technology and has released beta versions of their wearable computing device, Google Glass, to a select number of developers and beta testers. This community of 'Glass Explorers' is the vehicle from which Google shapes the future of their augmented reality device. Example applications of Google Glass in geophysical research range from use as a data gathering interface in harsh climates to an on-site visualization and analysis tool. Early participation in the shaping of the Google Glass device is an opportunity for researchers to tailor this new technology to their specific needs. The purpose of this presentation is to provide geophysical researchers with a hands-on first look at Google Glass and its potential as a scientific tool. Attendees will be given an overview of the technical specifications as well as a live demonstration of the device. Potential applications to geophysical research in polar regions will be the primary focus. The presentation will conclude with an open call to participate, during which attendees may indicate interest in developing projects that integrate Google Glass into their research. Application Mockup: Penguin Counter Google Glass Augmented Reality Device

  9. Soldier-worn augmented reality system for tactical icon visualization

    NASA Astrophysics Data System (ADS)

    Roberts, David; Menozzi, Alberico; Clipp, Brian; Russler, Patrick; Cook, James; Karl, Robert; Wenger, Eric; Church, William; Mauger, Jennifer; Volpe, Chris; Argenta, Chris; Wille, Mark; Snarski, Stephen; Sherrill, Todd; Lupo, Jasper; Hobson, Ross; Frahm, Jan-Michael; Heinly, Jared

    2012-06-01

    This paper describes the development and demonstration of a soldier-worn augmented reality system testbed that provides intuitive 'heads-up' visualization of tactically-relevant geo-registered icons. Our system combines a robust soldier pose estimation capability with a helmet mounted see-through display to accurately overlay geo-registered iconography (i.e., navigation waypoints, blue forces, aircraft) on the soldier's view of reality. Applied Research Associates (ARA), in partnership with BAE Systems and the University of North Carolina - Chapel Hill (UNC-CH), has developed this testbed system in Phase 2 of the DARPA ULTRA-Vis (Urban Leader Tactical, Response, Awareness, and Visualization) program. The ULTRA-Vis testbed system functions in unprepared outdoor environments and is robust to numerous magnetic disturbances. We achieve accurate and robust pose estimation through fusion of inertial, magnetic, GPS, and computer vision data acquired from helmet kit sensors. Icons are rendered on a high-brightness, 40°×30° field of view see-through display. The system incorporates an information management engine to convert CoT (Cursor-on-Target) external data feeds into mil-standard icons for visualization. The user interface provides intuitive information display to support soldier navigation and situational awareness of mission-critical tactical information.
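
    The abstract's fusion of inertial and magnetic data for pose estimation can be illustrated, in greatly simplified one-axis form, by a complementary filter: the gyroscope is integrated for smooth short-term heading, and the magnetometer pulls the estimate back to cancel drift. The gain and sample data are illustrative, not from the ULTRA-Vis system.

```python
def complementary_heading(gyro_rates, mag_headings, dt, alpha=0.98, start=0.0):
    """Fuse gyro rate (deg/s) with magnetometer heading (deg):
    integrate the gyro for low-latency tracking, then blend toward the
    magnetometer reading so integration drift cannot accumulate."""
    heading = start
    for rate, mag in zip(gyro_rates, mag_headings):
        predicted = heading + rate * dt                   # gyro integration
        heading = alpha * predicted + (1 - alpha) * mag   # drift correction
    return heading

# A constant 10 deg/s turn sampled at 100 Hz, with an agreeing magnetometer
rates = [10.0] * 100
mags = [10.0 * 0.01 * (i + 1) for i in range(100)]
print(complementary_heading(rates, mags, dt=0.01))
```

    A fielded system would extend this to full 3D orientation and add GPS and computer-vision corrections, as the testbed described above does.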

  10. Towards Robot teaching based on Virtual and Augmented Reality Concepts

    NASA Astrophysics Data System (ADS)

    Ennakr, Said; Domingues, Christophe; Benchikh, Laredj; Otmane, Samir; Mallem, Malik

    2009-03-01

    A complex system is a system made up of a great number of entities in local and simultaneous interaction. Its design requires the collaboration of engineers of various complementary specialties, so new design methods are necessary. Indeed, industry currently loses much time between the moment a product model is designed and the moment it is serially produced on factory lines. This production is generally ensured by automated and, more often, robotized means. Time is thus needed to adapt the automation and the robots to a new product model. In this context we launched a study based on the principle of mechatronic design in Augmented Reality and Virtual Reality. This new approach will bring solutions to problems encountered in many application scopes, but also to problems caused by the distance that separates vehicle design offices from their production sites. It will also minimize the errors between the design model and the real prototype.

  11. Augmented reality three-dimensional display with light field fusion.

    PubMed

    Xie, Songlin; Wang, Peng; Sang, Xinzhu; Li, Chengyu

    2016-05-30

    A video see-through augmented reality three-dimensional display method is presented. The system, which is used for dense-viewpoint augmented reality presentation, naturally fuses the light fields of the real scene and the virtual model. Inherently benefiting from the rich information of the light field, depth sense and occlusion can be handled with no a priori depth information about the real scene. A series of processes is proposed to optimize the augmented reality performance. Experimental results show that the reconstructed fused 3D light field on the autostereoscopic display is well presented. The virtual model is naturally integrated into the real scene with consistency between binocular parallax and monocular depth cues. PMID:27410076

  12. E-maintenance Scenarios Based on Augmented Reality Software Architecture

    NASA Astrophysics Data System (ADS)

    Benbelkacem, S.; Zenati-Henda, N.; Belhocine, M.

    2008-06-01

    This paper presents an architecture of augmented reality for e-maintenance applications. Our aim is not to develop a vision system based on the augmented reality concept, but to show the relationships between the different actors in the proposed architecture and to facilitate machine maintenance. This architecture allows implementing different scenarios that give the technician the possibility to intervene on a broken-down device with the help of a distant expert. Each scenario is established according to the machine parameters and the technician's competences. In our case, a hardware platform was designed to carry out e-maintenance scenarios. An example of an e-maintenance scenario is then presented.

  13. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that use haptics for sculpting precisely fitting cranial implants are employed in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible while providing more functionality for the users. The implant design process uses patient CT data of the defective area. This volumetric data is displayed in an implant-modeling tele-immersive augmented reality system where the modeler can build a patient-specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer-centered perspective, a sense of touch, and collaboration. To achieve optimized performance, the system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, a fast haptic rendering algorithm, and a multi-threading architecture. The system replaces expensive and time-consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small-group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with a precise fit. PMID:17377223

  14. Evaluating Virtual Reality and Augmented Reality Training for Industrial Maintenance and Assembly Tasks

    ERIC Educational Resources Information Center

    Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco

    2015-01-01

    The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…

  15. Computer-vision-based registration techniques for augmented reality

    NASA Astrophysics Data System (ADS)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world, for example through the use of a see-through head-mounted display. A human user of such a system can still see and interact with the real world, but has valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify objects of interest and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head. Accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.
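
    Once the object pose is known, overlaying graphics reduces to projecting 3D points into the camera image. A minimal pinhole-camera sketch (the focal length, principal point, and fiducial coordinates are illustrative, not values from this system):

```python
def project(point_cam, f, cx, cy):
    """Project a 3D point in camera coordinates (x right, y down, z forward)
    onto the image plane of a pinhole camera with focal length f (pixels)
    and principal point (cx, cy)."""
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (cx + f * x / z, cy + f * y / z)

# A fiducial corner 2 m in front of the camera, 0.5 m to the right
u, v = project((0.5, 0.0, 2.0), f=800.0, cx=320.0, cy=240.0)
print(u, v)  # 520.0 240.0 -- the overlay graphic is drawn at this pixel
```

    Registration accuracy then depends on how well the tracked pose maps object coordinates into this camera frame.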

  16. Augmented Reality in Education--Cases, Places and Potentials

    ERIC Educational Resources Information Center

    Bower, Matt; Howe, Cathie; McCredie, Nerida; Robinson, Austin; Grover, David

    2014-01-01

    Augmented Reality is poised to profoundly transform Education as we know it. The capacity to overlay rich media onto the real world for viewing through web-enabled devices such as phones and tablet devices means that information can be made available to students at the exact time and place of need. This has the potential to reduce cognitive…

  17. Current Status, Opportunities and Challenges of Augmented Reality in Education

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Lee, Silvia Wen-Yu; Chang, Hsin-Yi; Liang, Jyh-Chong

    2013-01-01

    Although augmented reality (AR) has gained much research attention in recent years, the term AR was given different meanings by varying researchers. In this article, we first provide an overview of definitions, taxonomies, and technologies of AR. We argue that viewing AR as a concept rather than a type of technology would be more productive for…

  18. Using Augmented Reality Tools to Enhance Children's Library Services

    ERIC Educational Resources Information Center

    Meredith, Tamara R.

    2015-01-01

    Augmented reality (AR) has been used and documented for a variety of commercial and educational purposes, and the proliferation of mobile devices has increased the average person's access to AR systems and tools. However, little research has been done in the area of using AR to supplement traditional library services, specifically for patrons aged…

  19. Augmented Reality Games: Using Technology on a Budget

    ERIC Educational Resources Information Center

    Annetta, Leonard; Burton, Erin Peters; Frazier, Wendy; Cheng, Rebecca; Chmiel, Margaret

    2012-01-01

    As smartphones become more ubiquitous among adolescents, there is increasing potential for these as a tool to engage students in science instruction through innovative learning environments such as augmented reality (AR). Aligned with the National Science Education Standards (NRC 1996) and integrating the three dimensions of "A Framework for K-12…

  20. Learning Physics through Play in an Augmented Reality Environment

    ERIC Educational Resources Information Center

    Enyedy, Noel; Danish, Joshua A.; Delacruz, Girlie; Kumar, Melissa

    2012-01-01

    The Learning Physics through Play Project (LPP) engaged 6-8-year old students (n = 43) in a series of scientific investigations of Newtonian force and motion including a series of augmented reality activities. We outline the two design principles behind the LPP curriculum: 1) the use of socio-dramatic, embodied play in the form of participatory…

  1. Sensors for Location-Based Augmented Reality the Example of Galileo and Egnos

    NASA Astrophysics Data System (ADS)

    Pagani, Alain; Henriques, José; Stricker, Didier

    2016-06-01

    Augmented Reality has long been approached from the point of view of computer vision and image analysis only. However, many more sensors can be used, in particular for location-based Augmented Reality scenarios. This paper reviews the various sensors that can be used for location-based Augmented Reality. It then presents and discusses several examples of the usage of Galileo and EGNOS in conjunction with Augmented Reality.
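
    In location-based scenarios like those discussed here, the GNSS fix is typically combined with a point of interest's coordinates to decide where an annotation should appear. A common first step is the initial great-circle bearing from the user to the point of interest (a sketch; the coordinates are illustrative):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees clockwise from north)
    from (lat1, lon1) to (lat2, lon2), all in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# A point of interest due east of the user at the equator bears 90 degrees
print(bearing_deg(0.0, 0.0, 0.0, 1.0))  # 90.0
```

    Comparing this bearing with the device's compass heading tells the renderer whether the annotation is inside the current field of view.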

  2. Augmented reality based real-time subcutaneous vein imaging system.

    PubMed

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-07-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed. PMID:27446690
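
    Once a vein feature has been matched across the two cameras under the epipolar constraint, its 3D position follows from triangulation. For a rectified stereo pair this reduces to depth = f * baseline / disparity; the focal length, baseline, and pixel coordinates below are illustrative, not values from the paper.

```python
def triangulate(u_left, u_right, v, f, baseline):
    """Recover the 3D coordinates (meters) of a matched feature from a
    rectified stereo pair: f in pixels, baseline in meters, and (u, v)
    pixel coordinates measured relative to the principal point."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("non-positive disparity; match is invalid")
    z = f * baseline / disparity           # depth from disparity
    x = u_left * z / f                     # back-project to metric x
    y = v * z / f                          # back-project to metric y
    return (x, y, z)

# Feature seen 40 px apart by two cameras 6 cm apart, f = 600 px
print(triangulate(u_left=20.0, u_right=-20.0, v=0.0, f=600.0, baseline=0.06))
```

    Repeating this for every matched vein point yields the 3D vein structure that is then back-projected onto the reconstructed skin surface for display.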

  4. The Effect of an Augmented Reality Enhanced Mathematics Lesson on Student Achievement and Motivation

    ERIC Educational Resources Information Center

    Estapa, Anne; Nadolny, Larysa

    2015-01-01

    The purpose of the study was to assess student achievement and motivation during a high school augmented reality mathematics activity focused on dimensional analysis. Included in this article is a review of the literature on the use of augmented reality in mathematics and the combination of print with augmented reality, also known as interactive…

  5. An Analysis of Engagement in a Combination Indoor/Outdoor Augmented Reality Educational Game

    ERIC Educational Resources Information Center

    Folkestad, James; O'Shea, Patrick

    2011-01-01

    This paper describes the results of a qualitative analysis of video captured during a dual indoor/outdoor Augmented Reality experience. Augmented Reality is the layering of virtual information on top of the physical world. This Augmented Reality experience asked students to interact with the San Diego Museum of Art and the Botanical Gardens in San…

  6. Use of augmented reality in aircraft maintenance operations

    NASA Astrophysics Data System (ADS)

    De Marchi, L.; Ceruti, A.; Testoni, N.; Marzani, A.; Liverani, A.

    2014-03-01

    This paper illustrates a human-machine interface based on Augmented Reality (AR) conceived to provide maintenance operators with the results of an impact detection methodology. In particular, the implemented tool dynamically interacts with a head-portable visualization device, allowing the inspector to see the estimated impact position on the structure. The impact detection methodology combines the signals collected by a network of piezosensors bonded on the structure to be monitored. A signal processing algorithm is then applied to compensate the acquired guided waves for dispersion. The compensated waveforms yield a robust estimation of the guided waves' difference in distance of propagation (DDOP), used to feed hyperbolic algorithms for impact location determination. The output of the impact methodology is passed to an AR visualization technology that is meant to support the inspector during the on-field inspection/diagnosis as well as the maintenance operations. The inspector, in fact, can interactively see the impact data in real time directly on the surface of the structure. Here the proposed approach is tested on the engine cowling of a Cessna 150 general aviation airplane. Preliminary results confirm the feasibility of the method and its exploitability in maintenance practice.

  7. Augmented-reality-based object modeling in Internet robotics

    NASA Astrophysics Data System (ADS)

    Dalton, Barney; Friz, Harald; Taylor, Kenneth

    1998-12-01

    This paper introduces an Augmented Reality interface that can be used in supervisory control of a robot over the Internet. The operator's client interface requires only a modest computer to run the Java control applet. Operators can completely specify the remote manipulator's path using an Augmented Reality stick cursor superimposed on multiple monoscopic images of the workspace. The same cursor is used to identify objects in the workspace, allowing interactive modeling of the workspace environment. Operators place predefined wireframe models of objects on an image of the workspace and use the cursor to align them with the corresponding objects in the image.

  8. Spatial augmented reality based high accuracy human face projection

    NASA Astrophysics Data System (ADS)

    Li, Dong; Xie, Jinghui; Li, Yufeng; Weng, Dongdong; Liu, Yue

    2015-08-01

    This paper discusses the imaging principles and the technical difficulties of spatial-augmented-reality-based human face projection. A novel geometry correction method is proposed to realize fast, high-accuracy face model projection. Using a depth camera to reconstruct the projected object, the relative position from the rendered model to the projector can be obtained and the initial projection image generated. The projected image is then distorted using Bezier interpolation to guarantee that the projected texture matches the object surface. The proposed method follows a simple process flow and achieves high-perception registration of virtual and real objects. In addition, the method performs well even when the reconstructed model is not exactly the same as the rendered virtual model, which extends its applicability in spatial-augmented-reality-based human face projection.
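    The Bezier-interpolation warp mentioned above can be illustrated with a small Bernstein-polynomial patch evaluator: a grid of control points (for example, corrected image coordinates derived from the reconstructed surface) is interpolated smoothly across the projected image. The control-grid layout is a hypothetical example, not the paper's calibration data.

```python
import numpy as np
from math import comb

def bezier_patch(ctrl, u, v):
    """Evaluate a Bezier surface patch at (u, v) in [0, 1]^2.

    ctrl: (m+1, n+1, 2) array of control points, e.g. warped image
    coordinates used to pre-distort the projection.
    """
    m, n = ctrl.shape[0] - 1, ctrl.shape[1] - 1
    # Bernstein basis polynomials in u and v
    Bu = np.array([comb(m, i) * u**i * (1 - u)**(m - i) for i in range(m + 1)])
    Bv = np.array([comb(n, j) * v**j * (1 - v)**(n - j) for j in range(n + 1)])
    return np.einsum('i,j,ijk->k', Bu, Bv, ctrl)
```

    With an identity control grid the patch reproduces the input coordinates exactly (the linear-precision property of Bernstein polynomials); displacing control points bends the projected image toward the measured surface.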

  9. a Generic Augmented Reality Telescope for Heritage Valorization

    NASA Astrophysics Data System (ADS)

    Chendeb, S.; Ridene, T.; Leroy, L.

    2013-08-01

    Heritage valorisation is one of the greatest challenges countries face in preserving their own identity from the globalization process. Augmented reality is one of the scientific areas that can make this valorisation more attractive and compelling. In this paper, we present an innovative augmented reality telescope used by tourists to explore a panoramic view with an optional zooming facility, thereby allowing accurate access to heritage information. The telescope we produced is generic, ergonomic, extensible, and modular by nature. It is designed to be conveniently set up anywhere in the world. We demonstrate the practical use of our system by testing it in the heart of Paris within a specific use case.

  10. Stereoscopic augmented reality with pseudo-realistic global illumination effects

    NASA Astrophysics Data System (ADS)

    de Sorbier, Francois; Saito, Hideo

    2014-03-01

    Recently, augmented reality has become very popular and has appeared in our daily life through gaming, guiding systems and mobile phone applications. However, inserting objects in such a way that their appearance seems natural is still an issue, especially in an unknown environment. This paper presents a framework that demonstrates the capabilities of Kinect for convincing augmented reality in an unknown environment. Rather than pre-computing a reconstruction of the scene as proposed by most previous methods, we propose a dynamic capture of the scene that adapts to live changes of the environment. Our approach, based on the update of an environment map, can also detect the position of the light sources. Combining information from the environment map, the light sources and the camera tracking, we can display virtual objects on stereoscopic devices with global illumination effects such as diffuse and mirror reflections, refractions and shadows in real time.

  11. Fourier holographic display for augmented reality using holographic optical element

    NASA Astrophysics Data System (ADS)

    Li, Gang; Lee, Dukho; Jeong, Youngmo; Lee, Byoungho

    2016-03-01

    A method for realizing three-dimensional see-through augmented reality in a Fourier holographic display is proposed. A holographic optical element (HOE) with the function of a Fourier lens is adopted in the system. The Fourier hologram configuration causes the real scene located behind the lens to be distorted. In the proposed method, since the HOE is transparent and functions as a lens only under the Bragg-matched condition, there is no distortion when people observe the real scene through the lens HOE (LHOE). Furthermore, two optical characteristics of the recording material are measured to confirm the feasibility of using an LHOE in the proposed see-through augmented reality holographic display. The results are verified experimentally.

  12. Use of display technologies for augmented reality enhancement

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2016-06-01

    Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book, or taped to a machine, to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the workspace to 3D displays and head-mounted gear. This paper reviews where some of these tools are today and some of the pros and cons being considered for the future worker environment.

  13. Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges

    NASA Astrophysics Data System (ADS)

    Cherukuru, N. W.; Calhoun, R.

    2016-06-01

    Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find applications in aviation safety and atmospheric data visualization, as well as in weather education and public outreach. As a proof of concept, we used lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.
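    As background to the distinction between radial velocities and processed wind vectors: a single lidar beam only measures the wind component along its line of sight, so recovering a horizontal vector requires at least two non-parallel looks. The dual-Doppler-style retrieval below is a generic sketch; the abstract does not specify the campaign's actual processing.

```python
import math

def wind_from_radials(vr1, az1, vr2, az2):
    """Recover a horizontal wind vector (u east, v north) from two
    radial velocities measured at different beam azimuths (radians).

    Each beam satisfies vr = u*sin(az) + v*cos(az); with two
    non-parallel beams this is a solvable 2x2 linear system.
    """
    a11, a12 = math.sin(az1), math.cos(az1)
    a21, a22 = math.sin(az2), math.cos(az2)
    det = a11 * a22 - a12 * a21  # zero iff the beams are parallel
    u = (vr1 * a22 - vr2 * a12) / det
    v = (a11 * vr2 - a21 * vr1) / det
    return u, v
```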

  14. An augmented reality framework for soft tissue surgery.

    PubMed

    Mountney, Peter; Fallert, Johannes; Nicolau, Stephane; Soler, Luc; Mewes, Philip W

    2014-01-01

    Augmented reality for soft tissue laparoscopic surgery is a growing topic of interest in the medical community and has potential application in intra-operative planning and image guidance. Delivery of such systems to the operating room remains complex with theoretical challenges related to tissue deformation and the practical limitations of imaging equipment. Current research in this area generally only solves part of the registration pipeline or relies on fiducials, manual model alignment or assumes that tissue is static. This paper proposes a novel augmented reality framework for intra-operative planning: the approach co-registers pre-operative CT with stereo laparoscopic images using cone beam CT and fluoroscopy as bridging modalities. It does not require fiducials or manual alignment and compensates for tissue deformation from insufflation and respiration while allowing the laparoscope to be navigated. The paper's theoretical and practical contributions are validated using simulated, phantom, ex vivo, in vivo and non-medical data. PMID:25333146

  15. Augmented reality assisted surgery: a urologic training tool.

    PubMed

    Dickey, Ryan M; Srikishen, Neel; Lipshultz, Larry I; Spiess, Philippe E; Carrion, Rafael E; Hakky, Tariq S

    2016-01-01

    Augmented reality is widely used in aeronautics and is a developing concept within surgery. In this pilot study, we developed an application for use on the Google Glass® optical head-mounted display to train urology residents in how to place an inflatable penile prosthesis. We use the phrase Augmented Reality Assisted Surgery to describe this novel application of augmented reality in the setting of surgery. The application demonstrates the steps of the surgical procedure of inflatable penile prosthesis placement. It also contains software that allows for detection of interest points using a camera feed from the optical head-mounted display to enable faculty to interact with residents during placement of the penile prosthesis. Urology trainees and faculty who volunteered to take part in the study were given time to experience the technology in the operative or perioperative setting and asked to complete a feedback survey. From 30 total participants using a 10-point scale, educational usefulness was rated 8.6, ease of navigation was rated 7.6, likelihood to use was rated 7.4, and distraction in operating room was rated 4.9. When stratified between trainees and faculty, trainees found the technology more educationally useful, and less distracting. Overall, 81% of the participants want this technology in their residency program, and 93% see this technology in the operating room in the future. Further development of this technology is warranted before full release, and further studies are necessary to better characterize the effectiveness of Augmented Reality Assisted Surgery in urologic surgical training. PMID:26620455

  16. Augmented reality assisted surgery: a urologic training tool

    PubMed Central

    Dickey, Ryan M; Srikishen, Neel; Lipshultz, Larry I; Spiess, Philippe E; Carrion, Rafael E; Hakky, Tariq S

    2016-01-01

    Augmented reality is widely used in aeronautics and is a developing concept within surgery. In this pilot study, we developed an application for use on Google Glass® optical head-mounted display to train urology residents in how to place an inflatable penile prosthesis. We use the phrase Augmented Reality Assisted Surgery to describe this novel application of augmented reality in the setting of surgery. The application demonstrates the steps of the surgical procedure of inflatable penile prosthesis placement. It also contains software that allows for detection of interest points using a camera feed from the optical head-mounted display to enable faculty to interact with residents during placement of the penile prosthesis. Urology trainees and faculty who volunteered to take part in the study were given time to experience the technology in the operative or perioperative setting and asked to complete a feedback survey. From 30 total participants using a 10-point scale, educational usefulness was rated 8.6, ease of navigation was rated 7.6, likelihood to use was rated 7.4, and distraction in operating room was rated 4.9. When stratified between trainees and faculty, trainees found the technology more educationally useful, and less distracting. Overall, 81% of the participants want this technology in their residency program, and 93% see this technology in the operating room in the future. Further development of this technology is warranted before full release, and further studies are necessary to better characterize the effectiveness of Augmented Reality Assisted Surgery in urologic surgical training. PMID:26620455

  17. Flexible augmented reality architecture applied to environmental management

    NASA Astrophysics Data System (ADS)

    Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo

    2003-05-01

    Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real-time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent, modules. It has a flexible design, which allows the transfer of some modules to and from the client side according to the available processing capacity of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and a multimedia geo-referenced database.

  18. Contextualized Interdisciplinary Learning in Mainstream Schools Using Augmented Reality-Based Technology: A Dream or Reality?

    ERIC Educational Resources Information Center

    Ong, Alex

    2010-01-01

    The use of augmented reality (AR) tools, where virtual objects such as tables and graphs can be displayed and be interacted with in real scenes created from imaging devices, in mainstream school curriculum is uncommon, as they are potentially costly and sometimes bulky. Thus, such learning tools are mainly applied in tertiary institutions, such as…

  19. Augmented Reality Environments in Learning, Communicational and Professional Contexts in Higher Education

    ERIC Educational Resources Information Center

    Martín Gutiérrez, Jorge; Meneses Fernández, María Dolores

    2014-01-01

    This paper explores educational and professional uses of augmented learning environments concerned with training and entertainment. We analyze the state-of-the-art research of some scenarios based on augmented reality. Some examples for the purpose of education and simulation are described. These applications show that augmented reality can…

  20. Live texturing of augmented reality characters from colored drawings.

    PubMed

    Magnenat, Stéphane; Ngo, Dat Tien; Zünd, Fabio; Ryffel, Mattia; Noris, Gioacchino; Rothlin, Gerhard; Marra, Alessia; Nitti, Maurizio; Fua, Pascal; Gross, Markus; Sumner, Robert W

    2015-11-01

    Coloring books capture the imagination of children and provide them with one of their earliest opportunities for creative expression. However, given the proliferation and popularity of digital devices, real-world activities like coloring can seem unexciting, and children become less engaged in them. Augmented reality holds unique potential to impact this situation by providing a bridge between real-world activities and digital enhancements. In this paper, we present an augmented reality coloring book App in which children color characters in a printed coloring book and inspect their work using a mobile device. The drawing is detected and tracked, and the video stream is augmented with an animated 3-D version of the character that is textured according to the child's coloring. This is possible thanks to several novel technical contributions. We present a texturing process that applies the captured texture from a 2-D colored drawing to both the visible and occluded regions of a 3-D character in real time. We develop a deformable surface tracking method designed for colored drawings that uses a new outlier rejection algorithm for real-time tracking and surface deformation recovery. We present a content creation pipeline to efficiently create the 2-D and 3-D content. And, finally, we validate our work with two user studies that examine the quality of our texturing algorithm and the overall App experience. PMID:26340776

  1. FlyAR: augmented reality supported micro aerial vehicle navigation.

    PubMed

    Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard

    2014-04-01

    Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context automatic flight path planning and autonomous flying is often applied but so far cannot fully replace the human in the loop, supervising the flight on-site to assure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle’s position between 2D map positions and the physical environment, and the complicated depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user’s view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints supporting the user in understanding the spatial relationship of virtual waypoints in the physical world and investigate the effect of these visualization techniques on the spatial understanding. PMID:24650983

  2. Fast Markerless Tracking for Augmented Reality in Planar Environment

    NASA Astrophysics Data System (ADS)

    Basori, Ahmad Hoirul; Afif, Fadhil Noer; Almazyad, Abdulaziz S.; AbuJabal, Hamza Ali S.; Rehman, Amjad; Alkawaz, Mohammed Hazim

    2015-12-01

    Markerless tracking for augmented reality should not only be accurate but also fast enough to provide seamless synchronization between real and virtual beings. Currently reported methods show that vision-based tracking is accurate but requires high computational power. This paper proposes a real-time hybrid method for tracking unknown environments in markerless augmented reality. The proposed method combines a vision-based approach with accelerometer and gyroscope sensors as a camera pose predictor. To align the augmentation relative to camera motion, the tracking method substitutes feature-based camera estimation with a combination of inertial sensors and a complementary filter, providing a more dynamic response. The proposed method managed to track an unknown environment with faster processing time than available feature-based approaches. Moreover, the proposed method can sustain its estimation in situations where feature-based tracking loses its track. The sensor-aided tracking performed the task at about 22.97 FPS, up to five times faster than the feature-based tracking method used as comparison. Therefore, the proposed method can be used to track unknown environments without depending on the number of features in the scene, while requiring lower computational cost.
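    A complementary filter of the kind mentioned can be sketched in a few lines: the gyroscope rate is integrated for short-term responsiveness while the accelerometer's gravity-derived angle corrects long-term drift. The scalar pitch-only form, the sample data layout and the alpha value below are illustrative simplifications of the paper's sensor fusion.

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    gyro_rates : pitch rates [rad/s], one per sample
    accels     : (ax, az) pairs giving an accelerometer-derived pitch
    alpha      : weight of the integrated gyro term (drift-prone but
                 responsive) vs. the accelerometer term (noisy but
                 drift-free)
    """
    pitch = 0.0
    out = []
    for rate, (ax, az) in zip(gyro_rates, accels):
        accel_pitch = math.atan2(ax, az)
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * accel_pitch
        out.append(pitch)
    return out
```

    With alpha close to 1 the estimate follows fast camera motion between frames, while the small accelerometer contribution slowly pulls the integrated angle back toward the gravity reference.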

  3. Marshall Engineers Use Virtual Reality

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) laboratory used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).

  4. Augmented reality technology and application in aerodrome building

    NASA Astrophysics Data System (ADS)

    Guo, Yan; Du, Qingyun

    2007-06-01

    This paper discusses the function and meaning of AR in an aerodrome construction project. In the initial stages of aerodrome building, it applies advanced technology including 3S (RS, GIS and GPS) techniques, augmented reality, virtual reality and digital image processing. Virtual imagery or other information created by the computer is superimposed on the surveying district at which the observer is looking. When the observer moves within the district, the virtual information changes correspondingly, just as if the virtual information really existed in the real environment. The observer can see the scene of the aerodrome by putting on a see-through HMD (head mounted display). If structural information of the aerodrome is available in a database, AR can supply an X-ray view of the building, showing elements such as pipelines, wires and framework inside walls.

  5. Context-aware Augmented Reality in laparoscopic surgery.

    PubMed

    Katić, Darko; Wekerle, Anna-Laura; Görtler, Jochen; Spengler, Patrick; Bodenstedt, Sebastian; Röhl, Sebastian; Suwelack, Stefan; Kenngott, Hannes Götz; Wagner, Martin; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Augmented Reality is a promising paradigm for intraoperative assistance. Yet, apart from technical issues, a major obstacle to its clinical application is the man-machine interaction. Visualization of unnecessary, obsolete or redundant information may cause confusion and distraction, reducing usefulness and acceptance of the assistance system. We propose a system capable of automatically filtering available information based on recognized phases in the operating room. Our system offers a specific selection of available visualizations which suit the surgeon's needs best. The system was implemented for use in laparoscopic liver and gallbladder surgery and evaluated in phantom experiments in conjunction with expert interviews. PMID:23541864

  6. Registration correction in hybrid tracking for augmented reality

    NASA Astrophysics Data System (ADS)

    Li, Xinyu; Chen, Dongyi

    2007-12-01

    One of the most important problems faced when building Augmented Reality (AR) systems is registration error. In mobile and outdoor AR systems, dynamic errors are the largest source of registration error. Hybrid tracking is a key technique for combating the effects of dynamic errors. In this paper we present a multi-stage closed-loop correction algorithm for reducing dynamic registration errors in AR systems. The control law in this algorithm is based on 2D visual servoing, or image-based camera control, which allows control of a virtual camera to reduce registration error. The experimental results illustrate the effectiveness of this algorithm.
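    The idea of an image-based control law can be illustrated with a minimal proportional servo: each frame, the 2D registration error between a tracked feature and its rendered counterpart is fed back to shift the virtual camera's projection. The gain and iteration count below are arbitrary illustrative choices, not the paper's multi-stage design.

```python
def visual_servo(target, estimate, gain=0.5, steps=20):
    """Drive a 2D overlay toward its image-space target with a
    proportional image-based servoing step.

    target   : (x, y) observed feature position in the live image [px]
    estimate : (x, y) current rendered position of the overlay [px]
    Each iteration reduces the residual error by the factor (1 - gain).
    """
    x, y = estimate
    for _ in range(steps):
        ex, ey = target[0] - x, target[1] - y  # registration error
        x, y = x + gain * ex, y + gain * ey
    return x, y
```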

  7. Context-Aware Based Efficient Training System Using Augmented Reality and Gravity Sensor for Healthcare Services

    NASA Astrophysics Data System (ADS)

    Kim, Seoksoo; Jung, Sungmo; Song, Jae-Gu; Kang, Byong-Ho

    As augmented reality and gravity sensors are of growing interest, significant development is being made on related technology, which allows application of the technology in a variety of areas with greater expectations. Applying context-awareness to augmented reality can produce useful programs. The training system suggested in this study helps a user understand an efficient training method using augmented reality and verify, based on the data collected by a gravity sensor, whether an exercise is being done properly. Therefore, this research aims to suggest an efficient training environment that enhances previous training methods by applying augmented reality and a gravity sensor.

  8. On Location Learning: Authentic Applied Science with Networked Augmented Realities

    NASA Astrophysics Data System (ADS)

    Rosenbaum, Eric; Klopfer, Eric; Perry, Judy

    2007-02-01

    The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is played across a university campus where players take on the roles of doctors, medical technicians, and public health experts to contain a disease outbreak. Players can interact with virtual characters and employ virtual diagnostic tests and medicines. They are challenged to identify the source and prevent the spread of an infectious disease that can spread among real and/or virtual characters according to an underlying model. In this paper, we report on data from three high school classes who played the game. We investigate students' perception of the authenticity of the game in terms of their personal embodiment in the game, their experience playing different roles, and their understanding of the dynamic model underlying the game.

  9. Augmented reality needle guidance improves facet joint injection training

    NASA Astrophysics Data System (ADS)

    Ungi, Tamas; Yeo, Caitlin T.; U-Thainual, Paweena; McGraw, Robert C.; Fichtinger, Gabor

    2011-03-01

    PURPOSE: The purpose of this study was to determine if medical trainees would benefit from augmented reality image overlay and laser guidance in learning how to set the correct orientation of a needle for percutaneous facet joint injection. METHODS: A total of 28 medical students were randomized into two groups: (1) The Overlay group received a training session of four insertions with image and laser guidance followed by two insertions with laser overlay only; (2) The Control group was trained by carrying out six freehand insertions. After the training session, needle trajectories of two facet joint injections without any guidance were recorded by an electromagnetic tracker and were analyzed. Number of successful needle placements, distance covered by needle tip inside the phantom and procedural time were measured to evaluate performance. RESULTS: Number of successful placements was significantly higher in the Overlay group compared to the Control group (85.7% vs. 57.1%, p = 0.038). Procedure time and distance covered inside phantom have both been found to be less in the Overlay group, although not significantly. CONCLUSION: Training with augmented reality image overlay and laser guidance improves the accuracy of facet joint injections in medical students learning image-guided facet joint needle placement.
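    For readers wanting to reproduce the kind of comparison reported above (85.7% vs. 57.1% successful placements), a two-proportion z-test is one standard option. The abstract does not state which test produced p = 0.038, and the counts below (24/28 vs. 16/28, i.e. two recorded insertions for each of 14 students per group) are an assumed reconstruction for illustration.

```python
import math

def two_proportion_ztest(s1, n1, s2, n2):
    """Two-sided z-test for a difference in success proportions.

    s1/n1, s2/n2 : successes and trials in each group.
    Returns (z, p); a sketch using the pooled-variance normal
    approximation, which may differ from the test used in the paper.
    """
    p1, p2 = s1 / n1, s2 / n2
    pooled = (s1 + s2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via erf
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```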

  10. Computer-aided liver surgery planning: an augmented reality approach

    NASA Astrophysics Data System (ADS)

    Bornik, Alexander; Beichel, Reinhard; Reitinger, Bernhard; Gotschuli, Georg; Sorantin, Erich; Leberl, Franz W.; Sonka, Milan

    2003-05-01

    Surgical resection of liver tumors requires a detailed three-dimensional understanding of a complex arrangement of vasculature, liver segments and tumors inside the liver. In most cases, surgeons need to develop this understanding by looking at sequences of axial images from modalities like X-ray computed tomography. A system for liver surgery planning is reported that enables physicians to visualize and refine segmented input liver data sets, as well as to simulate and evaluate different resections plans. The system supports surgeons in finding the optimal treatment strategy for each patient and eases the data preparation process. The use of augmented reality contributes to a user-friendly design and simplifies complex interaction with 3D objects. The main function blocks developed so far are: basic augmented reality environment, user interface, rendering, surface reconstruction from segmented volume data sets, surface manipulation and quantitative measurement toolkit. The flexible design allows to add functionality via plug-ins. First practical evaluation steps have shown a good acceptance. Evaluation of the system is ongoing and future feedback from surgeons will be collected and used for design refinements.

  11. archAR: an archaeological augmented reality experience

    NASA Astrophysics Data System (ADS)

    Wiley, Bridgette; Schulze, Jürgen P.

    2015-03-01

    We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.

  12. Verification of Image Based Augmented Reality for Urban Visualization

    NASA Astrophysics Data System (ADS)

    Fuse, T.; Nishikawa, S.; Fukunishi, Y.

    2012-07-01

    Recently, visualization of urban scenes with various kinds of information has attracted attention. For the transmission of urban scenes, virtual reality has been widely used. Since virtual reality requires comprehensive and detailed three-dimensional models, manual modelling takes a lot of time and effort. On the other hand, an alternative has been pursued in which various data are superimposed on the scene the user currently sees instead of building comprehensive models, well known as augmented reality (AR). Simultaneous localization and mapping (SLAM) has been attempted for AR using simple video cameras. This method estimates the exterior orientation parameters of the camera and the three-dimensional reconstruction of feature points simultaneously. The method, however, has been applied only to small indoor spaces. This paper investigates the applicability of the popular SLAM method to wide outdoor spaces and improves the stability of the method. In this application, the number of successfully tracked feature points is greatly reduced compared with application in indoor environments. Based on the experimental results, simple markers and GPS are introduced as auxiliary information: the markers give stability to the optimization, and GPS gives real scale to the AR space. Additionally, the feature point tracking method is modified by weighting the amplitude of displacement and depth. The effects of the markers and GPS are confirmed; on the other hand, some limitations of the method are identified. As a result, more impressive visualization can be accomplished.
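    The role of GPS in giving real scale to the AR space can be sketched simply: a monocular SLAM trajectory is recovered only up to an unknown scale factor, and comparing its path length against GPS fixes over the same interval yields the metric factor. This is a generic sketch of the idea, not the paper's exact formulation.

```python
import math

def slam_scale_from_gps(slam_positions, gps_positions):
    """Estimate the metric scale of a monocular SLAM trajectory.

    slam_positions : (x, y) camera positions in arbitrary SLAM units
    gps_positions  : (x, y) GPS fixes in metres at the same timestamps
    Returns the ratio of GPS path length to SLAM path length.
    """
    def path_len(pts):
        return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    return path_len(gps_positions) / path_len(slam_positions)
```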

  13. Engineering applications of virtual reality

    NASA Astrophysics Data System (ADS)

    Smith, James R.; Grimes, Robert V.; Plant, Tony A.

    1996-04-01

    This paper addresses some of the practical applications, advantages and difficulties associated with engineering applications of virtual reality. The paper tracks actual investigative work in progress on this subject at the BNR research lab in RTP, NC. This work attempts to demonstrate the actual value added to the engineering process by using existing 3D CAD data for interactive information navigation and evaluation of design concepts and products. Specifically, the work includes translation of Parametric Technology's Pro/ENGINEER models into a virtual world to evaluate potential attributes such as multiple-concept exploration and product installation assessment. Other work discussed in this paper includes extensive evaluation of two new tools, VRML and SGI's/Template Graphics' WebSpace, for navigation through Pro/ENGINEER models with links to supporting technical documentation and data. The benefits of using these tools for 3D interactive navigation and exploration throughout three key phases of the physical design process are discussed in depth. The three phases are Design Concept Development, Product Design Evaluation and Product Design Networking. The predicted values added include reduced time to "concept ready", reduced prototype iterations, increased "design readiness" and shorter manufacturing introduction cycles.

  14. Utilization of the Space Vision System as an Augmented Reality System For Mission Operations

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles

    2003-01-01

    Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to

  15. A Mobile Service Oriented Multiple Object Tracking Augmented Reality Architecture for Education and Learning Experiences

    ERIC Educational Resources Information Center

    Rattanarungrot, Sasithorn; White, Martin; Newbury, Paul

    2014-01-01

    This paper describes the design of our service-oriented architecture to support mobile multiple object tracking augmented reality applications applied to education and learning scenarios. The architecture is composed of a mobile multiple object tracking augmented reality client, a web service framework, and dynamic content providers. Tracking of…

  16. Augmenting a Child's Reality: Using Educational Tablet Technology

    ERIC Educational Resources Information Center

    Tanner, Patricia; Karas, Carly; Schofield, Damian

    2014-01-01

    This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…

  17. Augmented Reality Trends in Education: A Systematic Review of Research and Applications

    ERIC Educational Resources Information Center

    Bacca, Jorge; Baldiris, Silvia; Fabregat, Ramon; Graf, Sabine; Kinshuk

    2014-01-01

    In recent years, there has been an increasing interest in applying Augmented Reality (AR) to create unique educational settings. So far, however, there is a lack of review studies with focus on investigating factors such as: the uses, advantages, limitations, effectiveness, challenges and features of augmented reality in educational settings.…

  18. Mobile Augmented Reality in Supporting Peer Assessment: An Implementation in a Fundamental Design Course

    ERIC Educational Resources Information Center

    Lan, Chung-Hsien; Chao, Stefan; Kinshuk; Chao, Kuo-Hung

    2013-01-01

    This study presents a conceptual framework for supporting mobile peer assessment by incorporating augmented reality technology to eliminate limitation of reviewing and assessing. According to the characteristics of mobile technology and augmented reality, students' work can be shown in various ways by considering the locations and situations. This…

  19. Social Augmented Reality: Enhancing Context-Dependent Communication and Informal Learning at Work

    ERIC Educational Resources Information Center

    Pejoska, Jana; Bauters, Merja; Purma, Jukka; Leinonen, Teemu

    2016-01-01

    Our design proposal of social augmented reality (SoAR) grows from the observed difficulties of practical applications of augmented reality (AR) in workplace learning. In our research we investigated construction workers doing physical work in the field and analyzed the data using qualitative methods in various workshops. The challenges related to…

  20. Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders

    PubMed Central

    Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe

    2015-01-01

    Augmented Reality is a new technological system that allows the introduction of virtual content into the real world so that it runs in the same representation and, in real time, enhances the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. PMID:26339283

  1. Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders.

    PubMed

    Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe

    2015-01-01

    Augmented Reality is a new technological system that allows the introduction of virtual content into the real world so that it runs in the same representation and, in real time, enhances the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. PMID:26339283

  2. Computer vision and augmented reality in gastrointestinal endoscopy

    PubMed Central

    Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M.

    2015-01-01

    Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy—which relies on the integration of high-definition video data with pathologic correlates—requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175

  3. An augmented reality assistance platform for eye laser surgery.

    PubMed

    Ee Ping Ong; Lee, Jimmy Addison; Jun Cheng; Beng Hai Lee; Guozhen Xu; Laude, Augustinus; Teoh, Stephen; Tock Han Lim; Wong, Damon W K; Jiang Liu

    2015-08-01

    This paper presents a novel augmented reality assistance platform for eye laser surgery. The proposed system aims to assist eye doctors in pre-surgical planning and to provide guidance and protection during laser surgery. We developed algorithms to automatically register multi-modal images, detect macula and optic disc regions, and demarcate these as areas protected from laser surgery. The doctor is then able to plan the laser treatment pre-surgery using the registered images and segmented regions. Thereafter, during live surgery, the system automatically registers and tracks the slit lamp video frames on the registered retina images, sends appropriate warnings when the laser is near protected areas, and disables the laser function when it points into the protected areas. The proposed system prototype can help doctors to speed up laser surgery with confidence, without fear of unintentionally firing the laser into protected areas. PMID:26737252
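    The guard logic described, warning near protected areas and disabling the laser inside them, can be sketched with circular regions. This is an illustrative model only; the paper does not specify how the macula and optic disc regions are represented:

```python
import math

def laser_state(point, protected_areas, warn_margin=20.0):
    """Classify a laser aim point against circular protected areas.

    point: (x, y) in image pixels.
    protected_areas: list of (cx, cy, radius) circles, a hypothetical
    stand-in for the segmented macula and optic disc regions.
    Returns 'disabled' inside an area, 'warning' within warn_margin
    pixels of one, otherwise 'enabled'.
    """
    state = "enabled"
    for cx, cy, r in protected_areas:
        d = math.hypot(point[0] - cx, point[1] - cy)
        if d <= r:
            return "disabled"
        if d <= r + warn_margin:
            state = "warning"
    return state

areas = [(100, 100, 30)]               # e.g. a detected macula region
print(laser_state((100, 110), areas))  # disabled: inside the circle
print(laser_state((100, 145), areas))  # warning: within 20 px of it
print(laser_state((300, 300), areas))  # enabled
```

In a live system this check would run per video frame against the tracked region positions, with the 'disabled' state gating the laser hardware.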

  4. Real-time tomographic holography for augmented reality

    PubMed Central

    Galeotti, John M.; Siegel, Mel; Stetten, George

    2011-01-01

    The concept and instantiation of Real-Time Tomographic Holography (RTTH) for augmented reality is presented. RTTH enables natural hand-eye coordination to guide invasive medical procedures without requiring tracking or a head-mounted device. It places a real-time virtual image of an object's cross-section into its actual location, without noticeable viewpoint dependence (e.g. parallax error). The virtual image is viewed through a flat narrow-band Holographic Optical Element with optical power that generates an in-situ virtual image (within 1 m of the HOE) from a small SLM display without obscuring a direct view of the physical world. Rigidly fixed upon a medical ultrasound probe, an RTTH device could show the scan in its actual location inside the patient, even as the probe was moved relative to the patient. PMID:20634827
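    The optical power such a holographic element needs can be related to the display and virtual-image distances by the thin-lens equation. The 50 mm display distance and 0.8 m image distance below are assumptions for illustration; the paper states only that the virtual image forms within 1 m of the HOE:

```python
def hoe_power_dioptres(object_m, virtual_image_m):
    """Optical power a lens-like HOE needs to relay a display.

    Thin-lens relation 1/f = 1/d_o + 1/d_i, with the virtual-image
    distance entered as negative because the image forms on the same
    side as the object (SLM display).  Distances in metres; the
    returned power is 1/f in dioptres.
    """
    return 1 / object_m + 1 / (-virtual_image_m)

# SLM display 50 mm from the HOE, virtual image 0.8 m away (assumed):
print(hoe_power_dioptres(0.05, 0.8))  # 18.75 dioptres
```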

  5. Computer vision and augmented reality in gastrointestinal endoscopy.

    PubMed

    Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M

    2015-08-01

    Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy-which relies on the integration of high-definition video data with pathologic correlates-requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175

  6. Toward Standard Usability Questionnaires for Handheld Augmented Reality.

    PubMed

    Santos, Marc Ericson C; Polvi, Jarkko; Taketomi, Takafumi; Yamamoto, Goshiro; Sandor, Christian; Kato, Hirokazu

    2015-01-01

    Usability evaluations are important to improving handheld augmented reality (HAR) systems. However, no standard questionnaire considers perceptual and ergonomic issues found in HAR. The authors performed a systematic literature review to enumerate these issues. Based on these issues, they created a HAR usability scale that consists of comprehensibility and manipulability scales. These scales measure general system usability, ease of understanding the information presented, and ease of handling the device. The questionnaires' validity and reliability were evaluated in four experiments, and the results show that the questionnaires consistently correlate with other subjective and objective measures of usability. The questionnaires also have good reliability based on Cronbach's alpha. Researchers and professionals can directly use these questionnaires to evaluate their own HAR applications or modify them with the insights presented in this article. PMID:26416363
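    Cronbach's alpha, which the authors use to assess reliability, is straightforward to compute from the per-item variances and the variance of the total score: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch with made-up Likert data (not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire.

    items: one inner list of scores per question, aligned by
    respondent, i.e. items[i][j] is respondent j's answer to item i.
    """
    def var(xs):  # population variance (the n/n-1 choice cancels in the ratio)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Three Likert items answered by four respondents (illustrative data):
items = [[4, 5, 3, 4], [4, 4, 3, 5], [5, 5, 2, 4]]
print(round(cronbach_alpha(items), 3))  # 0.818
```

Values around 0.7 and above are conventionally read as acceptable internal consistency, which is the kind of threshold the questionnaire evaluation relies on.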

  7. Multithreaded hybrid feature tracking for markerless augmented reality.

    PubMed

    Lee, Taehee; Höllerer, Tobias

    2009-01-01

    We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction. PMID:19282544
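    The synchronized multi-threaded design described, with capture, tracking, and rendering as separate stages, can be sketched as a queue-connected pipeline. This is a generic skeleton, not the authors' architecture; the tracking stage here is a stand-in where real code would run optical flow:

```python
import queue
import threading

def run_pipeline(frames):
    """Skeleton of a capture -> track -> render pipeline.

    Each stage runs in its own thread and hands frames on via a
    bounded queue, so a slow stage applies back-pressure instead of
    letting frames pile up.
    """
    to_track, to_render, rendered = queue.Queue(2), queue.Queue(2), []

    def capture():
        for f in frames:
            to_track.put(f)
        to_track.put(None)                 # end-of-stream marker

    def track():
        while (f := to_track.get()) is not None:
            to_render.put((f, "tracked"))  # real code: optical flow here
        to_render.put(None)

    def render():
        while (item := to_render.get()) is not None:
            rendered.append(item)

    threads = [threading.Thread(target=t) for t in (capture, track, render)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return rendered

print(run_pipeline(["frame0", "frame1", "frame2"]))
```

Because the queues are FIFO and each stage is single-threaded, frame order is preserved end to end, which matters when overlays must match the frame they were computed from.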

  8. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobiles still takes long and provides lower accuracy than inspection by human individuals using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s with better decisions regarding the evaluated samples than what can be achieved by untrained users. PMID:26561461

  9. Simulating low-cost cameras for augmented reality compositing.

    PubMed

    Klein, Georg; Murray, David W

    2010-01-01

    Video see-through Augmented Reality adds computer graphics to the real world in real time by overlaying graphics onto a live video feed. To achieve a realistic integration of the virtual and real imagery, the rendered images should have a similar appearance and quality to those produced by the video camera. This paper describes a compositing method which models the artifacts produced by a small low-cost camera, and adds these effects to an ideal pinhole image produced by conventional rendering methods. We attempt to model and simulate each step of the imaging process, including distortions, chromatic aberrations, blur, Bayer masking, noise, sharpening, and color-space compression, all while requiring only an RGBA image and an estimate of camera velocity as inputs. PMID:20224133
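    One of the modelled artifacts, Bayer masking, amounts to keeping a single colour channel per pixel according to the sensor's colour filter array. A toy sketch of that step plus additive sensor noise (the RGGB layout and the noise parameters are assumptions, not the paper's calibrated model):

```python
import random

def bayer_mosaic(rgb):
    """Subsample an RGB image to a single-channel RGGB Bayer mosaic.

    rgb: 2-D list of (r, g, b) tuples.  Each output pixel keeps only
    the channel its position sees through the colour filter array.
    """
    out = []
    for y, row in enumerate(rgb):
        out_row = []
        for x, (r, g, b) in enumerate(row):
            if y % 2 == 0:
                out_row.append(r if x % 2 == 0 else g)  # R G row
            else:
                out_row.append(g if x % 2 == 0 else b)  # G B row
        out.append(out_row)
    return out

def add_noise(mosaic, sigma=2.0, seed=0):
    """Additive Gaussian sensor noise, clipped to the 0-255 range."""
    rng = random.Random(seed)
    return [[min(255, max(0, v + rng.gauss(0, sigma))) for v in row]
            for row in mosaic]

img = [[(255, 128, 0), (255, 128, 0)],
       [(255, 128, 0), (255, 128, 0)]]
print(bayer_mosaic(img))  # [[255, 128], [128, 0]]
```

A renderer following the paper's idea would apply these steps (and the others listed: distortion, blur, sharpening, compression) to the ideal pinhole image before compositing it with the video feed.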

  10. Tangible Interaction in Learning Astronomy through Augmented Reality Book-Based Educational Tool

    NASA Astrophysics Data System (ADS)

    Sin, Aw Kien; Badioze Zaman, Halimah

    Live Solar System (LSS) is an Augmented Reality book-based educational tool. Augmented Reality (AR) has its own potential in the education field, because it can provide a seamless interaction between real and virtual objects. LSS applied the Tangible Augmented Reality approach in designing its user interface and interaction. Tangible Augmented Reality is an interface which combines the Tangible User Interface and Augmented Reality Interface. They are naturally complement each other. This paper highlights the tangible interaction in LSS. LSS adopts the 'cube' as the common physical object input device. Thus, LSS does not use the traditional computer input devices such as the mouse or keyboard. To give users a better exploration experience, Visual Information Seeking Mantra principle was applied in the design of LSS. Hence, LSS gives users an effective interactive-intuitive horizontal surface learning environment.

  11. Training for planning tumour resection: augmented reality and human factors.

    PubMed

    Abhari, Kamyar; Baxter, John S H; Chen, Elvis C S; Khan, Ali R; Peters, Terry M; de Ribaupierre, Sandrine; Eagleson, Roy

    2015-06-01

    Planning surgical interventions is a complex task, demanding a high degree of perceptual, cognitive, and sensorimotor skill to reduce intra- and post-operative complications. This process requires spatial reasoning to coordinate between the preoperatively acquired medical images and patient reference frames. In the case of neurosurgical interventions, traditional approaches to planning tend to focus on providing a means for visualizing medical images, but rarely support transformation between different spatial reference frames. Thus, surgeons often rely on their previous experience and intuition as their sole guide when performing these mental transformations. In the case of junior residents, this may lead to longer operation times or an increased chance of error under additional cognitive demands. In this paper, we introduce a mixed augmented-/virtual-reality system to facilitate training for planning a common neurosurgical procedure, brain tumour resection. The proposed system is designed and evaluated with human factors explicitly in mind, alleviating the difficulty of mental transformation. Our results indicate that, compared to conventional planning environments, the proposed system greatly improves the nonclinicians' performance, independent of the sensorimotor tasks performed. Furthermore, the use of the proposed system by clinicians resulted in a significant reduction in time to perform clinically relevant tasks. These results demonstrate the role of mixed-reality systems in assisting residents to develop the spatial reasoning skills needed for planning brain tumour resection, improving patient outcomes. PMID:25546854

  12. Invisible waves and hidden realms: augmented reality and experimental art

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia

    2012-03-01

    Augmented reality is way of both altering the visible and revealing the invisible. It offers new opportunities for artistic exploration through virtual interventions in real space. In this paper, the author describes the implementation of two art installations using different AR technologies, one using optical marker tracking on mobile devices and one integrating stereoscopic projections into the physical environment. The first artwork, De Ondas y Abejas (The Waves and the Bees), is based on the widely publicized (but unproven) hypothesis of a link between cellphone radiation and the phenomenon of bee colony collapse disorder. Using an Android tablet, viewers search out small fiducial markers in the shape of electromagnetic waves hidden throughout the gallery, which reveal swarms of bees scattered on the floor. The piece also creates a generative soundscape based on electromagnetic fields. The second artwork, Urban Fauna, is a series of animations in which features of the urban landscape become plants and animals. Surveillance cameras become flocks of birds while miniature cellphone towers, lampposts, and telephone poles grow like small seedlings in time-lapse animation. The animations are presented as small stereoscopic projections, integrated into the physical space of the gallery. These two pieces explore the relationship between nature and technology through the visualization of invisible forces and hidden alternate realities.

  13. Augmented reality and stereo vision for remote scene characterization

    NASA Astrophysics Data System (ADS)

    Lawson, Shaun W.; Pretlove, John R. G.

    1999-11-01

    In this paper we present our progress in the research and development of an augmented reality (AR) system for the remote inspection of hazardous environments. It specifically addresses one particular application with which we are involved--that of improving the inspection of underground sewer pipes using robotic vehicles and 3D graphical overlays coupled with stereoscopic visual data. Traditional sewer inspection using a human operator and CCTV systems is a mature technology--though the task itself is difficult, subjective and prone to error. The work described here proposes not to replace the expert human inspector--but to enhance and increase the information that is available to him and to augment that information with other previously stored data. We describe our current system components which comprise a robotic stereo head device, a simulated sewer crawling vehicle and our AR system. We then go on to discuss the lengthy calibration procedures which are necessary to align any graphical overlay information with live video data. Some experiments in determining alignment errors under head motion and some investigations into the use of a calibrated virtual cursor are then described.

  14. Augmented reality to enhance an active telepresence system

    NASA Astrophysics Data System (ADS)

    Wheeler, Alison; Pretlove, John R. G.; Parker, Graham A.

    1996-12-01

    Tasks carried out remotely via a telerobotic system are typically complex, occur in hazardous environments and require fine control of the robot's movements. Telepresence systems provide the teleoperator with a feeling of being physically present at the remote site. Stereoscopic video has been successfully applied to telepresence vision systems to increase the operator's perception of depth in the remote scene and this sense of presence can be further enhanced using computer generated stereo graphics to augment the visual information presented to the operator. The Mechatronic Systems and Robotics Research Group have over seven years developed a number of high performance active stereo vision systems culminating in the latest, a four degree-of-freedom stereohead. This carries two miniature color cameras and is controlled in real time by the motion of the operator's head, who views the stereoscopic video images on an immersive head mounted display or stereo monitor. The stereohead is mounted on a mobile robot, the movement of which is controlled by a joystick interface. This paper describes the active telepresence system and the development of a prototype augmented reality (AR) application to enhance the operator's sense of presence at the remote site. The initial enhancements are a virtual map and compass to aid navigation in degraded visual conditions and a virtual cursor that provides a means for the operator to interact with the remote environment. The results of preliminary experiments using the initial enhancements are presented.

  15. Applied Operations Research: Augmented Reality in an Industrial Environment

    NASA Technical Reports Server (NTRS)

    Cole, Stuart K.

    2015-01-01

    Augmented reality is the application of computer generated data or graphics onto a real world view. Its use provides the operator additional information or a heightened situational awareness. While advancements have been made in automation and diagnostics of high value critical equipment (HVCE) to improve readiness, reliability and maintenance, the need to assist and support Operations and Maintenance staff persists. AR can improve the human machine interface where computer capabilities maximize the human experience and analysis capabilities. NASA operates multiple facilities with complex ground based HVCE in support of national aerodynamics and space exploration, and the need exists to improve operational support and close a gap related to capability sustainment, where key and experienced staff consistently rotate work assignments and reach the end of their terms of service. The initiation of an AR capability to augment and improve human abilities and training experience in the industrial environment requires planning and establishment of a goal and objectives for the systems and specific applications. This paper explored the use of AR in support of Operations staff in real time operation of HVCE and its maintenance. The results include identification of a specific goal and objectives, and of challenges related to availability and computer system infrastructure.

  16. 3D Tracking Based Augmented Reality for Cultural Heritage Data Management

    NASA Astrophysics Data System (ADS)

    Battini, C.; Landi, G.

    2015-02-01

    The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations; augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation, its conservation and during the restoration processes. The application of digital-imaging solutions for various feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks, allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving and complex three-dimensional models can be interactively visualised and explored on applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that will allow interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.

  17. Registration using natural features for augmented reality systems.

    PubMed

    Yuan, M L; Ong, S K; Nee, A Y C

    2006-01-01

    Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix will be used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration for either indoor and outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method to estimate the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process. Some indoor and outdoor experiments have
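    The normalization step mentioned (translation and scaling of the tracked features) is commonly done Hartley-style: translate the points so their centroid is at the origin, then scale so the mean distance from the origin is sqrt(2). A sketch under that assumption; the paper may use a slightly different scheme:

```python
import math

def normalize_points(pts):
    """Hartley-style point normalization before estimating a projective matrix.

    Moves the centroid of the 2-D points to the origin and scales them
    so the mean distance from the origin is sqrt(2), which conditions
    the linear system far better than raw pixel coordinates.
    """
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    mean_dist = sum(math.hypot(x - cx, y - cy) for x, y in pts) / n
    s = math.sqrt(2) / mean_dist
    return [((x - cx) * s, (y - cy) * s) for x, y in pts]

pts = [(100, 100), (300, 100), (300, 200), (100, 200)]
norm = normalize_points(pts)
# The centroid is now at the origin and the mean distance is sqrt(2).
```

The inverse of this similarity transform is applied to the estimated matrix afterwards, so the normalization changes only the numerics, not the result.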

  18. D3D augmented reality imaging system: proof of concept in mammography

    PubMed Central

    Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene

    2016-01-01

    Purpose The purpose of this article is to present images from simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called “depth 3-dimensional (D3D) augmented reality”. Materials and methods A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data was used and no statistical analysis was performed. Results The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, 3D cursor, and joystick fly-through. Conclusion The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice. PMID:27563261

  19. Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?

    PubMed Central

    Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.

    2007-01-01

    Background Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects while using video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators scored on a 5-point Likert scale. The participants were allotted to 3 groups depending on their experience: experts, intermediates and novices. Significant differences were calculated with the paired t-test. Results There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 resp. 2.18, P < 0.000) as well as the suturing task (mean 4.15 resp. 1.85, P < 0.000). The ProMIS is regarded as having better haptic feedback (mean 3.92 resp. 1.92, P < 0.000) and as being more useful for training surgical residents (mean 4.51 resp. 2.94, P < 0.000). Conclusions In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356
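    The paired t-test used to compare the simulators reduces to a one-line statistic over the per-participant rating differences: t = mean(d) / (sd(d) / sqrt(n)). A sketch with hypothetical data, not the study's:

```python
import math

def paired_t(scores_a, scores_b):
    """Paired t statistic for two sets of ratings from the same raters.

    Computes the per-participant differences d, then
    t = mean(d) / (sd(d) / sqrt(n)) using the sample (n-1) variance.
    """
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / (sd / math.sqrt(n))

# Hypothetical realism ratings (1-5 Likert) from five participants:
ar = [4, 5, 4, 4, 5]
vr = [2, 2, 3, 1, 2]
print(round(paired_t(ar, vr), 2))  # 6.0
```

The statistic is then compared against the t distribution with n-1 degrees of freedom to obtain the p-values reported in the abstract.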

  20. An Augmented-Reality Edge Enhancement Application for Google Glass

    PubMed Central

    Hwang, Alex D.; Peli, Eli

    2014-01-01

Purpose Google Glass provides a platform that can be easily extended to include a vision enhancement tool. We have implemented an augmented vision system on Glass, which overlays enhanced edge information over the wearer's real-world view to provide contrast-improved central vision to Glass wearers. The enhanced central vision can be naturally integrated with scanning. Methods Google Glass's camera lens distortions were corrected using image warping. Since the camera and virtual display are horizontally separated by 16 mm, and the camera aiming and virtual display projection angle are off by 10°, the warped camera image had to go through a series of 3D transformations to minimize parallax errors before the final projection to the Glass's see-through virtual display. All image processing was implemented to achieve near-real-time performance. The impact of the contrast enhancements was measured for three normal-vision subjects, with and without a diffuser film to simulate vision loss. Results For all three subjects, significantly improved contrast sensitivity was achieved when the subjects used the edge enhancements with a diffuser film. The performance boost is limited by the Glass camera's performance, which the authors assume accounts for why improvements were observed only in the diffuser-film condition (simulating low vision). Conclusions Improvements were measured with simulated visual impairments. With the benefit of see-through augmented-reality edge enhancement, a natural visual scanning process is possible, suggesting that the device may provide better visual function in a cosmetically and ergonomically attractive format for patients with macular degeneration. PMID:24978871
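The edge-overlay idea this abstract describes can be sketched in a few lines: compute a gradient-magnitude map of the camera frame and add it back onto the image to boost local contrast at edges. This is a minimal illustration under assumptions, not the authors' Glass implementation; the Sobel kernels and the `gain` parameter are choices made for the sketch.

```python
import numpy as np

def sobel_edges(gray):
    """Gradient magnitude of a grayscale image via 3x3 Sobel kernels (zero-padded)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    p = np.pad(gray.astype(float), 1)
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = p[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

def edge_overlay(gray, gain=1.0):
    """Overlay bright edge information on the original image, clipped to 8-bit range."""
    out = gray.astype(float) + gain * sobel_edges(gray)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Flat regions pass through unchanged while edges are pushed toward white, which is the contrast-enhancement effect the study measured.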

  1. Study of application of virtual object projection based on augmented reality technology

    NASA Astrophysics Data System (ADS)

    Liu, San-jun; Zhang, Yan-feng

    2013-03-01

Augmented reality technology has been a hot research topic in recent years. Augmented reality synthesizes virtual objects generated by a computer, or other information, with the real world as perceived by the user. It supplements the real world rather than completely replacing it. Display technology and tracking-registration technology are the key technologies in an augmented reality system and are the focus of this study. This article briefly describes both display technology and tracking-registration technology.

  2. Concealed target detection using augmented reality with SIRE radar

    NASA Astrophysics Data System (ADS)

    Saponaro, Philip; Kambhamettu, Chandra; Ranney, Kenneth; Sullivan, Anders

    2013-05-01

The Synchronous Impulse Reconstruction (SIRE) forward-looking radar, developed by the U.S. Army Research Laboratory (ARL), can detect concealed targets using ultra-wideband synthetic aperture technology. The SIRE radar has been mounted on a Ford Expedition and combined with other sensors, including a pan/tilt/zoom camera, to test its capabilities for concealed target detection in a realistic environment. Augmented Reality (AR) can be used to combine the SIRE radar image with the live camera stream into one view, which provides the user with information that is quicker to assess and easier to understand than either source separately. In this paper we present an AR system which utilizes a global positioning system (GPS) and inertial measurement unit (IMU) to overlay a SIRE radar image onto a live video stream. We describe a method for transforming 3D world points in the UTM coordinate system onto the video stream by calibrating for the intrinsic parameters of the camera. This calibration is performed offline to save computation time and achieve real-time performance. Since the intrinsic parameters are affected by the zoom of the camera, we calibrate at eleven different zooms and interpolate between them. We show the results of a real-time transformation of the SAR imagery onto the video stream. Finally, we quantify both the 2D error and 3D residue associated with our transformation and show that the amount of error is reasonable for our application.
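The projection step described here (offline intrinsic calibration at several zoom settings, interpolation between them, then projection of 3D world points onto the video) can be sketched with a pinhole model. The zoom levels, focal lengths, and principal point below are hypothetical placeholders rather than values from the paper, and the pose `R, t` is assumed to come from the GPS/IMU.

```python
import numpy as np

# Hypothetical focal lengths (pixels) calibrated at a few zoom settings;
# the paper calibrates at eleven zooms and interpolates between them.
ZOOM_LEVELS = np.array([1.0, 2.0, 4.0])
FOCALS = np.array([800.0, 1600.0, 3200.0])

def intrinsics_at_zoom(zoom, cx=640.0, cy=360.0):
    """Linearly interpolate the focal length for this zoom, then build K."""
    f = np.interp(zoom, ZOOM_LEVELS, FOCALS)
    return np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])

def project(K, R, t, world_pt):
    """Project a 3D world point (e.g. UTM easting/northing/height) to pixel coords."""
    cam = R @ world_pt + t    # world frame -> camera frame (pose from GPS/IMU)
    uvw = K @ cam             # camera frame -> homogeneous image coordinates
    return uvw[:2] / uvw[2]   # perspective divide
```

A point 10 m straight ahead of an un-rotated camera lands at the principal point, which is a quick sanity check for a calibration of this kind.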

  3. Intraoperative surgical photoacoustic microscopy (IS-PAM) using augmented reality

    NASA Astrophysics Data System (ADS)

    Lee, Changho; Han, Seunghoon; Kim, Sehui; Jeon, Mansik; Kim, Jeehyun; Kim, Chulhong

    2014-03-01

We have developed an intraoperative surgical photoacoustic microscopy (IS-PAM) system by integrating an optical-resolution photoacoustic microscopy (OR-PAM) system and a conventional surgical microscope. Based on the common optical path in the OR-PAM and microscope system, we can acquire the PAM and microscope images at the same time. Furthermore, by utilizing a mini-sized beam projector, 2D PAM images are back-projected onto the microscope view plane as augmented reality. Thus, both the conventional microscopic image and the 2D cross-sectional PAM image are displayed on the plane through an eyepiece lens of the microscope. In our method, no additional image display tool is required to show the PAM image. Therefore, it potentially offers significant convenience to surgeons, who need not shift their gaze during surgery. To demonstrate the performance of our IS-PAM system, we first successfully monitored needle intervention in phantoms. Moreover, we successfully guided needle insertion into mouse skin in vivo by simultaneously visualizing surrounding blood vessels from the PAM images and the magnified skin surfaces from the conventional microscopic images.

  4. Augmented Reality-Based Navigation System for Wrist Arthroscopy: Feasibility

    PubMed Central

    Zemirline, Ahmed; Agnus, Vincent; Soler, Luc; Mathoulin, Christophe L.; Liverneaux, Philippe A.; Obdeijn, Miryam

    2013-01-01

    Purpose In video surgery, and more specifically in arthroscopy, one of the major problems is positioning the camera and instruments within the anatomic environment. The concept of computer-guided video surgery has already been used in ear, nose, and throat (ENT), gynecology, and even in hip arthroscopy. These systems, however, rely on optical or mechanical sensors, which turn out to be restricting and cumbersome. The aim of our study was to develop and evaluate the accuracy of a navigation system based on electromagnetic sensors in video surgery. Methods We used an electromagnetic localization device (Aurora, Northern Digital Inc., Ontario, Canada) to track the movements in space of both the camera and the instruments. We have developed a dedicated application in the Python language, using the VTK library for the graphic display and the OpenCV library for camera calibration. Results A prototype has been designed and evaluated for wrist arthroscopy. It allows display of the theoretical position of instruments onto the arthroscopic view with useful accuracy. Discussion The augmented reality view represents valuable assistance when surgeons want to position the arthroscope or locate their instruments. It makes the maneuver more intuitive, increases comfort, saves time, and enhances concentration. PMID:24436832

  5. MetaTree: augmented reality narrative explorations of urban forests

    NASA Astrophysics Data System (ADS)

    West, Ruth; Margolis, Todd; O'Neil-Dunne, Jarlath; Mendelowitz, Eitan

    2012-03-01

As cities world-wide adopt and implement reforestation initiatives to plant millions of trees in urban areas, they are engaging in what is essentially a massive ecological and social experiment. Existing air-borne, space-borne, and field-based imaging and inventory mechanisms fail to provide key information on urban tree ecology that is crucial to informing management and policy and to supporting citizen initiatives for the planting and stewardship of trees. The shortcomings of the current approaches include spatial and temporal resolution, poor vantage point, cost constraints, and biological metric limitations. Collectively, this limits their effectiveness as real-time inventory and monitoring tools. Novel methods for imaging and monitoring the status of these emerging urban forests, and for encouraging their ongoing stewardship by the public, are required to ensure their success. This art-science collaboration proposes to re-envision citizens' relationship with urban spaces by foregrounding urban trees in relation to local architectural features, while simultaneously creating new methods for urban forest monitoring. We explore creating a shift from overhead imaging or field-based tree survey data acquisition methods to continuous, ongoing monitoring by citizen scientists as part of a mobile augmented reality experience. We consider the possibilities of this experience as a medium for interacting with and visualizing urban forestry data and for creating cultural engagement with urban ecology.

  6. Development of a mobile borehole investigation software using augmented reality

    NASA Astrophysics Data System (ADS)

    Son, J.; Lee, S.; Oh, M.; Yun, D. E.; Kim, S.; Park, H. D.

    2015-12-01

Augmented reality (AR) is one of the fastest-developing technologies in the smartphone and IT areas. While various applications have been developed using AR, few geological applications take advantage of it. In this study, a smartphone application to manage boreholes using AR has been developed. The application consists of three major modules: an AR module, a map module, and a data management module. The AR module calculates the orientation of the device and uses it to display nearby boreholes distributed in three dimensions. This module shows the boreholes in a transparent layer over a live camera view, so the user can find and understand the overall characteristics of the underground geology. The map module displays the boreholes on a 2D map to show their distribution and the location of the user. The database module uses the SQLite library, which is well suited to mobile platforms, and Binary XML is adopted to enable storage of additional customized data. The application is able to provide underground information in an intuitive and refined form and to reduce the time and equipment required for geological field investigations.
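The AR module's core computation, turning a device heading and a borehole's coordinates into a position on the camera view, might look like the flat-earth sketch below. The field-of-view and screen-width parameters are illustrative assumptions, not details from the abstract.

```python
import math

def bearing_deg(user_e, user_n, bh_e, bh_n):
    """Compass bearing (degrees clockwise from north) from user to borehole,
    in a local east/north metre grid (flat-earth approximation)."""
    return math.degrees(math.atan2(bh_e - user_e, bh_n - user_n)) % 360

def screen_x(bearing, heading, hfov_deg, width_px):
    """Horizontal pixel position of a borehole marker on the camera view,
    or None when it lies outside the horizontal field of view."""
    delta = (bearing - heading + 180) % 360 - 180   # signed angle to target
    if abs(delta) > hfov_deg / 2:
        return None
    return width_px / 2 + delta / (hfov_deg / 2) * (width_px / 2)
```

A borehole dead ahead lands at screen centre; one 90° off the heading is culled, which is the basic behaviour of a transparent AR overlay layer like the one described.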

  7. L-split marker for augmented reality in aircraft assembly

    NASA Astrophysics Data System (ADS)

    Han, Pengfei; Zhao, Gang

    2016-04-01

In order to improve on the conventional square markers widely used by marker-based augmented reality systems in aircraft assembly environments, an L-split marker is proposed. Every marker consists of four separate L-shaped parts, each containing partial information about the marker. Geometric features of the L-shape, which are more discriminative than the symmetrical square shape adopted by conventional markers, are used to detect the proposed markers effectively in camera images. The marker is split into four separate parts to improve robustness to occlusion and curvature: the registration process can be completed as long as three parts are detected (up to about 80% of the area may be occluded). Moreover, when the marker is attached to a nonplanar surface, its curvature can be roughly analyzed from each part's normal direction, which can be obtained because each part's six corners are explicitly determined in the preceding detection step. Based on the marker design, new detection and recognition algorithms are proposed and detailed. The experimental results show that the marker and the algorithms are effective.

  8. A spatially augmented reality sketching interface for architectural daylighting design.

    PubMed

    Sheng, Yu; Yapo, Theodore C; Young, Christopher; Cutler, Barbara

    2011-01-01

    We present an application of interactive global illumination and spatially augmented reality to architectural daylight modeling that allows designers to explore alternative designs and new technologies for improving the sustainability of their buildings. Images of a model in the real world, captured by a camera above the scene, are processed to construct a virtual 3D model. To achieve interactive rendering rates, we use a hybrid rendering technique, leveraging radiosity to simulate the interreflectance between diffuse patches and shadow volumes to generate per-pixel direct illumination. The rendered images are then projected on the real model by four calibrated projectors to help users study the daylighting illumination. The virtual heliodon is a physical design environment in which multiple designers, a designer and a client, or a teacher and students can gather to experience animated visualizations of the natural illumination within a proposed design by controlling the time of day, season, and climate. Furthermore, participants may interactively redesign the geometry and materials of the space by manipulating physical design elements and see the updated lighting simulation. PMID:21071786

  9. Shape recognition and pose estimation for mobile Augmented Reality.

    PubMed

    Hagbi, Nate; Bergig, Oriel; El-Sana, Jihad; Billinghurst, Mark

    2011-10-01

    Nestor is a real-time recognition and camera pose estimation system for planar shapes. The system allows shapes that carry contextual meanings for humans to be used as Augmented Reality (AR) tracking targets. The user can teach the system new shapes in real time. New shapes can be shown to the system frontally, or they can be automatically rectified according to previously learned shapes. Shapes can be automatically assigned virtual content by classification according to a shape class library. Nestor performs shape recognition by analyzing contour structures and generating projective-invariant signatures from their concavities. The concavities are further used to extract features for pose estimation and tracking. Pose refinement is carried out by minimizing the reprojection error between sample points on each image contour and its library counterpart. Sample points are matched by evolving an active contour in real time. Our experiments show that the system provides stable and accurate registration, and runs at interactive frame rates on a Nokia N95 mobile phone. PMID:21041876
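The reprojection error that Nestor's pose refinement minimizes can be illustrated for the planar case: map library contour points through a candidate homography and measure the RMS distance to their matched image samples. This is a generic sketch of the error term, not the paper's implementation (which evolves an active contour to obtain the matches).

```python
import numpy as np

def reprojection_rmse(H, model_pts, image_pts):
    """RMS distance between library contour points mapped through a planar
    homography H and their matched image samples; pose refinement searches
    for the H (or camera pose) that minimizes this value."""
    ones = np.ones((len(model_pts), 1))
    proj = (H @ np.hstack([model_pts, ones]).T).T   # apply H in homogeneous coords
    proj = proj[:, :2] / proj[:, 2:3]               # back to Euclidean 2D
    return float(np.sqrt(np.mean(np.sum((proj - image_pts) ** 2, axis=1))))
```

With the identity homography and perfectly matched points the error is zero; a pure translation by (3, 4) yields an RMS error of 5, the Euclidean length of the offset.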

  10. Augmented reality enabling intelligence exploitation at the edge

    NASA Astrophysics Data System (ADS)

    Kase, Sue E.; Roy, Heather; Bowman, Elizabeth K.; Patton, Debra

    2015-05-01

    Today's Warfighters need to make quick decisions while interacting in densely populated environments comprised of friendly, hostile, and neutral host nation locals. However, there is a gap in the real-time processing of big data streams for edge intelligence. We introduce a big data processing pipeline called ARTEA that ingests, monitors, and performs a variety of analytics including noise reduction, pattern identification, and trend and event detection in the context of an area of operations (AOR). Results of the analytics are presented to the Soldier via an augmented reality (AR) device Google Glass (Glass). Non-intrusive AR devices such as Glass can visually communicate contextually relevant alerts to the Soldier based on the current mission objectives, time, location, and observed or sensed activities. This real-time processing and AR presentation approach to knowledge discovery flattens the intelligence hierarchy enabling the edge Soldier to act as a vital and active participant in the analysis process. We report preliminary observations testing ARTEA and Glass in a document exploitation and person of interest scenario simulating edge Soldier participation in the intelligence process in disconnected deployment conditions.

  11. Augmented Reality Image Guidance in Minimally Invasive Prostatectomy

    NASA Astrophysics Data System (ADS)

Cohen, Daniel; Mayer, Erik; Chen, Dongbin; Anstee, Ann; Vale, Justin; Yang, Guang-Zhong; Darzi, Ara; Edwards, Philip 'Eddie'

    This paper presents our work aimed at providing augmented reality (AR) guidance of robot-assisted laparoscopic surgery (RALP) using the da Vinci system. There is a good clinical case for guidance due to the significant rate of complications and steep learning curve for this procedure. Patients who were due to undergo robotic prostatectomy for organ-confined prostate cancer underwent preoperative 3T MRI scans of the pelvis. These were segmented and reconstructed to form 3D images of pelvic anatomy. The reconstructed image was successfully overlaid onto screenshots of the recorded surgery post-procedure. Surgeons who perform minimally-invasive prostatectomy took part in a user-needs analysis to determine the potential benefits of an image guidance system after viewing the overlaid images. All surgeons stated that the development would be useful at key stages of the surgery and could help to improve the learning curve of the procedure and improve functional and oncological outcomes. Establishing the clinical need in this way is a vital early step in development of an AR guidance system. We have also identified relevant anatomy from preoperative MRI. Further work will be aimed at automated registration to account for tissue deformation during the procedure, using a combination of transrectal ultrasound and stereoendoscopic video.

  12. Spatial augmented reality on industrial CNC-machines

    NASA Astrophysics Data System (ADS)

    Olwal, Alex; Gustafsson, Jonny; Lindfors, Christoffer

    2008-02-01

    In this work we present how Augmented Reality (AR) can be used to create an intimate integration of process data with the workspace of an industrial CNC (computer numerical control) machine. AR allows us to combine interactive computer graphics with real objects in a physical environment - in this case, the workspace of an industrial lathe. ASTOR is an autostereoscopic optical see-through spatial AR system, which provides real-time 3D visual feedback without the need for user-worn equipment, such as head-mounted displays or sensors for tracking. The use of a transparent holographic optical element, overlaid onto the safety glass, allows the system to simultaneously provide bright imagery and clear visibility of the tool and workpiece. The system makes it possible to enhance visibility of occluded tools as well as to visualize real-time data from the process in the 3D space. The graphics are geometrically registered with the workspace and provide an intuitive representation of the process, amplifying the user's understanding and simplifying machine operation.

  13. Realistic real-time outdoor rendering in augmented reality.

    PubMed

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, as reflected in the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, but most are restricted to non-real-time rendering, so the problem remains open, especially for outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. Second, shadows are generated with the Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps) algorithm. Lastly, a technique is introduced to integrate sky colours and shadows through their effects on virtual objects in the AR system. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering. PMID:25268480
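The first phase, generating a sky colour from the sun's position, can be illustrated with the standard solar-elevation formula plus a toy colour blend. The horizon and zenith colours below are invented placeholders; a real system such as the one described would use a physical sky model rather than a linear blend.

```python
import math

def solar_elevation(lat_deg, decl_deg, hour_angle_deg):
    """Solar elevation angle (degrees) from the standard spherical formula
    sin(h) = sin(lat)*sin(decl) + cos(lat)*cos(decl)*cos(H)."""
    lat, dec, ha = (math.radians(x) for x in (lat_deg, decl_deg, hour_angle_deg))
    s = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
    return math.degrees(math.asin(s))

def sky_colour(elevation_deg, horizon=(255, 96, 64), zenith=(64, 128, 255)):
    """Toy daytime sky colour: blend from a warm horizon tone toward zenith
    blue as the sun climbs (placeholder for a physical sky model)."""
    t = max(0.0, min(1.0, elevation_deg / 90.0))
    return tuple(round(h + t * (z - h)) for h, z in zip(horizon, zenith))
```

At the equator on an equinox at solar noon (declination 0°, hour angle 0°) the sun sits at 90° elevation, and the blend returns the full zenith colour.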

  15. Integrating a Mobile Augmented Reality Activity to Contextualize Student Learning of a Socioscientific Issue

    ERIC Educational Resources Information Center

    Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao

    2013-01-01

Augmented reality (AR) refers to virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…

  16. An indoor augmented reality mobile application for simulation of building evacuation

    NASA Astrophysics Data System (ADS)

    Sharma, Sharad; Jerripothula, Shanmukha

    2015-03-01

Augmented Reality enables people to remain connected with the physical environment they are in, and invites them to look at the world from new and alternative perspectives. There has been an increasing interest in emergency evacuation applications for mobile devices. Nearly all smartphones these days are Wi-Fi and GPS enabled. In this paper, we propose a novel emergency evacuation system that helps people safely evacuate a building in an emergency situation. It further enhances knowledge and understanding of where the exits are in the building and of safe evacuation procedures. We have applied mobile augmented reality (mobile AR) to create an application with the Unity 3D game engine. We show how the mobile AR application is able to display a 3D model of the building and an animation of people evacuating, using markers and a web camera. The system gives a visual representation of a building in 3D space, allowing people to see where the exits are through a smartphone or tablet. Pilot studies conducted with the system showed partial success and demonstrated the effectiveness of the application for emergency evacuation. Our computer vision methods give good results when the markers are close to the camera, but accuracy decreases when the markers are far away.

  17. Past and future of wearable augmented reality displays and their applications

    NASA Astrophysics Data System (ADS)

    Hua, Hong

    2014-10-01

    A wearable augmented reality (AR) display enables the ability to overlay computer-generated imagery on a person's real-world view and it has long been portrayed as a transformative technology to redefine the way we perceive and interact with digital information. In this paper, I will provide an overview on my research group's past development efforts in augmented reality display technologies and applications and discuss key technical challenges and opportunities for future developments.

  18. Augmented reality guidance system for peripheral nerve blocks

    NASA Astrophysics Data System (ADS)

    Wedlake, Chris; Moore, John; Rachinsky, Maxim; Bainbridge, Daniel; Wiles, Andrew D.; Peters, Terry M.

    2010-02-01

Peripheral nerve block treatments are ubiquitous in hospitals and pain clinics worldwide. State-of-the-art techniques use ultrasound (US) guidance and/or electrical stimulation to verify needle tip location. However, problems such as needle-US beam alignment, poor echogenicity of block needles, and US beam thickness can make it difficult for the anesthetist to know the exact needle tip location. Inaccurate therapy delivery raises obvious safety and efficacy issues. We have developed and evaluated a needle guidance system that makes use of a magnetic tracking system (MTS) to provide an augmented reality (AR) guidance platform that accurately localizes the needle tip as well as its projected trajectory. Five anesthetists and five novices performed simulated nerve block deliveries in a polyvinyl alcohol phantom to compare needle guidance under US alone to US placed in our AR environment. Our phantom study demonstrated a decrease in targeting attempts, a decrease in contact with critical structures, and an increase in accuracy (0.68 mm RMS, compared to 1.34 mm under US guidance alone). Currently, the MTS uses 18- and 21-gauge hypodermic needles with a 5-degree-of-freedom sensor located at the needle tip. These needles can only be sterilized using an ethylene oxide process. In the interest of providing clinicians with a simple and efficient guidance system, we also evaluated attaching the sensor at the needle hub as a simple clip-on device. To do this, we simultaneously performed a needle bending study to assess the reliability of a hub-based sensor.
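The projected-trajectory overlay such a guidance platform draws amounts to extrapolating points along the tracked tip's direction vector. A minimal sketch of that extrapolation (not the authors' system, which adds magnetic tracking hardware and calibration) is:

```python
import numpy as np

def projected_trajectory(tip, direction, depths):
    """Points along the needle's projected path, tip + d * unit(direction),
    as an AR overlay would extrapolate from a tracked tip pose."""
    u = np.asarray(direction, dtype=float)
    u = u / np.linalg.norm(u)                     # normalize the pointing vector
    return np.asarray(tip, dtype=float) + np.outer(np.asarray(depths, float), u)
```

Rendering these points in the US image plane gives the anesthetist the needle's future path rather than only its current tip position.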

  19. Sharing skills: using augmented reality for human-robot collaboration

    NASA Astrophysics Data System (ADS)

    Giesler, Bjorn; Steinhaus, Peter; Walther, Marcus; Dillmann, Ruediger

    2004-05-01

Both stationary 'industrial' and autonomous mobile robots nowadays pervade many workplaces, but human-friendly interaction with them is still very much an experimental subject. One of the reasons for this is that computer and robotic systems are very bad at performing certain tasks well and robustly. A prime example is classification of sensor readings: which part of a 3D depth image is the cup, which the saucer, which the table? These are tasks that humans excel at. To alleviate this problem, we propose a team approach, wherein the robot records sensor data and uses an Augmented Reality (AR) system to present the data to the user directly in the 3D environment. The user can then perform classification decisions directly on the data by pointing, gestures, and speech commands. After the classification has been performed by the user, the robot takes the classified data and matches it to its environment model. As a demonstration of this approach, we present an initial system for creating objects on-the-fly in the environment model. A rotating laser scanner is used to capture a 3D snapshot of the environment. This snapshot is presented to the user as an overlay over his view of the scene. The user classifies unknown objects by pointing at them. The system segments the snapshot according to the user's indications and presents the results of segmentation back to the user, who can then inspect, correct, and enhance them interactively. After a satisfactory result has been reached, the laser scanner can take more snapshots from other angles and use the previous segmentation hints to construct a 3D model of the object.

  20. Augmented Reality Cues and Elderly Driver Hazard Perception

    PubMed Central

    Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew

    2013-01-01

Objective Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods Twenty elderly licensed drivers (mean age = 73 years, SD = 5 years) with a range of cognitive abilities, measured by a speed-of-processing (SOP) composite, participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios; the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining a safe distance behind a lead vehicle. Results AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair the ability to maintain a safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037

  1. Augmented reality using ultra-wideband radar imagery

    NASA Astrophysics Data System (ADS)

    Nguyen, Lam; Koenig, Francois; Sherbondy, Kelly

    2011-06-01

    The U.S. Army Research Laboratory (ARL) has been investigating the utility of ultra-wideband (UWB) synthetic aperture radar (SAR) technology for detecting concealed targets in various applications. We have designed and built a vehicle-based, low-frequency UWB SAR radar for proof-of-concept demonstration in detecting obstacles for autonomous navigation, detecting concealed targets (mines, etc.), and mapping internal building structures to locate enemy activity. Although the low-frequency UWB radar technology offers valuable information to complement other technologies due to its penetration capability, it is very difficult to comprehend the radar imagery and correlate the detection list from the radar with the objects in the real world. Using augmented reality (AR) technology, we can superimpose the information from the radar onto the video image of the real world in real-time. Using this, Soldiers would view the environment and the superimposed graphics (SAR imagery, detection locations, digital map, etc.) via a standard display or a head-mounted display. The superimposed information would be constantly changed and adjusted for every perspective and movement of the user. ARL has been collaborating with ITT Industries to implement an AR system that integrates the video data captured from the real world and the information from the UWB radar. ARL conducted an experiment and demonstrated the real-time geo-registration of the two independent data streams. The integration of the AR sub-system into the radar system is underway. This paper presents the integration of the AR and SAR systems. It shows results that include the real-time embedding of the SAR imagery and other information into the video data stream.

  2. Augmented reality in healthcare education: an integrative review

    PubMed Central

    Zhu, Egui; Hadadgar, Arash; Masiello, Italo

    2014-01-01

Background. The effective development of healthcare competencies poses great educational challenges. A possible approach to providing learning opportunities is the use of augmented reality (AR), where virtual learning experiences can be embedded in a real physical context. The aim of this study was to provide a comprehensive overview of the current state of the art in terms of user acceptance, the AR applications developed and the effect of AR on the development of competencies in healthcare. Methods. We conducted an integrative review. Integrative reviews are the broadest type of research review method, allowing the inclusion of various research designs to more fully understand a phenomenon of concern. Our review included multi-disciplinary research publications in English reported until 2012. Results. 2529 research papers were found in ERIC, CINAHL, Medline, PubMed, Web of Science and SpringerLink. Three qualitative, 20 quantitative and 2 mixed studies were included. Using thematic analysis, we describe three aspects of the findings, related to research, technology and education. The study showed that AR has been applied to a wide range of topics in healthcare education. Furthermore, learners reported acceptance of AR as a learning technology, along with its potential for improving different types of competencies. Discussion. AR is still considered a novelty in the literature. Most of the studies reported early prototypes, and the AR applications designed lacked an explicit pedagogical theoretical framework. Finally, the learning strategies adopted followed the traditional ‘see one, do one, teach one’ style and did not integrate clinical competencies to ensure patient safety. PMID:25071992

  3. Augmented reality graphic interface for upstream dam inspection

    NASA Astrophysics Data System (ADS)

    Cote, Jean; Lavallee, Jean

    1995-12-01

This paper presents a 3D graphic interface for the inspection of cracks along a dam. The monitoring of concrete dams is restricted by the accessibility of the various parts of the structure. Since the upstream face of a dam is not usually exposed, as in our case at Hydro-Québec, systematic and even ad hoc inspection becomes extremely complex. Piloting an ROV (Remotely Operated Vehicle) underwater is like driving in a snowstorm: the view from the camera is similar to the visibility a driver would have in a snowstorm. Sensor fusion has to be performed by the operator, since each sensor is specialized for its task. Even with a 2D positioning system or sonar scan, the approach to the inspection area is very tedious. Because the position and orientation of the vehicle are known, a new 3D interface has been developed using augmented reality. The point of view of the observer can easily be changed while manipulating the ROV. A shared-memory-based server accesses the position data of the ROV and updates the graphics in real time. The graphic environment can also be used to drive the ROV along computer-generated trajectories. A video card will be added to the Silicon Graphics workstation to display the view of the camera fixed to the ROV. This visual feedback will only be available when the ROV is close enough to the dam. The images will be calibrated, since the position of the camera is known. The operator interface also includes a set of stereoscopic cameras, hydrophonic (sound) feedback and imaging tools for measuring cracks.

  4. Augmented reality application utility for aviation maintenance work instruction

    NASA Astrophysics Data System (ADS)

    Pourcho, John Bryan

Current aviation maintenance work instructions do not display information effectively enough to prevent costly errors and safety concerns. Aircraft are complex assemblies of highly interrelated components that confound troubleshooting and can make maintenance procedures difficult (Drury & Gramopadhye, 2001). The sophisticated nature of aircraft maintenance necessitates a revolutionized training intervention for aviation maintenance technicians (United States General Accounting Office, 2003). Quite simply, paper-based job task cards fall short of offering rapid access to technical data and the system or component visualization necessary for working on complex integrated aircraft systems. Possible solutions to this problem include upgraded standards for paper-based task cards and the use of integrated 3D product definition on various mobile platforms (Ropp, Thomas, Lee, Broyles, Lewin, Andreychek, & Nicol, 2013). Previous studies have shown that incorporating 3D graphics in work instructions allows the user to interpret maintenance information more efficiently and accurately (Jackson & Batstone, 2008). For aircraft maintenance workers, mobile 3D model-based task cards could make current paper task card standards obsolete through their ability to deliver relevant, synchronized information to and from the hangar. Unlike the paper task cards and earlier 3D model-based definition task cards currently used in the maintenance industry, newer 3D model-based definition task cards have the potential to be more mobile and accessible. Utilizing augmented reality applications to seamlessly deliver 3D product definition on mobile devices could increase efficiency and accuracy and reduce the mental workload of technicians performing maintenance tasks (Macchiarella, 2004). This proposal will serve as a literature review of the aviation maintenance industry, the spatial ability of maintenance technicians, and benefits of

  5. Integrating Augmented Reality in Higher Education: A Multidisciplinary Study of Student Perceptions

    ERIC Educational Resources Information Center

    Delello, Julie A.; McWhorter, Rochell R.; Camp, Kerri M.

    2015-01-01

Augmented reality (AR) is an emerging technology that blends physical objects with virtual reality. Through the integration of digital and print media, the gap between the "on and offline" worlds is bridged, radically shifting student-computer interaction in the classroom. This research examined the results of a multiple case study on the…

  6. Constructing Liminal Blends in a Collaborative Augmented-Reality Learning Environment

    ERIC Educational Resources Information Center

    Enyedy, Noel; Danish, Joshua A.; DeLiema, David

    2015-01-01

    In vision-based augmented-reality (AR) environments, users view the physical world through a video feed or device that "augments" the display with a graphical or informational overlay. Our goal in this manuscript is to ask "how" and "why" these new technologies create opportunities for learning. We suggest that AR is…

  7. Sensorized Garment Augmented 3D Pervasive Virtual Reality System

    NASA Astrophysics Data System (ADS)

    Gulrez, Tauseef; Tognetti, Alessandro; de Rossi, Danilo

Virtual reality (VR) technology has matured to a point where humans can navigate in virtual scenes; however, providing them with a comfortable, fully immersive role in VR remains a challenge. Currently available sensing solutions do not provide ease of deployment, particularly in the seated position due to sensor placement restrictions over the body, and optical sensing requires a restricted indoor environment to track body movements. Here we present a garment laden with 52 sensors interfaced with VR, which offers both portability and unencumbered user movement in a VR environment. This chapter addresses the systems engineering aspects of our pervasive computing solution, the interactive sensorized 3D VR, and presents initial results and future research directions. Participants navigated in a virtual art gallery using natural body movements that were detected by their wearable sensor shirt and mapped to electrical control signals responsible for VR scene navigation. The initial results are positive and offer many opportunities for use in computationally intelligent man-machine multimedia control.

  8. Towards Determination of Visual Requirements for Augmented Reality Displays and Virtual Environments for the Airport Tower

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2006-01-01

The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14 deg, 28 deg, and 47 deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.

  10. Emerging technology in surgical education: combining real-time augmented reality and wearable computing devices.

    PubMed

    Ponce, Brent A; Menendez, Mariano E; Oladeji, Lasun O; Fryberger, Charles T; Dantuluri, Phani K

    2014-11-01

    The authors describe the first surgical case adopting the combination of real-time augmented reality and wearable computing devices such as Google Glass (Google Inc, Mountain View, California). A 66-year-old man presented to their institution for a total shoulder replacement after 5 years of progressive right shoulder pain and decreased range of motion. Throughout the surgical procedure, Google Glass was integrated with the Virtual Interactive Presence and Augmented Reality system (University of Alabama at Birmingham, Birmingham, Alabama), enabling the local surgeon to interact with the remote surgeon within the local surgical field. Surgery was well tolerated by the patient and early surgical results were encouraging, with an improvement of shoulder pain and greater range of motion. The combination of real-time augmented reality and wearable computing devices such as Google Glass holds much promise in the field of surgery. PMID:25361359

  11. Design and implementation of wearable outdoor augmented reality system for Yuanmingyuan Garden

    NASA Astrophysics Data System (ADS)

    Lei, Jinchao; Liu, Yue; Wang, Yongtian

    2004-03-01

The design and implementation of a wearable computer used in an outdoor AR (Augmented Reality) system for Yuanmingyuan Garden is discussed in this paper. The performance and portability of the wearable computer system are the key factors for the proposed outdoor AR system. The design principles and schemes that must be considered during prototype development are presented. A prototype of the proposed wearable outdoor augmented reality system for Yuanmingyuan Garden has been developed, and its application scenario in the reconstruction of other historic remains is also shown.

  12. Augmented Reality for Teaching Endotracheal Intubation: MR Imaging to Create Anatomically Correct Models

    PubMed Central

    Kerner, Karen F.; Imielinska, Celina; Rolland, Jannick; Tang, Haiying

    2003-01-01

    Clinical procedures have traditionally been taught at the bedside, in the morgue and in the animal lab. Augmented Reality (AR) technology (the merging of virtual reality and real objects or patients) provides a new method for teaching clinical and surgical procedures. Improved patient safety is a major advantage. We describe a system which employs AR technology to teach endotracheal intubation, using the Visible Human datasets, as well as MR images from live patient volunteers. PMID:14728393

  13. Exploring the benefits of augmented reality documentation for maintenance and repair.

    PubMed

    Henderson, Steven; Feiner, Steven

    2011-10-01

    We explore the development of an experimental augmented reality application that provides benefits to professional mechanics performing maintenance and repair tasks in a field setting. We developed a prototype that supports military mechanics conducting routine maintenance tasks inside an armored vehicle turret, and evaluated it with a user study. Our prototype uses a tracked headworn display to augment a mechanic's natural view with text, labels, arrows, and animated sequences designed to facilitate task comprehension, localization, and execution. A within-subject controlled user study examined professional military mechanics using our system to complete 18 common tasks under field conditions. These tasks included installing and removing fasteners and indicator lights, and connecting cables, all within the cramped interior of an armored personnel carrier turret. An augmented reality condition was tested against two baseline conditions: the same headworn display providing untracked text and graphics and a fixed flat panel display representing an improved version of the laptop-based documentation currently employed in practice. The augmented reality condition allowed mechanics to locate tasks more quickly than when using either baseline, and in some instances, resulted in less overall head movement. A qualitative survey showed that mechanics found the augmented reality condition intuitive and satisfying for the tested sequence of tasks. PMID:21041888

  14. Virtual Reality Augmentation for Functional Assessment and Treatment of Stuttering

    ERIC Educational Resources Information Center

    Brundage, Shelley B.

    2007-01-01

    Stuttering characteristics, assessment, and treatment principles present challenges to assessment and treatment that can be addressed with virtual reality (VR) technology. This article describes how VR can be used to assist clinicians in meeting some of these challenges with adults who stutter. A review of current VR research at the Stuttering…

  15. Moving from virtual reality exposure-based therapy to augmented reality exposure-based therapy: a review.

    PubMed

    Baus, Oliver; Bouchard, Stéphane

    2014-01-01

This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and then merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed "safely" to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user's experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, and the potential use of ARET to treat phobias other than small-animal phobias, such as social phobia. PMID:24624073

  17. Research on gesture recognition of augmented reality maintenance guiding system based on improved SVM

    NASA Astrophysics Data System (ADS)

    Zhao, Shouwei; Zhang, Yong; Zhou, Bin; Ma, Dongxi

    2014-09-01

Interaction is one of the key techniques of an augmented reality (AR) maintenance guiding system. Because of the complexity of the maintenance guiding system's image background and the high dimensionality of gesture characteristics, the whole process of gesture recognition is divided into three stages: gesture segmentation, gesture feature modeling and gesture recognition. In the segmentation stage, to resolve the misrecognition of skin-like regions, a segmentation algorithm combining a background model and skin color is adopted to exclude some skin-like regions. In the feature modeling stage, numerous characteristic features are analyzed and acquired, such as structural characteristics, Hu invariant moments and Fourier descriptors. In the recognition stage, a classifier based on the Support Vector Machine (SVM) is introduced into the augmented reality maintenance guiding process. The SVM is a learning method based on statistical learning theory with a solid theoretical foundation and excellent learning ability; it is widely applied in machine learning and has particular advantages in dealing with small samples and with non-linear pattern recognition in high dimensions. Gesture recognition for the augmented reality maintenance guiding system is realized by the SVM after granulation of all the characteristic features. Experimental results from a simulation of number-gesture recognition and its application in an augmented reality maintenance guiding system show that the real-time performance and robustness of gesture recognition can be greatly enhanced by the improved SVM.
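As an illustration of the classification stage described in this abstract, the sketch below trains an off-the-shelf SVM on synthetic seven-dimensional vectors standing in for Hu-moment features. scikit-learn and the toy data are assumptions of this sketch, not details from the paper, whose "improved SVM" is not specified here:

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-ins for Hu-moment feature vectors of two gesture classes.
# In the system described above these would come from segmented hand
# regions; the values here are synthetic.
rng = np.random.default_rng(0)
open_hand = rng.normal(loc=0.0, scale=0.1, size=(20, 7))
fist = rng.normal(loc=1.0, scale=0.1, size=(20, 7))

X = np.vstack([open_hand, fist])
y = np.array([0] * 20 + [1] * 20)   # 0 = open hand, 1 = fist

clf = SVC(kernel="rbf", C=1.0)      # RBF kernel suits small non-linear samples
clf.fit(X, y)

# Classify a new feature vector lying in the "fist" cluster.
print(clf.predict(np.ones((1, 7)))[0])  # -> 1
```

The RBF kernel mirrors the small-sample, high-dimension setting the abstract emphasizes; a real pipeline would compute the Hu moments from the segmented gesture image rather than draw random vectors.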

  18. PRISMA-MAR: An Architecture Model for Data Visualization in Augmented Reality Mobile Devices

    ERIC Educational Resources Information Center

    Gomes Costa, Mauro Alexandre Folha; Serique Meiguins, Bianchi; Carneiro, Nikolas S.; Gonçalves Meiguins, Aruanda Simões

    2013-01-01

    This paper proposes an extension to mobile augmented reality (MAR) environments--the addition of data charts to the more usual text, image and video components. To this purpose, we have designed a client-server architecture including the main necessary modules and services to provide an Information Visualization MAR experience. The server side…

  19. Making the Invisible Visible in Science Museums through Augmented Reality Devices

    ERIC Educational Resources Information Center

    Yoon, Susan A.; Wang, Joyce

    2014-01-01

    Despite the potential of augmented reality (AR) in enabling students to construct new understanding, little is known about how the processes and interactions with the multimedia lead to increased learning. This study seeks to explore the affordances of an AR tool on learning that is focused on the science concept of magnets and magnetic fields.…

  20. Exploring the Effect of Materials Designed with Augmented Reality on Language Learners' Vocabulary Learning

    ERIC Educational Resources Information Center

    Solak, Ekrem; Cakir, Recep

    2015-01-01

    The purpose of this study was to determine the motivational level of the participants in a language classroom towards course materials designed in accordance with augmented reality technology and to identify the correlation between academic achievement and motivational level. 130 undergraduate students from a state-run university in Turkey…

  1. Modeling Augmented Reality Games with Preservice Elementary and Secondary Science Teachers

    ERIC Educational Resources Information Center

    Burton, Erin Peters; Frazier, Wendy; Annetta, Leonard; Lamb, Richard; Cheng, Rebecca; Chmiel, Margaret

    2011-01-01

    Cell phones are ever-present in daily life, yet vastly underused in the formal science classroom. The purpose of this study was to implement a novel learning tool on cell phones, Augmented Reality Games, and determine how the interaction influenced preservice teachers' content knowledge and self-efficacy of cell phone use in schools. Results show…

  2. Augmented Reality M-Learning to Enhance Nursing Skills Acquisition in the Clinical Skills Laboratory

    ERIC Educational Resources Information Center

    Garrett, Bernard M.; Jackson, Cathryn; Wilson, Brian

    2015-01-01

    Purpose: This paper aims to report on a pilot research project designed to explore if new mobile augmented reality (AR) technologies have the potential to enhance the learning of clinical skills in the lab. Design/methodology/approach: An exploratory action-research-based pilot study was undertaken to explore an initial proof-of-concept design in…

  3. Integrated Authoring Tool for Mobile Augmented Reality-Based E-Learning Applications

    ERIC Educational Resources Information Center

    Lobo, Marcos Fermin; Álvarez García, Víctor Manuel; del Puerto Paule Ruiz, María

    2013-01-01

    Learning management systems are increasingly being used to complement classroom teaching and learning and in some instances even replace traditional classroom settings with online educational tools. Mobile augmented reality is an innovative trend in e-learning that is creating new opportunities for teaching and learning. This article proposes a…

  4. Alien Contact!: Exploring Teacher Implementation of an Augmented Reality Curricular Unit

    ERIC Educational Resources Information Center

    Mitchell, Rebecca

    2011-01-01

    This paper reports on findings from a five-teacher, exploratory case study, critically observing their implementation of a technology-intensive, augmented reality (AR) mathematics curriculum unit, along with its paper-based control. The unit itself was intended to promote multiple proportional-reasoning strategies with urban, public, middle school…

  5. Using Augmented Reality in Early Art Education: A Case Study in Hong Kong Kindergarten

    ERIC Educational Resources Information Center

    Huang, Yujia; Li, Hui; Fong, Ricci

    2016-01-01

    Innovation in pedagogy by technology integration in kindergarten classroom has always been a challenge for most teachers. This design-based research aimed to explore the feasibility of using Augmented Reality (AR) technology in early art education with a focus on the gains and pains of this innovation. A case study was conducted in a typical…

  6. Augmented Reality for Teaching Science Vocabulary to Postsecondary Education Students with Intellectual Disabilities and Autism

    ERIC Educational Resources Information Center

    McMahon, Don D.; Cihak, David F.; Wright, Rachel E.; Bell, Sherry Mee

    2016-01-01

    The purpose of this study was to examine the use of an emerging technology called augmented reality to teach science vocabulary words to college students with intellectual disability and autism spectrum disorders. One student with autism and three students with an intellectual disability participated in a multiple probe across behaviors (i.e.,…

  7. Assessing the Effectiveness of Learning Solid Geometry by Using an Augmented Reality-Assisted Learning System

    ERIC Educational Resources Information Center

    Lin, Hao-Chiang Koong; Chen, Mei-Chi; Chang, Chih-Kai

    2015-01-01

    This study integrates augmented reality (AR) technology into teaching activities to design a learning system that assists junior high-school students in learning solid geometry. The following issues are addressed: (1) the relationship between achievements in mathematics and performance in spatial perception; (2) whether system-assisted learning…

  8. Integrating Augmented Reality Technology to Enhance Children's Learning in Marine Education

    ERIC Educational Resources Information Center

    Lu, Su-Ju; Liu, Ying-Chieh

    2015-01-01

    Marine education comprises rich and multifaceted issues. Raising general awareness of marine environments and issues demands the development of new learning materials. This study adapts concepts from digital game-based learning to design an innovative marine learning program integrating augmented reality (AR) technology for lower grade primary…

  9. What Teachers Need to Know about Augmented Reality Enhanced Learning Environments

    ERIC Educational Resources Information Center

    Wasko, Christopher

    2013-01-01

    Augmented reality (AR) enhanced learning environments have been designed to teach a variety of subjects by having learners act like professionals in the field as opposed to students in a classroom. The environments, grounded in constructivist and situated learning theories, place students in a meaningful, non-classroom environment and force them…

  10. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    ERIC Educational Resources Information Center

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-01-01

    The evolution of technologies and the development of new tools with educational purposes are growing up. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on TC and MRN images, dissections and drawings. For ARBOOK evaluation, a…

  11. Examining Young Children's Perception toward Augmented Reality-Infused Dramatic Play

    ERIC Educational Resources Information Center

    Han, Jeonghye; Jo, Miheon; Hyun, Eunja; So, Hyo-jeong

    2015-01-01

    Amid the increasing interest in applying augmented reality (AR) in educational settings, this study explores the design and enactment of an AR-infused robot system to enhance children's satisfaction and sensory engagement with dramatic play activities. In particular, we conducted an exploratory study to empirically examine children's perceptions…

  12. Comparing Virtual and Location-Based Augmented Reality Mobile Learning: Emotions and Learning Outcomes

    ERIC Educational Resources Information Center

    Harley, Jason M.; Poitras, Eric G.; Jarrell, Amanda; Duffy, Melissa C.; Lajoie, Susanne P.

    2016-01-01

    Research on the effectiveness of augmented reality (AR) on learning exists, but there is a paucity of empirical work that explores the role that positive emotions play in supporting learning in such settings. To address this gap, this study compared undergraduate students' emotions and learning outcomes during a guided historical tour using mobile…

  13. Apply an Augmented Reality in a Mobile Guidance to Increase Sense of Place for Heritage Places

    ERIC Educational Resources Information Center

    Chang, Yu-Lien; Hou, Huei-Tse; Pan, Chao-Yang; Sung, Yao-Ting; Chang, Kuo-En

    2015-01-01

    Based on the sense of place theory and the design principles of guidance and interpretation, this study developed an augmented reality mobile guidance system that used a historical geo-context-embedded visiting strategy. This tool for heritage guidance and educational activities enhanced visitor sense of place. This study consisted of 3 visitor…

  14. The Use of Augmented Reality Games in Education: A Review of the Literature

    ERIC Educational Resources Information Center

    Koutromanos, George; Sofos, Alivisos; Avraamidou, Lucy

    2015-01-01

    This paper provides a review of the literature about the use of augmented reality in education and specifically in the context of formal and informal environments. It examines the research that has been conducted up to date on the use of those games through mobile technology devices such as mobile phones and tablets, both in primary and secondary…

  15. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    ERIC Educational Resources Information Center

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-01-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…

  16. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research

    ERIC Educational Resources Information Center

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2013-01-01

    Augmented reality (AR) is currently considered as having potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper firstly has identified two major approaches of utilizing AR technology in science education,…

  17. Wearable computer for mobile augmented-reality-based controlling of an intelligent robot

    NASA Astrophysics Data System (ADS)

    Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino

    2000-10-01

An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or in conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans is needed. This requires a means of controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects with the user's perception of the real world. As computer technology evolves, it is possible to build very small devices with sufficient capabilities for augmented reality applications. We have evaluated existing wearable computers and mobile augmented reality systems in order to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput, and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustaining in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.

  18. Augmented reality image guidance for minimally invasive coronary artery bypass

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Rueckert, Daniel; Hawkes, David; Casula, Roberto; Hu, Mingxing; Pedro, Ose; Zhang, Dong Ping; Penney, Graeme; Bello, Fernando; Edwards, Philip

    2008-03-01

    We propose a novel system for image guidance in totally endoscopic coronary artery bypass (TECAB). A key requirement is the availability of 2D-3D registration techniques that can deal with non-rigid motion and deformation. Image guidance for TECAB is mainly required before the mechanical stabilization of the heart, thus the most dominant source of non-rigid deformation is the motion of the beating heart. To augment the images in the endoscope of the da Vinci robot, we have to find the transformation from the coordinate system of the preoperative imaging modality to the system of the endoscopic cameras. In a first step we build a 4D motion model of the beating heart. Intraoperatively we can use the ECG or video processing to determine the phase of the cardiac cycle. We can then take the heart surface from the motion model and register it to the stereo-endoscopic images of the da Vinci robot using 2D-3D registration methods. We are investigating robust feature tracking and intensity-based methods for this purpose. Images of the vessels available in the preoperative coordinate system can then be transformed to the camera system and projected into the calibrated endoscope view using two video mixers with chroma keying. It is hoped that the augmented view can improve the efficiency of TECAB surgery and reduce the conversion rate to more conventional procedures.

  19. Augmented reality: don't we all wish we lived in one?

    SciTech Connect

    Hayes, Birchard P; Michel, Kelly D; Few, Douglas A; Gertman, David; Le Blanc, Katya

    2010-01-01

    From stereophonic, positional sound to high-definition imagery that is crisp and clean, high fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments creating a mixed reality where relevant data and information augments the real or actual experience in real-time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers a new and innovative design information verification inspection capability, evaluation accuracy, and information gathering capability for nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.

  20. Mobile augmented reality in support of building damage and safety assessment

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kerle, N.; Gerke, M.

    2016-02-01

    Rapid and accurate assessment of the state of buildings in the aftermath of a disaster event is critical for an effective and timely response. For rapid damage assessment of buildings, the utility of remote sensing (RS) technology has been widely researched, with focus on a range of platforms and sensors. However, RS-based approaches still have limitations in assessing structural integrity and the specific damage status of individual buildings. Structural integrity refers to the ability of a building's structure to hold together as a whole. Consequently, ground-based assessment conducted by structural engineers and first responders is still required. This paper demonstrates the concept of mobile augmented reality (mAR) to improve performance of building damage and safety assessment in situ. Mobile AR provides a means to superimpose various types of reference or pre-disaster information (virtual data) on actual post-disaster building data (real buildings). To adopt mobile AR, this study defines a conceptual framework based on the level of complexity (LOC). The framework consists of four LOCs, and for each of these, the data types, required processing steps, AR implementation and use for damage assessment are described. Based on this conceptualization we demonstrate prototypes of mAR for both indoor and outdoor purposes. Finally, we conduct a user evaluation of the prototypes to validate the mAR approach for building damage and safety assessment.

  1. Mobile Augmented Reality in support of building damage and safety assessment

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kerle, N.; Gerke, M.

    2015-04-01

    Rapid and accurate assessment of the state of buildings in the aftermath of a disaster event is critical for an effective and timely response. For rapid damage assessment of buildings, the utility of remote sensing (RS) technology has been widely researched, with focus on a range of platforms and sensors. However, RS-based approaches still have limitations in assessing structural integrity and the specific damage status of individual buildings. Consequently, ground-based assessment conducted by structural engineers and first responders is still required. This paper demonstrates the concept of mobile Augmented Reality (mAR) to improve performance of building damage and safety assessment in situ. Mobile AR provides a means to superimpose various types of reference or pre-disaster information (virtual data) on actual post-disaster building data (real buildings). To adopt mobile AR, this study defines a conceptual framework based on the Level of Complexity (LOC). The framework consists of four LOCs, and for each of these the data types, required processing steps, AR implementation, and use for damage assessment are described. Based on this conceptualization we demonstrate prototypes of mAR for both indoor and outdoor purposes. Finally, we conduct a user evaluation of the prototypes to validate the mAR approach for building damage and safety assessment.

  2. A method for real-time generation of augmented reality work instructions via expert movements

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Bhaskar; Winer, Eliot

    2015-03-01

    Augmented Reality (AR) offers tremendous potential for a wide range of fields including entertainment, medicine, and engineering. AR allows digital models to be integrated with a real scene (typically viewed through a video camera) to provide useful information in a variety of contexts. The difficulty in authoring and modifying scenes is one of the biggest obstacles to widespread adoption of AR. 3D models must be created, textured, oriented and positioned to create the complex overlays viewed by a user. This often requires using multiple software packages in addition to performing model format conversions. In this paper, a new authoring tool is presented which uses a novel method to capture product assembly steps performed by a user with a depth+RGB camera. Through a combination of computer vision and image processing techniques, each individual step is decomposed into objects and actions. The objects are matched to those in a predetermined geometry library and the actions turned into animated assembly steps. The subsequent instruction set is then generated with minimal user input. A proof of concept is presented to establish the method's viability.

  3. Comparing "pick and place" task in spatial Augmented Reality versus non-immersive Virtual Reality for rehabilitation setting.

    PubMed

    Khademi, Maryam; Hondori, Hossein Mousavi; Dodakian, Lucy; Cramer, Steve; Lopes, Cristina V

    2013-01-01

    Introducing computer games to the rehabilitation market led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefit to patients and caregivers, it has inherent limitations, some of which might be solved by replacing it with Augmented Reality (AR). The task of pick-and-place, which is part of many activities of daily living (ADLs), is one of the major affected functions that stroke patients expect to recover. We developed an exercise consisting of moving an object between various points, following a flash light that indicates the next target. The results show superior performance of subjects in the spatial AR versus the non-immersive VR setting. This could be due to the extraneous hand-eye coordination required in VR, which is eliminated in spatial AR. PMID:24110762

  4. Virtual reality, augmented reality, and robotics applied to digestive operative procedures: from in vivo animal preclinical studies to clinical use

    NASA Astrophysics Data System (ADS)

    Soler, Luc; Marescaux, Jacques

    2006-04-01

    Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics are among the most revolutionary. Our work aims at setting up new techniques for detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems making tumor resection or treatment easier with the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on that patient, virtually before the intervention and in reality during the surgical procedure thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide an efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility to improve the therapeutic choice thanks to better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality that provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and verify an optimal, error-free procedure on the virtual patient clone, which will then be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.

  5. Sensor-Aware Recognition and Tracking for Wide-Area Augmented Reality on Mobile Phones.

    PubMed

    Chen, Jing; Cao, Ruochen; Wang, Yongtian

    2015-01-01

    Wide-area registration in outdoor environments on mobile phones is a challenging task in mobile augmented reality fields. We present a sensor-aware large-scale outdoor augmented reality system for recognition and tracking on mobile phones. GPS and gravity information is used to improve the VLAD performance for recognition. A sensor-aware VLAD algorithm, which is self-adaptive to scenes of different scales, is utilized to recognize complex scenes. Because vision-based registration algorithms are fragile and tend to drift, data from inertial sensors and vision are fused together by an extended Kalman filter (EKF) to achieve considerable improvements in tracking stability and robustness. Experimental results show that our method greatly enhances the recognition rate and eliminates tracking jitter. PMID:26690439
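
    The vision/inertial fusion step this abstract describes can be illustrated with a one-dimensional EKF: gyro rates propagate a heading estimate, while occasional absolute heading fixes from vision-based registration correct drift and estimate the gyro bias. This is a hedged sketch under invented noise parameters (`q`, `r_vision`) and a hypothetical two-element state, not the authors' implementation.

```python
import numpy as np

# Illustrative 1-D vision+inertial EKF (not the paper's code).
# State x = [heading, gyro_bias]; P is its 2x2 covariance.

def ekf_step(x, P, gyro_rate, dt, vision_heading=None,
             q=1e-4, r_vision=0.05**2):
    # Predict: integrate the bias-corrected gyro rate.
    F = np.array([[1.0, -dt],
                  [0.0, 1.0]])
    x = np.array([x[0] + (gyro_rate - x[1]) * dt, x[1]])
    P = F @ P @ F.T + q * np.eye(2)
    # Update: correct drift whenever a vision heading fix arrives.
    if vision_heading is not None:
        H = np.array([[1.0, 0.0]])
        y = vision_heading - H @ x        # innovation
        S = H @ P @ H.T + r_vision
        K = P @ H.T / S                   # Kalman gain, shape (2, 1)
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P
```

    Tuning `q` against `r_vision` trades responsiveness against jitter from noisy vision fixes, which is the stability/robustness balance the abstract describes.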

  6. Comparative Evaluation of Monocular Augmented-Reality Display for Surgical Microscopes

    PubMed Central

    Palma, Santiago Rodríguez; Becker, Brian C.; Lobes, Louis A.; Riviere, Cameron N.

    2012-01-01

    Medical augmented reality has undergone great development recently. However, there is a lack of studies to compare quantitatively the different display options available. This paper compares the effects of different graphical overlay systems in a simple micromanipulation task with “soft” visual servoing. We compared positioning accuracy in a real-time visually-guided task using Micron, an active handheld tremor-canceling microsurgical instrument, using three different displays: 2D screen, 3D screen, and microscope with monocular image injection. Tested with novices and an experienced vitreoretinal surgeon, display of virtual cues in the microscope via an augmented reality injection system significantly (p < 0.05) decreased 3D error compared to the 2D and 3D monitors when confounding factors such as magnification level were normalized. PMID:23366164

  7. Sensor-Aware Recognition and Tracking for Wide-Area Augmented Reality on Mobile Phones

    PubMed Central

    Chen, Jing; Cao, Ruochen; Wang, Yongtian

    2015-01-01

    Wide-area registration in outdoor environments on mobile phones is a challenging task in mobile augmented reality fields. We present a sensor-aware large-scale outdoor augmented reality system for recognition and tracking on mobile phones. GPS and gravity information is used to improve the VLAD performance for recognition. A sensor-aware VLAD algorithm, which is self-adaptive to scenes of different scales, is utilized to recognize complex scenes. Because vision-based registration algorithms are fragile and tend to drift, data from inertial sensors and vision are fused together by an extended Kalman filter (EKF) to achieve considerable improvements in tracking stability and robustness. Experimental results show that our method greatly enhances the recognition rate and eliminates tracking jitter. PMID:26690439

  8. Avoiding Focus Shifts in Surgical Telementoring Using an Augmented Reality Transparent Display.

    PubMed

    Andersen, Daniel; Popescu, Voicu; Cabrera, Maria Eugenia; Shanghavi, Aditya; Gomez, Gerardo; Marley, Sherri; Mullis, Brian; Wachs, Juan

    2016-01-01

    Conventional surgical telementoring systems require the trainee to shift focus away from the operating field to a nearby monitor to receive mentor guidance. This paper presents the next generation of telementoring systems. Our system, STAR (System for Telementoring with Augmented Reality) avoids focus shifts by placing mentor annotations directly into the trainee's field of view using augmented reality transparent display technology. This prototype was tested with pre-medical and medical students. Experiments were conducted where participants were asked to identify precise operating field locations communicated to them using either STAR or a conventional telementoring system. STAR was shown to improve accuracy and to reduce focus shifts. The initial STAR prototype only provides an approximate transparent display effect, without visual continuity between the display and the surrounding area. The current version of our transparent display provides visual continuity by showing the geometry and color of the operating field from the trainee's viewpoint. PMID:27046545

  9. Augmented Reality in a Simulated Tower Environment: Effect of Field of View on Aircraft Detection

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Adelstein, Bernard D.; Reisman, Ronald J.; Schmidt-Ott, Joelle R.; Gips, Jonathan; Krozel, Jimmy; Cohen, Malcolm (Technical Monitor)

    2002-01-01

    An optical see-through, augmented reality display was used to study subjects' ability to detect aircraft maneuvering and landing at the Dallas Ft. Worth International airport in an ATC Tower simulation. Subjects monitored the traffic patterns as if from the airport's western control tower. Three binocular fields of view (14 deg, 28 deg and 47 deg) were studied in an independent-groups design to measure the degradation in detection performance associated with the visual field restrictions. In a second experiment the 14 deg and 28 deg fields were presented either with 46% binocular overlap or 100% overlap for separate groups. The near-asymptotic results of the first experiment suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve performance; those of the second experiment suggest that partial binocular overlap is feasible for augmented reality displays such as may be used for ATC tower applications.

  10. Vision-based registration for augmented reality system using monocular and binocular vision

    NASA Astrophysics Data System (ADS)

    Vallerand, Steve; Kanbara, Masayuki; Yokoya, Naokazu

    2003-05-01

    In vision-based augmented reality systems, the relation between the real and virtual worlds needs to be estimated to perform the registration of virtual objects. This paper proposes a vision-based registration method for video see-through augmented reality systems using binocular cameras which increases the quality of the registration performed using three points of a known marker. The originality of this work is the use of both monocular vision-based and stereoscopic vision-based techniques to complete the registration. Also, a method is proposed that corrects the 2D positions of the marker points in the images. The correction improves the registration stability and accuracy of the system. The stability of the registration obtained with the proposed method, with and without the correction method, is compared to that of standard stereoscopic registration.
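
    The marker registration this abstract describes ultimately amounts to recovering a rigid transform that maps the marker's known geometry onto its measured 3-D point positions. A standard least-squares solution is the Kabsch/Procrustes method sketched below (an illustrative stand-in, not the authors' algorithm; it needs at least three non-collinear points, as with the three marker points here).

```python
import numpy as np

# Illustrative Kabsch/Procrustes rigid registration (not the paper's code).

def rigid_transform(model_pts, measured_pts):
    """Least-squares R, t such that measured ≈ R @ model + t (points as rows)."""
    cm = model_pts.mean(axis=0)
    cs = measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cs)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ cm
    return R, t
```

    With noisy stereo measurements the same closed form gives the least-squares fit, which is one reason a 2D point-correction step like the one proposed improves registration stability.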

  11. How to reduce workload--augmented reality to ease the work of air traffic controllers.

    PubMed

    Hofmann, Thomas; König, Christina; Bruder, Ralph; Bergner, Jörg

    2012-01-01

    In the future, air traffic will rise--and so will the workload of the controllers. One of the tasks in the BMWi research project is to determine how to ensure safe air traffic together with a reasonable workload for the air traffic controllers. The goal of this project was to find ways to reduce the workload (and stress) of the controllers to allow safe air traffic, especially at large hub airports, by implementing augmented reality visualization and interaction. PMID:22316878

  12. Tracking and registration method based on vector operation for augmented reality system

    NASA Astrophysics Data System (ADS)

    Gao, Yanfei; Wang, Hengyou; Bian, Xiaoning

    2015-08-01

    Tracking and registration is a key issue for an augmented reality (AR) system. For marker-based AR systems, research focuses on detecting the real-time position and orientation of the camera. In this paper, we describe a method of tracking and registration using vector operations. Our method is shown to be stable and accurate, with good real-time performance.

  13. Technical experience from clinical studies with INPRES and a concept for a miniature augmented reality system

    NASA Astrophysics Data System (ADS)

    Sudra, Gunther; Marmulla, Ruediger; Salb, Tobias; Gockel, Tilo; Eggers, Georg; Giesler, Bjoern; Ghanai, Sassan; Fritz, Dominik; Dillmann, Ruediger; Muehling, Joachim

    2005-04-01

    This paper presents a summary of our technical experience with the INPRES system -- an augmented reality system based upon a tracked see-through head-mounted display. With INPRES, a complete augmented reality solution has been developed that has crucial advantages over previous navigation systems: the surgeon does not need to turn his head from the patient to the computer monitor and vice versa. The system's purpose is to display virtual objects, e.g. cutting trajectories, tumours and risk areas, from computer-based surgical planning systems directly in the surgical site. The INPRES system was evaluated in several patient experiments in craniofacial surgery at the Department of Oral and Maxillofacial Surgery, University of Heidelberg. We discuss the technical advantages as well as the limitations of INPRES and, as a result, present two strategies. On the one hand, we will improve the existing and successful INPRES system with new hardware and a new calibration method to compensate for the stated disadvantages. On the other hand, we will focus on miniaturized augmented reality systems and present a new concept based on fibre optics. This new system should be easily attachable to surgical instruments and capable of projecting small structures. It consists of a light source, a miniature TFT display, a fibre optic cable and a tool grip. Compared to established projection systems, it can project into areas that are accessible only by a narrow path. No wide surgical exposure of the region is necessary for the use of augmented reality.

  14. Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment

    PubMed Central

    López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.

    2013-01-01

    Purpose. This work presents the protocol carried out in the development and validation of an augmented reality system which was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that this system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used; the other was the control experiment, in which the system was not used. The type of operation selected for all cases was a cholecystectomy due to the low degree of complexity and complications before, during, and after the surgery. The technique used in the placement of trocars was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained from twenty-four patients (randomly assigned in each experiment) were involved in these experiments. The final results show an improvement in accuracy and variability of 33% and 63%, respectively, in comparison to traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293

  15. Augmented reality three-dimensional object visualization and recognition with axially distributed sensing.

    PubMed

    Markman, Adam; Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-01-15

    An augmented reality (AR) smartglass display combines real-world scenes with digital information enabling the rapid growth of AR-based applications. We present an augmented reality-based approach for three-dimensional (3D) optical visualization and object recognition using axially distributed sensing (ADS). For object recognition, the 3D scene is reconstructed, and feature extraction is performed by calculating the histogram of oriented gradients (HOG) of a sliding window. A support vector machine (SVM) is then used for classification. Once an object has been identified, the 3D reconstructed scene with the detected object is optically displayed in the smartglasses allowing the user to see the object, remove partial occlusions of the object, and provide critical information about the object such as 3D coordinates, which are not possible with conventional AR devices. To the best of our knowledge, this is the first report on combining axially distributed sensing with 3D object visualization and recognition for applications to augmented reality. The proposed approach can have benefits for many applications, including medical, military, transportation, and manufacturing. PMID:26766698
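
    The recognition pipeline in this abstract (HOG features over a sliding window, then an SVM) can be sketched with a minimal, numpy-only HOG descriptor. Cell size, bin count, and the normalization below are illustrative choices, not the paper's parameters; a trained SVM (e.g., scikit-learn's `LinearSVC`) would consume these vectors for classification.

```python
import numpy as np

# Minimal HOG-style descriptor for one grayscale window (illustrative).
# Gradient orientations are binned per cell, weighted by gradient magnitude.

def hog_descriptor(img, cell=8, bins=9):
    gy, gx = np.gradient(img.astype(float))          # row/column gradients
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0     # unsigned orientation
    h, w = img.shape
    feats = []
    for r in range(0, h - cell + 1, cell):
        for c in range(0, w - cell + 1, cell):
            m = mag[r:r+cell, c:c+cell].ravel()
            a = ang[r:r+cell, c:c+cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            hist /= np.linalg.norm(hist) + 1e-9      # simple per-cell normalization
            feats.append(hist)
    return np.concatenate(feats)
```

    Each sliding-window position yields one such feature vector; the SVM's decision function then labels the window as object or background.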

  16. Does Augmented Reality Affect High School Students' Learning Outcomes in Chemistry?

    NASA Astrophysics Data System (ADS)

    Renner, Jonathan Christopher

    Some teens may prefer a self-directed, constructivist, and technological approach to learning rather than traditional classroom instruction. If such a preference can be demonstrated, educators may adjust their teaching methodology. The guiding research question for this study focused on how augmented reality affects high school students' learning outcomes in chemistry, as measured by a pretest and posttest methodology, while ensuring that the individual outcomes were not the result of group collaboration. This study employed a quantitative, quasi-experimental design with a comparison group and an experimental group, and inferential statistical analysis was employed. The study was conducted at a high school in southwest Colorado. Eighty-nine respondents returned completed and signed consent forms, and 78 participants completed the study. Results demonstrated that augmented reality instruction caused posttest scores to increase significantly compared to pretest scores, but it was not as effective as traditional classroom instruction. Scores improved under both types of instruction; therefore, more research is needed in this area. The present study was the first quantitative experiment controlling for individual learning to validate augmented reality using mobile handheld digital devices that affected individual students' learning outcomes without group collaboration. This topic is important to the field of education, as it may help educators understand how students learn and may also change the way students are taught.
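
    A pretest/posttest comparison like the one described here is commonly analyzed with a paired t test on gain scores. The stdlib sketch below uses invented scores purely for illustration (they are not the study's data):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic on gain scores (post - pre) and degrees of freedom."""
    gains = [b - a for a, b in zip(pre, post)]
    n = len(gains)
    se = stdev(gains) / math.sqrt(n)   # standard error of the mean gain
    return mean(gains) / se, n - 1

# Hypothetical chemistry scores for eight students (not the study's data).
pre  = [52, 61, 47, 70, 58, 66, 49, 55]
post = [60, 68, 55, 74, 63, 73, 54, 62]
t, df = paired_t(pre, post)
```

    A t value beyond the critical value for the given degrees of freedom (about 2.36 at the .05 level for df = 7) indicates a significant pre-to-post gain.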

  17. 3D interactive augmented reality-enhanced digital learning systems for mobile devices

    NASA Astrophysics Data System (ADS)

    Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie

    2013-03-01

    With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology. Enhancement of UX can be beneficial for digital learning systems. There is existing research based on AR targeting the design of e-learning systems. However, none of this work focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components: markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX in digital learning can be greatly improved with the adoption of the proposed IARL systems.

  18. Augmented reality and cone beam CT guidance for transoral robotic surgery.

    PubMed

    Liu, Wen P; Richmon, Jeremy D; Sorger, Jonathan M; Azizian, Mahdi; Taylor, Russell H

    2015-09-01

    In transoral robotic surgery, preoperative image data do not reflect large deformations of the operative workspace from perioperative setup. To address this challenge, in this study we explore image guidance with cone beam computed tomographic angiography to guide the dissection of critical vascular landmarks and resection of base-of-tongue neoplasms with adequate margins for transoral robotic surgery. We identify critical vascular landmarks from perioperative C-arm imaging to augment the stereoscopic view of a da Vinci Si robot, in addition to incorporating visual feedback from relative tool positions. Experiments resecting base-of-tongue mock tumors were conducted on a series of ex vivo and in vivo animal models comparing the proposed workflow for video augmentation to standard non-augmented practice and alternative, fluoroscopy-based image guidance. Accurate identification of registered augmented critical anatomy during controlled arterial dissection and en bloc mock tumor resection was possible with the augmented reality system. The proposed image-guided robotic system also achieved improved resection ratios of mock tumor margins (1.00) when compared to control scenarios (0.0) and alternative methods of image guidance (0.58). The experimental results show the feasibility of the proposed workflow and the advantages of cone beam computed tomography image guidance through video augmentation of the primary stereo endoscopy as compared to control and alternative navigation methods. PMID:26531203

  19. Impact of Soft Tissue Heterogeneity on Augmented Reality for Liver Surgery.

    PubMed

    Haouchine, Nazim; Cotin, Stephane; Peterlik, Igor; Dequidt, Jeremie; Lopez, Mario Sanz; Kerrien, Erwan; Berger, Marie-Odile

    2015-05-01

    This paper presents a method for real-time augmented reality of internal liver structures during minimally invasive hepatic surgery. Vessels and tumors computed from pre-operative CT scans can be overlaid onto the laparoscopic view for surgery guidance. Compared to current methods, our method is able to locate the in-depth positions of the tumors based on partial three-dimensional liver tissue motion using a real-time biomechanical model. This model makes it possible to properly handle the motion of internal structures even in the case of anisotropic or heterogeneous tissues, as is the case for the liver and many anatomical structures. Experiments conducted on a liver phantom measure the accuracy of the augmentation, while real-time augmentation of an in vivo human liver during actual surgery shows the benefits of such an approach for minimally invasive surgery. PMID:26357206

  20. Augmented reality system for MR-guided interventions: phantom studies and first animal test

    NASA Astrophysics Data System (ADS)

    Vogt, Sebastian; Wacker, Frank; Khamene, Ali; Elgort, Daniel R.; Sielhorst, Tobias; Niemann, Heinrich; Duerk, Jeff; Lewin, Jonathan S.; Sauer, Frank

    2004-05-01

    We developed an augmented reality navigation system for MR-guided interventions. A head-mounted display provides in real time a stereoscopic video view of the patient, which is augmented with three-dimensional medical information to perform MR-guided needle placement procedures. Besides the MR image information, we augment the scene with 3D graphics representing a forward extension of the needle and the needle itself. During insertion, the needle can be observed virtually at its actual location in real time, supporting the interventional procedure in an efficient and intuitive way. In this paper we report quantitative results of AR-guided needle placement procedures on gel phantoms with embedded targets of 12 mm and 6 mm diameter; we furthermore evaluate our first animal experiment involving needle insertion into deep-lying anatomical structures of a pig.

  1. Planning, simulation, and augmented reality for robotic cardiac procedures: The STARS system of the ChIR team.

    PubMed

    Coste-Manière, Eve; Adhami, Louaï; Mourgues, Fabien; Carpentier, Alain

    2003-04-01

    This paper presents STARS (Simulation and Transfer Architecture for Robotic Surgery), a versatile system that aims at enhancing minimally invasive robotic surgery through patient-dependent optimized planning, realistic simulation, safe supervision, and augmented reality. The underlying architecture of the proposed approach is presented, then each component is detailed. An experimental validation is conducted on a dog for a coronary bypass intervention using the Da Vinci(TM) surgical system, focusing on planning, registration, and augmented reality trials. PMID:12838484

  2. In Vivo versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial.

    PubMed

    Botella, Cristina; Pérez-Ara, M Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María

    2016-01-01

Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. "One-session treatment" guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants' expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants in the in vivo exposure condition considered the treatment more useful for their problem, whereas participants in the Augmented Reality exposure condition considered it less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and is well accepted by the participants. PMID:26886423

  3. In Vivo versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial

    PubMed Central

    Botella, Cristina; Pérez-Ara, M. Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María

    2016-01-01

Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. “One-session treatment” guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants’ expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants in the in vivo exposure condition considered the treatment more useful for their problem, whereas participants in the Augmented Reality exposure condition considered it less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and is well accepted by the participants. PMID:26886423

  4. Augmented reality for the assessment of children's spatial memory in real settings.

    PubMed

    Juan, M-Carmen; Mendez-Lopez, Magdalena; Perez-Hernandez, Elena; Albiol-Perez, Sergio

    2014-01-01

    Short-term memory can be defined as the capacity for holding a small amount of information in mind in an active state for a short period of time. Although some instruments have been developed to study spatial short-term memory in real environments, there are no instruments that are specifically designed to assess visuospatial short-term memory in an attractive way to children. In this paper, we present the ARSM (Augmented Reality Spatial Memory) task, the first Augmented Reality task that involves a user's movement to assess spatial short-term memory in healthy children. The experimental procedure of the ARSM task was designed to assess the children's skill to retain visuospatial information. They were individually asked to remember the real place where augmented reality objects were located. The children (N = 76) were divided into two groups: preschool (5-6 year olds) and primary school (7-8 year olds). We found a significant improvement in ARSM task performance in the older group. The correlations between scores for the ARSM task and traditional procedures were significant. These traditional procedures were the Dot Matrix subtest for the assessment of visuospatial short-term memory of the computerized AWMA-2 battery and a parent's questionnaire about a child's everyday spatial memory. Hence, we suggest that the ARSM task has high verisimilitude with spatial short-term memory skills in real life. In addition, we evaluated the ARSM task's usability and perceived satisfaction. The study revealed that the younger children were more satisfied with the ARSM task. This novel instrument could be useful in detecting visuospatial short-term difficulties that affect specific developmental navigational disorders and/or school academic achievement. PMID:25438146

  5. Augmented Reality for the Assessment of Children's Spatial Memory in Real Settings

    PubMed Central

    Juan, M.-Carmen; Mendez-Lopez, Magdalena; Perez-Hernandez, Elena; Albiol-Perez, Sergio

    2014-01-01

    Short-term memory can be defined as the capacity for holding a small amount of information in mind in an active state for a short period of time. Although some instruments have been developed to study spatial short-term memory in real environments, there are no instruments that are specifically designed to assess visuospatial short-term memory in an attractive way to children. In this paper, we present the ARSM (Augmented Reality Spatial Memory) task, the first Augmented Reality task that involves a user's movement to assess spatial short-term memory in healthy children. The experimental procedure of the ARSM task was designed to assess the children's skill to retain visuospatial information. They were individually asked to remember the real place where augmented reality objects were located. The children (N = 76) were divided into two groups: preschool (5–6 year olds) and primary school (7–8 year olds). We found a significant improvement in ARSM task performance in the older group. The correlations between scores for the ARSM task and traditional procedures were significant. These traditional procedures were the Dot Matrix subtest for the assessment of visuospatial short-term memory of the computerized AWMA-2 battery and a parent's questionnaire about a child's everyday spatial memory. Hence, we suggest that the ARSM task has high verisimilitude with spatial short-term memory skills in real life. In addition, we evaluated the ARSM task's usability and perceived satisfaction. The study revealed that the younger children were more satisfied with the ARSM task. This novel instrument could be useful in detecting visuospatial short-term difficulties that affect specific developmental navigational disorders and/or school academic achievement. PMID:25438146

  6. Real-time self-calibration of a tracked augmented reality display

    NASA Astrophysics Data System (ADS)

    Baum, Zachary; Lasso, Andras; Ungi, Tamas; Fichtinger, Gabor

    2016-03-01

    PURPOSE: Augmented reality systems have been proposed for image-guided needle interventions but they have not become widely used in clinical practice due to restrictions such as limited portability, low display refresh rates, and tedious calibration procedures. We propose a handheld tablet-based self-calibrating image overlay system. METHODS: A modular handheld augmented reality viewbox was constructed from a tablet computer and a semi-transparent mirror. A consistent and precise self-calibration method, without the use of any temporary markers, was designed to achieve an accurate calibration of the system. Markers attached to the viewbox and patient are simultaneously tracked using an optical pose tracker to report the position of the patient with respect to a displayed image plane that is visualized in real-time. The software was built using the open-source 3D Slicer application platform's SlicerIGT extension and the PLUS toolkit. RESULTS: The accuracy of the image overlay with image-guided needle interventions yielded a mean absolute position error of 0.99 mm (95th percentile 1.93 mm) in-plane of the overlay and a mean absolute position error of 0.61 mm (95th percentile 1.19 mm) out-of-plane. This accuracy is clinically acceptable for tool guidance during various procedures, such as musculoskeletal injections. CONCLUSION: A self-calibration method was developed and evaluated for a tracked augmented reality display. The results show potential for the use of handheld image overlays in clinical studies with image-guided needle interventions.
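The in-plane versus out-of-plane error breakdown reported above can be illustrated with a short sketch. This is illustrative only; the function and its inputs are ours, not from the paper. Given a 3D position error and the unit normal of the displayed image plane, the error splits into a projection onto the normal (out-of-plane) and the residual within the plane:

```python
import math

def decompose_error(error_vec, plane_normal):
    """Split a 3D position error into out-of-plane and in-plane components.

    error_vec: (x, y, z) error between planned and actual tool position.
    plane_normal: unit normal of the displayed image plane.
    Returns (in_plane_magnitude, out_of_plane_magnitude).
    """
    # Out-of-plane component: projection of the error onto the plane normal.
    out_of_plane = sum(e * n for e, n in zip(error_vec, plane_normal))
    # In-plane component: what remains after removing the normal projection.
    in_plane_vec = [e - out_of_plane * n for e, n in zip(error_vec, plane_normal)]
    in_plane = math.sqrt(sum(c * c for c in in_plane_vec))
    return in_plane, abs(out_of_plane)

# Example: a 3-4-5 error vector measured against a z-facing image plane
in_p, out_p = decompose_error((3.0, 4.0, 5.0), (0.0, 0.0, 1.0))
```

Reporting the two components separately, as the abstract does, matters because the overlay only visualizes one image plane: in-plane error is what the user sees, while out-of-plane error is invisible in the overlay itself.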

  7. A Novel Augmented Reality Navigation System for Endoscopic Sinus and Skull Base Surgery: A Feasibility Study

    PubMed Central

    Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei

    2016-01-01

    Objective To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. Materials and Methods In this study we performed a head phantom and cadaver experiment to determine the display effect and accuracy of our navigational system. We compared cadaver head-based simulated operations, the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system to conventional navigation systems. Results The navigation system developed in this study has a novel display mode capable of fusing endoscopic images to three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which met the accepted standards of a navigation system used for nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. Conclusion The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigational system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigational system has potential to increase the quality of surgeries. In addition, the augmented reality navigational system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon’s skills and knowledge, not as a substitute. PMID:26757365

  8. An Augmented Reality Nanomanipulator for Learning Nanophysics: The "NanoLearner" Platform

    NASA Astrophysics Data System (ADS)

    Marchi, Florence; Marliere, Sylvain; Florens, Jean Loup; Luciani, Annie; Chevrier, Joel

The work focuses on the description and evaluation of an augmented reality nanomanipulator, the "NanoLearner" platform, used as an educational tool in practical nanophysics classes. Through virtual reality associated with multisensory renderings, students are immersed in the nanoworld, where they can interact in real time with a sample surface or an object using their senses of hearing, sight and touch. The role of each sensorial rendering in the understanding and control of the "approach-retract" interaction has been determined thanks to statistical studies obtained during the practical works. Finally, we present two extensions of the use of this innovative tool: investigating nano effects in living organisms, and allowing the general public to gain a natural understanding of nanophenomena.

  9. Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.

    PubMed

    Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson

    2016-04-01

We proposed a switchable beam steering device with a cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts incident circularly polarized light into the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading 2-3 CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we proposed two approaches to obtain the same diffraction angle for full-color displays based on red, green, and blue LEDs. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for the emerging VR/AR displays. PMID:27137019
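The diffraction angles above follow the standard grating equation, sin θ = mλ/Λ. The sketch below uses illustrative values chosen by us, not the paper's, to show why red, green, and blue light diffract at different angles from the same grating period, which is the color-dispersion problem the authors set out to suppress:

```python
import math

def diffraction_angle_deg(wavelength_nm, period_nm, order=1):
    """Diffraction angle from the grating equation sin(theta) = m * lambda / period."""
    s = order * wavelength_nm / period_nm
    if abs(s) > 1.0:
        raise ValueError("evanescent order: no propagating diffracted beam")
    return math.degrees(math.asin(s))

# Illustrative: a hypothetical 2-micron-period grating with green and red LEDs
theta_green = diffraction_angle_deg(532, 2000)   # green, ~15.4 degrees
theta_red = diffraction_angle_deg(635, 2000)     # red diffracts more strongly
```

Because the angle grows with wavelength, an uncompensated grating steers red further than green and blue, smearing a full-color image; matching the angles across the three LED wavelengths is exactly the design constraint the abstract mentions.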

  10. Building a hybrid patient's model for augmented reality in surgery: a registration problem.

    PubMed

    Lavallée, S; Cinquin, P; Szeliski, R; Peria, O; Hamadeh, A; Champleboux, G; Troccaz, J

    1995-03-01

In the field of Augmented Reality in Surgery, building a hybrid patient model, i.e. merging all the data and systems available for a given application, is a difficult but crucial technical problem. The purpose is to merge all the data that constitute the patient model with the reality of the surgery, i.e. the surgical tools and feedback devices. In this paper, we first develop this concept and show that this construction reduces to a problem of registration between various sensor data, and we detail a general framework for registration. The state of the art in this domain is presented. Finally, we show results that we have obtained using a method based on the use of anatomical reference surfaces. We show that in many clinical cases, registration is only possible through the use of internal patient structures. PMID:7554833
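The rigid form of the registration problem described here, aligning corresponding points from two sensors or models, is classically solved with a least-squares fit such as the Kabsch algorithm. The sketch below is a generic illustration of that technique, not the authors' surface-based method:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch): find R, t with dst ~= R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D points.
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if the determinant is negative.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Recover a known transform from synthetic "anatomical surface" points
rng = np.random.default_rng(0)
pts = rng.normal(size=(30, 3))
angle = np.deg2rad(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([5.0, -2.0, 1.0])
R_est, t_est = rigid_register(pts, pts @ R_true.T + t_true)
```

In practice, correspondences between a surface model and intraoperative sensor data are not known in advance, which is why the paper's framework (and methods like ICP built on this same closed-form step) must alternate matching and fitting.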

  11. Using Augmented Reality to Elicit Pretend Play for Children with Autism.

    PubMed

    Zhen Bai; Blackwell, Alan F; Coulouris, George

    2015-05-01

    Children with autism spectrum condition (ASC) suffer from deficits or developmental delays in symbolic thinking. In particular, they are often found lacking in pretend play during early childhood. Researchers believe that they encounter difficulty in generating and maintaining mental representation of pretense coupled with the immediate reality. We have developed an interactive system that explores the potential of Augmented Reality (AR) technology to visually conceptualize the representation of pretense within an open-ended play environment. Results from an empirical study involving children with ASC aged 4 to 7 demonstrated a significant improvement of pretend play in terms of frequency, duration and relevance using the AR system in comparison to a non computer-assisted situation. We investigated individual differences, skill transfer, system usability and limitations of the proposed AR system. We discuss design guidelines for future AR systems for children with ASC and other pervasive developmental disorders. PMID:26357207

  12. Social Gaming and Learning Applications: A Driving Force for the Future of Virtual and Augmented Reality?

    NASA Astrophysics Data System (ADS)

    Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang

    Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent a widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.

  13. Enhancing a Multi-body Mechanism with Learning-Aided Cues in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Singh Sidhu, Manjit

    2013-06-01

Augmented Reality (AR) is a potential area of research for education, covering issues such as tracking and calibration, and realistic rendering of virtual objects. The ability to augment the real world with virtual information has opened the possibility of using AR technology in areas such as education and training as well. In the domain of Computer Aided Learning (CAL), researchers have long been looking into enhancing the effectiveness of the teaching and learning process by providing cues that could assist learners to better comprehend the materials presented. Although a number of works have examined the effectiveness of learning-aided cues, none has addressed this issue for AR-based learning solutions. This paper discusses the design and model of AR-based software that uses visual cues to enhance the learning process, and the perception results obtained for the cues.

  14. Conceiving a specific holographic combiner for an augmented reality HMD dedicated to surgical applications

    NASA Astrophysics Data System (ADS)

    Sittler, Gilles; Twardowski, Patrice; Fasquel, Jean-Baptiste; Fontaine, Joël

    2006-04-01

In this paper, we present the conception of a holographic combiner for an augmented reality Head Mounted Display (HMD) dedicated to surgical applications. The recording of this holographic component has been performed at the Laboratoire des Systemes Photoniques (LSP) in Strasbourg, France. We present two different approaches for recording such a component: one using plane waves, and the other using spherical waves. The setup for the first approach has been developed and built, so that measurements of the diffraction efficiency can be shown. For the other way of recording the holographic combiner, we have performed numerical simulations to find the recording setup that best fits our specifications.

  15. GyroWand: An Approach to IMU-Based Raycasting for Augmented Reality.

    PubMed

    Hincapié-Ramos, Juan David; Özacar, Kasim; Irani, Pourang P; Kitamura, Yoshifumi

    2016-01-01

    Optical see-through head-mounted displays enable augmented reality (AR) applications that display virtual objects overlaid on the real world. At the core of this new generation of devices are low-cost tracking technologies that allow us to interpret users' motion in the real world in relation to the virtual content for the purposes of navigation and interaction. The advantages of pervasive tracking come at the cost of limiting interaction possibilities, however. To address these challenges the authors introduce GyroWand, a raycasting technique for AR HMDs using inertial measurement unit (IMU) rotational data from a handheld controller. PMID:26960031
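The core of an IMU-based raycasting technique like the one described is rotating a reference "forward" vector by the controller's orientation quaternion to get the ray direction. The minimal sketch below uses conventions chosen by us ((w, x, y, z) component order, -z as forward), not necessarily GyroWand's:

```python
import math

def imu_ray_direction(q, forward=(0.0, 0.0, -1.0)):
    """Rotate a reference 'forward' vector by an IMU orientation quaternion.

    q: unit quaternion (w, x, y, z) from the handheld controller's IMU.
    Returns the world-space ray direction for raycasting.
    Uses the identity v' = v + w*t + qv x t, with t = 2 * (qv x v).
    """
    w, x, y, z = q
    qv = (x, y, z)

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    t = tuple(2.0 * c for c in cross(qv, forward))
    u = cross(qv, t)
    return tuple(f + w * ti + ui for f, ti, ui in zip(forward, t, u))

# Identity orientation points straight ahead; a 90-degree yaw about +y turns the ray to -x
ray_identity = imu_ray_direction((1.0, 0.0, 0.0, 0.0))
half = math.pi / 4
ray_yaw90 = imu_ray_direction((math.cos(half), 0.0, math.sin(half), 0.0))
```

Note that a bare IMU gives orientation only, no position, which is why such techniques must anchor the ray origin elsewhere (e.g. at the HMD) and handle drift; those are the interaction challenges the abstract alludes to.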

  16. AUVA - Augmented Reality Empowers Visual Analytics to explore Medical Curriculum Data.

    PubMed

    Nifakos, Sokratis; Vaitsis, Christos; Zary, Nabil

    2015-01-01

Medical curriculum data play a key role in the structure and organization of medical programs in universities around the world. The effective processing and usage of these data may improve the educational environment of medical students. As a consequence, the new generation of health professionals would have improved skills compared to previous ones. This study introduces the process of enhancing curriculum data through the use of augmented reality technology as a management and presentation tool. The final goal is to enrich the information presented from a visual analytics approach applied to medical curriculum data and to keep the complexity of understanding these data low. PMID:25991196

  17. Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach

    PubMed Central

    Tian, Yuan; Guan, Tao; Wang, Cheng

    2010-01-01

    To produce a realistic augmentation in Augmented Reality, the correct relative positions of real objects and virtual objects are very important. In this paper, we propose a novel real-time occlusion handling method based on an object tracking approach. Our method is divided into three steps: selection of the occluding object, object tracking and occlusion handling. The user selects the occluding object using an interactive segmentation method. The contour of the selected object is then tracked in the subsequent frames in real-time. In the occlusion handling step, all the pixels on the tracked object are redrawn on the unprocessed augmented image to produce a new synthesized image in which the relative position between the real and virtual object is correct. The proposed method has several advantages. First, it is robust and stable, since it remains effective when the camera is moved through large changes of viewing angles and volumes or when the object and the background have similar colors. Second, it is fast, since the real object can be tracked in real-time. Last, a smoothing technique provides seamless merging between the augmented and virtual object. Several experiments are provided to validate the performance of the proposed method. PMID:22319278
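The final compositing step described above, redrawing the tracked object's pixels over the augmented image, can be sketched as a simple masked copy. This is illustrative code only; the paper's interactive segmentation, contour tracking, and seam smoothing are not reproduced here:

```python
import numpy as np

def composite_with_occlusion(real_frame, augmented_frame, occluder_mask):
    """Redraw tracked occluding-object pixels over the augmented image.

    real_frame, augmented_frame: (H, W, 3) uint8 images.
    occluder_mask: (H, W) bool array, True where the tracked real object is.
    Virtual content is hidden wherever the real object should appear in front.
    """
    out = augmented_frame.copy()
    out[occluder_mask] = real_frame[occluder_mask]
    return out

# Toy example: a virtual overlay fills the frame; a 2x2 occluder punches through
real = np.zeros((4, 4, 3), dtype=np.uint8)          # dark real object everywhere
aug = np.full((4, 4, 3), 255, dtype=np.uint8)       # bright virtual overlay
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                               # tracked object region
result = composite_with_occlusion(real, aug, mask)
```

The hard part, which the paper addresses, is keeping `occluder_mask` accurate frame to frame under camera motion; once the mask is right, the compositing itself is this cheap.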

  18. The Effects of Augmented Reality-based Otago Exercise on Balance, Gait, and Falls Efficacy of Elderly Women

    PubMed Central

    Yoo, Ha-na; Chung, EunJung; Lee, Byoung-Hee

    2013-01-01

    [Purpose] The purpose of this study was to determine the effects of augmented reality-based Otago exercise on balance, gait, and falls efficacy of elderly women. [Subjects] The subjects were 21 elderly women, who were randomly divided into two groups: an augmented reality-based Otago exercise group of 10 subjects and an Otago exercise group of 11 subjects. [Methods] All subjects were evaluated for balance (Berg Balance Scale, BBS), gait parameters (velocity, cadence, step length, and stride length), and falls efficacy. Within 12 weeks, Otago exercise for muscle strengthening and balance training was conducted three times, for a period of 60 minutes each, and subjects in the experimental group performed augmented reality-based Otago exercise. [Results] Following intervention, the augmented reality-based Otago exercise group showed significant increases in BBS, velocity, cadence, step length (right side), stride length (right side and left side) and falls efficacy. [Conclusion] The results of this study suggest the feasibility and suitability of this augmented reality-based Otago exercise for elderly women. PMID:24259856

  19. Using augmented reality as a clinical support tool to assist combat medics in the treatment of tension pneumothoraces.

    PubMed

    Wilson, Kenneth L; Doswell, Jayfus T; Fashola, Olatokunbo S; Debeatham, Wayne; Darko, Nii; Walker, Travelyan M; Danner, Omar K; Matthews, Leslie R; Weaver, William L

    2013-09-01

This study aimed to extrapolate potential roles of augmented reality goggles as a clinical support tool assisting in the reduction of preventable causes of death on the battlefield. Our pilot study was designed to improve medic performance in accurately placing a large-bore catheter to release tension pneumothorax (prehospital setting) while using augmented reality goggles. Thirty-four preclinical medical students recruited from Morehouse School of Medicine performed needle decompressions on human cadaver models after hearing a brief training lecture on tension pneumothorax management. Clinical vignettes identifying cadavers as having life-threatening tension pneumothoraces as a consequence of improvised explosive device attacks were used. The study group (n = 13) performed needle decompression using augmented reality goggles, whereas the control group (n = 21) relied solely on memory from the lecture. The two groups were compared according to their ability to accurately complete the steps required to decompress a tension pneumothorax. The medical students using augmented reality goggle support were able to treat the tension pneumothorax on the human cadaver models more accurately than the students relying on their memory (p < 0.008). Although the augmented reality group required more time to complete the needle decompression intervention (p = 0.0684), this did not reach statistical significance. PMID:24005547

  20. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    NASA Astrophysics Data System (ADS)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye-tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.

  1. Generic precise augmented reality guiding system and its calibration method based on 3D virtual model.

    PubMed

    Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua

    2016-05-30

Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. Meanwhile, the proposed system is realized with a digital projector, and the general back-projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. The corresponding calibration method is also designed for the proposed system to obtain the parameters of the projector. To validate the proposed back-projection model, the coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final indication accuracy of the system is verified with a subpixel pattern projection technique. PMID:27410124

  2. Endoscopic feature tracking for augmented-reality assisted prosthesis selection in mitral valve repair

    NASA Astrophysics Data System (ADS)

    Engelhardt, Sandy; Kolb, Silvio; De Simone, Raffaele; Karck, Matthias; Meinzer, Hans-Peter; Wolf, Ivo

    2016-03-01

    Mitral valve annuloplasty describes a surgical procedure where an artificial prosthesis is sutured onto the anatomical structure of the mitral annulus to re-establish the valve's functionality. Choosing an appropriate commercially available ring size and shape is a difficult decision the surgeon has to make intraoperatively according to his experience. In our augmented-reality framework, digitalized ring models are superimposed onto endoscopic image streams without using any additional hardware. To place the ring model on the proper position within the endoscopic image plane, a pose estimation is performed that depends on the localization of sutures placed by the surgeon around the leaflet origins and punctured through the stiffer structure of the annulus. In this work, the tissue penetration points are tracked by the real-time capable Lucas Kanade optical flow algorithm. The accuracy and robustness of this tracking algorithm is investigated with respect to the question whether outliers influence the subsequent pose estimation. Our results suggest that optical flow is very stable for a variety of different endoscopic scenes and tracking errors do not affect the position of the superimposed virtual objects in the scene, making this approach a viable candidate for annuloplasty augmented reality-enhanced decision support.
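A single Lucas-Kanade step, the building block of the optical-flow tracker evaluated above, can be sketched as a least-squares solve on image gradients. This is a global-translation toy version under our own assumptions, not the paper's per-point pyramidal tracker:

```python
import numpy as np

def lucas_kanade_shift(frame0, frame1):
    """Estimate a single global (dx, dy) translation between two frames.

    One least-squares Lucas-Kanade step over the whole window:
    minimizes sum(Ix*dx + Iy*dy + It)^2, valid for small sub-pixel motion.
    """
    Iy, Ix = np.gradient(frame0)          # spatial gradients (row = y, col = x)
    It = frame1 - frame0                  # temporal derivative
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = np.array([np.sum(Ix * It), np.sum(Iy * It)])
    dx, dy = np.linalg.solve(A, -b)       # normal equations of the LK cost
    return dx, dy

# Synthetic test pattern: a Gaussian blob shifted by (0.5, 0.3) pixels
yy, xx = np.mgrid[0:64, 0:64].astype(float)
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * 5.0 ** 2))
dx, dy = lucas_kanade_shift(blob(32.0, 32.0), blob(32.5, 32.3))
```

The small-motion assumption baked into the linearization is why the abstract's question about outliers matters: points that violate it (occlusion, specular highlights on wet tissue) produce tracking errors that a robust pose estimator must tolerate.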

  3. A study of students' motivation using the augmented reality science textbook

    NASA Astrophysics Data System (ADS)

    Gopalan, Valarmathie; Zulkifli, Abdul Nasir; Bakar, Juliana Aida Abu

    2016-08-01

Science plays a major role in assisting Malaysia to achieve developed-nation status by 2020. However, over the past few decades, Malaysia has faced a downward trend in the number of students pursuing careers and higher education in science-related fields. Since school is the first platform where students learn science, a new learning approach needs to be introduced to motivate them towards science learning. The aim of this study is to determine whether the intervention of an enhanced science textbook using augmented reality contributes to the learning process of lower secondary school students in science. The study was carried out among a sample of 70 lower secondary school students. Pearson correlation and regression analyses were used to determine the effects of ease of use, engaging, enjoyment and fun on students' motivation to use the augmented reality science textbook for science learning. The results provide empirical support for a positive and statistically significant relationship between engaging, enjoyment and fun and students' motivation for science learning. However, ease of use was not significant, although it was positively correlated with motivation.

  4. Optical augmented reality assisted navigation system for neurosurgery teaching and planning

    NASA Astrophysics Data System (ADS)

    Wu, Hui-Qun; Geng, Xing-Yun; Wang, Li; Zhang, Yuan-Peng; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng

    2013-07-01

This paper proposed a convenient navigation system for neurosurgeons' pre-operative planning and teaching using an augmented reality (AR) technique, which maps three-dimensional reconstructed virtual anatomy structures onto a skull model. This system included two parts, a virtual reality system and a skull model scene. In our experiment, a 73-year-old right-handed man initially diagnosed with astrocytoma was selected as an example to verify our system. His imaging data from different modalities were registered, and the skull soft tissue, brain and inside vessels as well as the tumor were reconstructed. Then the reconstructed models were overlaid on the real scene. Our findings showed that the reconstructed tissues were augmented into the real scene and the registration results were in good alignment. The reconstructed brain tissue was well distributed in the skull cavity. The probe was used by a neurosurgeon to explore the surgical pathway, which could be directed into the tumor without injuring important vessels. In this way, the learning cost for students is reduced and patients are better educated about surgical risks. Therefore, this system could be a selective protocol for image-guided surgery (IGS), and is promising for neurosurgeons' pre-operative planning and teaching.

  5. Augmented-reality-guided biopsy of a tumor near the skull base: the surgeon's experience

    NASA Astrophysics Data System (ADS)

    Eggers, Georg; Sudra, Gunther; Ghanai, Sassan; Salb, Tobias; Dillmann, Ruediger; Marmulla, Ruediger; Hassfeld, Stefan

    2005-04-01

    INPRES, a system for augmented reality, has been developed in the collaborative research center "Information Technology in Medicine - Computer- and Sensor-Aided Surgery". The system is based on see-through glasses. In extensive preclinical testing the system has proven its functionality, and tests with volunteers, based on MRI imaging, were performed successfully. We report the surgeon's view of the first use of the system for AR-guided biopsy of a tumour near the skull base. Preoperative planning was performed based on CT image data. The information to be projected was the tumour volume, segmented from the image data. Using infrared cameras, the positions of patient and surgeon were tracked intraoperatively, and the information on the glasses' displays was updated accordingly. The system proved its functionality under OR conditions in patient care: augmented reality information could be visualized with sufficient accuracy for the surgical task. After intraoperative calibration by the surgeon, the biopsy was acquired successfully. The advantage of see-through glasses is their flexibility: a virtual stereoscopic image can be set up wherever and whenever desired. A biopsy at a delicate location could be performed without the need for wide exposure, which means additional safety and lower operation-related morbidity for the patient. The integration of the calibration procedure of the glasses into the intraoperative workflow is of importance to the surgeon.

  6. Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments

    NASA Astrophysics Data System (ADS)

    Portalés, Cristina; Lerma, José Luis; Navarro, Santiago

    2010-01-01

    Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and to interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to enhance human interaction and real-life navigation. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction, far beyond traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application that integrates buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated into real (physical) urban worlds. The augmented environment presented herein requires a see-through video head-mounted display (HMD) for visualization, whereas the user's movement in the real world is tracked with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. Some remaining software and complexity issues are also discussed.

  7. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in the various visualization and interaction modes.

  8. Thrust Augmentation Measurements for a Pulse Detonation Engine Driven Ejector

    NASA Technical Reports Server (NTRS)

    Pal, S.; Santoro, Robert J.; Shehadeh, R.; Saretto, S.; Lee, S.-Y.

    2005-01-01

    Thrust augmentation results of an ongoing study of pulse detonation engine driven ejectors are presented and discussed. The experiments were conducted using a pulse detonation engine (PDE) setup with various ejector configurations. The PDE used in these experiments utilizes ethylene (C2H4) as the fuel and an equi-molar mixture of oxygen and nitrogen as the oxidizer at an equivalence ratio of one. High-fidelity thrust measurements were made using an integrated spring-damper system. The baseline thrust of the PDE was first measured and agrees with experimental and modeling results found in the literature. Thrust augmentation measurements were then made for constant-diameter ejectors. The parameter space for the study included ejector length, PDE tube exit to ejector tube inlet overlap distance, and straight versus rounded ejector inlets. The relationship between the thrust augmentation results and various physical phenomena is described. To further understand the flow dynamics, shadowgraph images of the shock wave front exiting the PDE were also made. For the studied parameter space, the results showed a maximum augmentation of 40%. Further increase in augmentation is possible if the geometry of the ejector is tailored, a topic currently being studied by numerous groups in the field.
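
    The headline 40% figure is simply a ratio of measured thrusts. A minimal sketch of the computation, using hypothetical thrust values rather than the paper's measurements:

```python
def thrust_augmentation(f_baseline, f_with_ejector):
    """Fractional thrust increase of the PDE-plus-ejector over the bare PDE."""
    return (f_with_ejector - f_baseline) / f_baseline

# Hypothetical time-averaged thrust values in newtons.
print(f"{thrust_augmentation(100.0, 140.0):.0%}")  # 40%
```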

  9. An augmented reality system in lymphatico-venous anastomosis surgery†

    PubMed Central

    Nishimoto, Soh; Tonooka, Maki; Fujita, Kazutoshi; Sotsuka, Yohei; Fujiwara, Toshihiro; Kawai, Kenichiro; Kakibuchi, Masao

    2016-01-01

    Indocyanine green lymphography, displayed as an infrared image, is very useful for identifying lymphatic vessels during surgery. Surgeons refer to the infrared image on displays as they proceed with the operation. Those displays are usually placed on walls or beside the operating table, so surgeons cannot watch the infrared image and the operative field simultaneously; they have to shift their heads and lines of sight. An augmented reality system was developed to allow simultaneous reference to the infrared image, overlaid on the real operative field view. A surgeon wore a see-through eyeglass-type display during lymphatico-venous anastomosis surgery. The infrared image was transferred wirelessly to the display. The surgeon was able to recognize the fluorescently shining lymphatic vessels projected on the glasses and dissect them out. PMID:27154749

  10. Studienlandschaft Schwingbachtal: an out-door full-scale learning tool newly equipped with augmented reality

    NASA Astrophysics Data System (ADS)

    Aubert, A. H.; Schnepel, O.; Kraft, P.; Houska, T.; Plesca, I.; Orlowski, N.; Breuer, L.

    2015-11-01

    This paper addresses education and communication in hydrology and the geosciences. Many approaches can be used, such as the well-known seminars, modelling exercises and practical field work, but outdoor learning in our discipline is a must, and this paper focuses on the recent development of a new outdoor learning tool at the landscape scale. To facilitate improved teaching and hands-on experience, we designed the Studienlandschaft Schwingbachtal. Already equipped with field instrumentation, education trails, and geocaches, we have now implemented an augmented reality app that overlays virtual teaching objects onto the real landscape. The app development is detailed, to serve as a methodology for those wishing to implement such a tool. The resulting application, the Schwingbachtal App, is described as an example. We conclude that such an app is useful for communication and education purposes, making learning pleasant and offering personalized options.

  11. New augmented reality and robotic based methods for head-surgery.

    PubMed

    Wörn, H; Aschke, M; Kahrs, L A

    2005-09-01

    Within the framework of the collaborative research centre "Information Technology in Medicine--Computer and Sensor-Aided Surgery" (SFB414), new methods for intraoperative computer assistance of surgical procedures are being developed. The developed tools will be controlled by an intraoperative host which provides interfaces to the electronic health record (EHR) and to intraoperative computer-assisted instruments. The interaction is based on standardised communication protocols. Plug & work functions will allow easy integration and configuration of new components. Intraoperative systems currently under development are intraoperative augmented reality (AR) via a projector and via a microscope, a planning system for the definition of complex trajectories, and a surgical robot system. The developed systems are under clinical evaluation and show promising results in their application. PMID:17518390

  12. Inexpensive Monocular Pico-Projector-based Augmented Reality Display for Surgical Microscope

    PubMed Central

    Shi, Chen; Becker, Brian C.; Riviere, Cameron N.

    2013-01-01

    This paper describes an inexpensive pico-projector-based augmented reality (AR) display for a surgical microscope. The system is designed for use with Micron, an active handheld surgical tool that cancels the hand tremor of surgeons to improve microsurgical accuracy. Using the AR display, virtual cues can be injected into the microscope view to track the movement of the tip of Micron, show the desired position, and indicate the position error. Cues can be used to maintain high performance by helping the surgeon avoid drifting out of the workspace of the instrument. Boundary information, such as the view range of the cameras that record surgical procedures, can also be displayed to indicate the operating area to the surgeon. Furthermore, numerical, textual, or graphical information can be displayed, showing such things as tool-tip depth in the workspace and the on/off status of Micron's tremor-canceling function. PMID:25264542

  13. Electrically adjustable location of a projected image in augmented reality via a liquid-crystal lens.

    PubMed

    Chen, Hung-Shan; Wang, Yu-Jen; Chen, Po-Ju; Lin, Yi-Hsin

    2015-11-01

    An augmented reality (AR) system involving the electrically tunable location of a projected image is implemented using a liquid-crystal (LC) lens. The projected image is either real or virtual. By effectively doubling the LC lens power following light reflection, the position of a projected virtual image can be made to vary from 42 to 360 cm, while the tunable range for a projected real image is from 27 to 52 cm on the opposite side. The optical principle of the AR system is introduced and could be further developed for other tunable focusing lenses, even those with a lower lens power. The benefits of this study could be extended to head-mounted display systems for vision correction or vision compensation. We believe that tunable focusing LC optical elements are promising developments in the thriving field of AR applications. PMID:26561086
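
    The tunable image position follows from the thin-lens relation. The sketch below assumes the power form 1/v = 1/u + P and a purely illustrative LC lens power; the actual optical layout in the paper is more involved:

```python
def image_distance(u_m, power_diopters):
    """Thin lens in power form: 1/v = 1/u + P (metres, diopters).
    u < 0 for an object in front of the lens; v < 0 marks a virtual image."""
    return 1.0 / (1.0 / u_m + power_diopters)

p_lc = 0.5              # illustrative LC lens power in diopters
p_eff = 2.0 * p_lc      # light reflected back through the lens sees double power
v = image_distance(-0.30, p_eff)
print(f"virtual image at {abs(v) * 100:.1f} cm")
```

    Sweeping `p_lc` electrically then sweeps the image location, which is the tuning mechanism the abstract describes.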

  14. Development of a haptic interface for motor rehabilitation therapy using augmented reality.

    PubMed

    Vidrios-Serrano, Carlos; Bonilla, Isela; Vigueras-Gomez, Flavio; Mendoza, Marco

    2015-08-01

    In this paper, a robot-assisted therapy system is presented, focused mainly on improving fine movements in patients with motor deficits of the upper limbs. The system combines a haptic device with an augmented reality environment in which a set of occupational therapy exercises is implemented. The main goal of the system is to provide extra motivation to patients, who are stimulated visually and tactilely in a scene that mixes elements of the real and virtual worlds. Additionally, using the norm of the tracking error, it is possible to measure the patient's performance quantitatively during a therapy session, as well as to obtain information such as run time and the path followed. PMID:26736471
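
    A performance metric based on the norm of the tracking error, as mentioned above, could be sketched as follows; the path samples are hypothetical, not the paper's data:

```python
import math

def error_norm(desired, actual):
    """Euclidean norm of the instantaneous position error."""
    return math.sqrt(sum((d - a) ** 2 for d, a in zip(desired, actual)))

def session_score(desired_path, actual_path):
    """Mean tracking error over a session -- lower means better performance."""
    errors = [error_norm(d, a) for d, a in zip(desired_path, actual_path)]
    return sum(errors) / len(errors)

# Hypothetical 3D samples (metres) of a desired path and the patient's motion.
desired = [(0.00, 0.00, 0.0), (0.10, 0.00, 0.0), (0.20, 0.0, 0.00)]
actual  = [(0.00, 0.01, 0.0), (0.10, 0.02, 0.0), (0.20, 0.0, 0.01)]
print(f"mean error: {session_score(desired, actual) * 1000:.1f} mm")
```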

  15. Augmented reality & gesture-based architecture in games for the elderly.

    PubMed

    McCallum, Simon; Boletsis, Costas

    2013-01-01

    Serious games for health and, more specifically, for elderly people have developed rapidly in recent years. The recent popularization of novel interaction methods on consoles such as the Nintendo Wii and Microsoft Kinect has provided an opportunity for the elderly to engage in computer and video games. These interaction methods, however, still present various challenges for elderly users. To address these challenges, we propose an architecture consisting of Augmented Reality (as an output mechanism) combined with gesture-based devices (as an input method). The intention of this work is to provide a theoretical justification for using these technologies and to integrate them into an architecture that acts as a basis for creating suitable interaction techniques for elderly players. PMID:23739373

  16. An augmented reality system in lymphatico-venous anastomosis surgery.

    PubMed

    Nishimoto, Soh; Tonooka, Maki; Fujita, Kazutoshi; Sotsuka, Yohei; Fujiwara, Toshihiro; Kawai, Kenichiro; Kakibuchi, Masao

    2016-01-01

    Indocyanine green lymphography, displayed as an infrared image, is very useful for identifying lymphatic vessels during surgery. Surgeons refer to the infrared image on displays as they proceed with the operation. Those displays are usually placed on walls or beside the operating table, so surgeons cannot watch the infrared image and the operative field simultaneously; they have to shift their heads and lines of sight. An augmented reality system was developed to allow simultaneous reference to the infrared image, overlaid on the real operative field view. A surgeon wore a see-through eyeglass-type display during lymphatico-venous anastomosis surgery. The infrared image was transferred wirelessly to the display. The surgeon was able to recognize the fluorescently shining lymphatic vessels projected on the glasses and dissect them out. PMID:27154749

  17. An infrastructure for realizing custom-tailored augmented reality user interfaces.

    PubMed

    Broll, Wolfgang; Lindt, Irma; Ohlenburg, Jan; Herbst, Iris; Wittkämper, Michael; Novotny, Thomas

    2005-01-01

    Augmented Reality (AR) technologies are rapidly expanding into new application areas. However, the development of AR user interfaces and appropriate interaction techniques remains a complex and time-consuming task. Starting from scratch is more common than building upon existing solutions. Furthermore, adaptation is difficult, often resulting in poor quality and limited flexibility with regard to user requirements. In order to overcome these problems, we introduce an infrastructure for supporting the development of specific AR interaction techniques and their adaptation to individual user needs. Our approach is threefold: a flexible AR framework providing independence from particular input devices and rendering platforms, an interaction prototyping mechanism allowing for fast prototyping of new interaction techniques, and a high-level user interface description, extending user interface descriptions into the domain of AR. The general usability and applicability of the approach is demonstrated by means of three example AR projects. PMID:16270864

  18. Jedi training: playful evaluation of head-mounted augmented reality display systems

    NASA Astrophysics Data System (ADS)

    Ozbek, Christopher S.; Giesler, Bjorn; Dillmann, Ruediger

    2004-05-01

    A fundamental decision in building augmented reality (AR) systems is how to combine the real and virtual worlds. Nowadays this key question boils down to two alternatives: video see-through (VST) vs. optical see-through (OST). Both systems have advantages and disadvantages in areas such as production simplicity, resolution, flexibility in composition strategies, and field of view. To provide additional decision criteria for high-dexterity, high-accuracy tasks and for subjective user acceptance, a gaming environment inspired by the Star Wars movies was programmed that allowed good evaluation of hand-eye coordination. During an experimentation session with more than thirty participants, a preference for optical see-through glasses in conjunction with infrared tracking was found. In particular, the high computational demand of video capture and processing, and the resulting drop in frame rate, emerged as a key weakness of the VST system.

  19. A complete augmented reality guidance system for liver punctures: first clinical evaluation.

    PubMed

    Nicolau, S A; Pennec, X; Soler, L; Ayache, N

    2005-01-01

    We previously presented an augmented reality guidance system for liver punctures, validated on a static abdominal phantom. In this paper, we report the first in vivo experiments. We developed a strictly passive protocol to evaluate our system directly on patients. We show that the system algorithms work efficiently, and we highlight the clinical constraints that we had to overcome (small operative field, weight and sterility of the tracked marker attached to the needle...). Finally, we investigate to what extent breathing motion can be neglected for a free-breathing patient. Results show that when breathing motion is neglected, the guidance accuracy, close to 1 cm, is sufficient only for large targets (above 3 cm in diameter). In the near future, we aim to validate our system on smaller targets using a respiratory gating technique. PMID:16685888
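
    The target-size conclusion amounts to a simple geometric check: aiming at the centre of a spherical target, a worst-case guidance error e still lands the needle tip inside only if e does not exceed the target radius. A deliberately conservative sketch using the 1 cm / 3 cm numbers from the abstract:

```python
def target_reachable(target_diameter_cm, worst_case_error_cm):
    """Conservative hit criterion: worst-case tip error <= target radius."""
    return worst_case_error_cm <= target_diameter_cm / 2.0

print(target_reachable(3.0, 1.0))   # True:  3 cm target, ~1 cm accuracy
print(target_reachable(1.5, 1.0))   # False: smaller targets need gating
```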

  20. Design and Implementation of a GPS Guidance System for Agricultural Tractors Using Augmented Reality Technology

    PubMed Central

    Santana-Fernández, Javier; Gómez-Gil, Jaime; del-Pozo-San-Cirilo, Laura

    2010-01-01

    Current commercial tractor guidance systems present the driver with information for performing agricultural tasks in the best way. This information generally includes a map of treated zones referenced to the tractor's position. Unlike existing guidance systems, where the tractor driver must mentally associate treated-zone maps with the plot layout, this paper presents a guidance system that, using Augmented Reality (AR) technology, allows the tractor driver to see the real plot through eye-monitor glasses with the treated zones shown in a different color. The paper includes a description of the system hardware and software, a real test with image captures of what the tractor driver sees, and a discussion predicting that the historical evolution of guidance systems could involve the use of AR technology in agricultural guidance and monitoring systems. PMID:22163479

  1. Augmented Reality to Preserve Hidden Vestiges in Historical Cities. a Case Study

    NASA Astrophysics Data System (ADS)

    Martínez, J. L.; Álvareza, S.; Finat, J.; Delgado, F. J.; Finat, J.

    2015-02-01

    Mobile devices provide increasingly sophisticated support for enhanced experiences and for understanding the remote past in an interactive way. Augmented reality technologies make it possible to develop mobile applications for indoor exploration of virtually reconstructed archaeological places. In our work we have built a virtual reconstruction of a Roman villa from data arising from an urgent partial excavation, performed before the construction of a car park in the historical city of Valladolid (Spain). In its current state, the archaeological site is covered by an urban garden. Localization and tracking are performed using a combination of GPS and the inertial sensors of the mobile device. In this work we show how to perform an interactive navigation around the 3D virtual model, presenting an interpretation of how the site once appeared. The user experience is enhanced by answering simple questions and performing minor tasks and puzzles, which are presented with multimedia contents linked to key features of the archaeological site.

  2. A Protein in the palm of your hand through augmented reality.

    PubMed

    Berry, Colin; Board, Jason

    2014-01-01

    Understanding of proteins and other biological macromolecules must be based on an appreciation of their 3-dimensional shape and the fine details of their structure. Conveying these details in a clear and stimulating fashion can present challenges using conventional approaches and 2-dimensional monitors and projectors. Here we describe a method for the production of 3-D interactive images of protein structures that can be manipulated in real time through the use of augmented reality software. Users first see a real-time image of themselves using the computer's camera, then, when they hold up a trigger image, a model of a molecule appears automatically in the video. This model rotates and translates in space in response to movements of the trigger card. The system described has been optimized to allow customization for the display of user-selected structures to create engaging, educational visualizations to explore 3-D structures. PMID:24979189
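
    Driving a model from a tracked trigger card boils down to applying the card's estimated pose, a rotation plus a translation, to every model vertex each frame. A minimal sketch; the pose values are made up, and a real system would estimate them from the camera image:

```python
import math

def pose_matrix(yaw_rad, t):
    """4x4 homogeneous transform: rotation about the vertical axis plus
    translation t, standing in for a tracked marker pose."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c, 0.0, s, t[0]],
            [0.0, 1.0, 0.0, t[1]],
            [-s, 0.0, c, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def transform(T, vertex):
    """Map a model-space vertex into camera space."""
    x, y, z = vertex
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3]
                 for i in range(3))

# Turning the card 90 degrees swings a vertex on the model's x-axis around.
T = pose_matrix(math.pi / 2, (0.0, 0.0, 0.5))
print(transform(T, (0.1, 0.0, 0.0)))
```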

  3. Infrared marker-based tracking in an indoor unknown environment for augmented reality applications

    NASA Astrophysics Data System (ADS)

    Huang, Yetao; Weng, Dongdong; Liu, Yue; Wang, Yongtian

    2009-11-01

    Marker-based tracking requires complicated preparation work that impedes its use in augmented reality applications. This paper presents a novel tracking scheme for use in an indoor unknown scene by adapting simultaneous localization and mapping (SLAM) algorithms. An infrared (IR) marker system is specifically designed to simplify feature recognition and tracking in the SLAM process. Starting from one initial IR marker, the other markers can be projected randomly onto a large-area environment. The pose of the camera can be estimated with a monocular IR camera in real time. Experimental results demonstrate that the proposed system meets the accuracy requirements for large-area tracking. A prototype system is built to show its feasibility in unknown environments and its potential use in applications.

  4. Ultrasound guided robotic biopsy using augmented reality and human-robot cooperative control.

    PubMed

    Freschi, C; Troia, E; Ferrari, V; Megali, G; Pietrabissa, A; Mosca, F

    2009-01-01

    Ultrasound-guided biopsy is a proficient minimally invasive approach for tumor staging, but it requires very long training and particular manual and 3D spatial-perception abilities of the physician for planning the needle trajectory and executing the procedure. In order to simplify this difficult task, we have developed an integrated system that provides the clinician with two types of assistance: an augmented reality visualization allows accurate and easy planning of the needle trajectory and verification of target reaching, while a robot arm with a six-degree-of-freedom force sensor allows precise positioning of the needle holder and lets the clinician adjust the planned trajectory (cooperative control) to compensate for needle deflection and target motion. Preliminary tests executed on an ultrasound phantom show high precision of the system in static conditions and demonstrate the utility and usability of the cooperative control in simulated non-rigid conditions. PMID:19963882

  5. Application of augmented reality to the industrial systems for signalisation of emergency situations

    NASA Astrophysics Data System (ADS)

    Holejko, K.; Nowak, R.; Czarnecki, T.; Dzwiarek, M.

    2006-03-01

    One of the important measures to prevent undesired events is informing a machine operator about the appearance of a hazardous situation quickly and effectively enough. In order to investigate the possibilities of applying Augmented Reality systems to drawing the operator's attention to hazards appearing at the workstation, two special test stands have been developed. For that purpose, special glasses for generating warning signals were designed on the basis of commonly available safety glasses. Red light-emitting diodes were employed to generate the virtual images, which were shaped by means of diaphragms and supplied with additional inscriptions. Initial perception tests of the warning signals generated using the AR technique proved that these signals can successfully serve to warn the machine or device operator of an impending hazardous event.

  6. Thrust augmentation nozzle (TAN) concept for rocket engine booster applications

    NASA Astrophysics Data System (ADS)

    Forde, Scott; Bulman, Mel; Neill, Todd

    2006-07-01

    Aerojet used the patented thrust augmented nozzle (TAN) concept to validate a unique means of increasing sea-level thrust in a liquid rocket booster engine. We have used knowledge gained from hypersonic Scramjet research to inject propellants into the supersonic region of the rocket engine nozzle to significantly increase sea-level thrust without significantly impacting specific impulse. The TAN concept overcomes conventional engine limitations by injecting propellants and combusting in an annular region in the divergent section of the nozzle. This injection of propellants at moderate pressures allows for obtaining high thrust at takeoff without overexpansion thrust losses. The main chamber is operated at a constant pressure while maintaining a constant head rise and flow rate of the main propellant pumps. Recent hot-fire tests have validated the design approach and thrust augmentation ratios. Calculations of nozzle performance and wall pressures were made using computational fluid dynamics analyses with and without thrust augmentation flow, resulting in good agreement between calculated and measured quantities including augmentation thrust. This paper describes the TAN concept, the test setup, test results, and calculation results.

  7. New education system for construction of optical holography setup - Tangible learning with Augmented Reality

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Takeshi; Yoshikawa, Hiroshi

    2013-02-01

    When teaching optical system construction, it is difficult to provide optical components for every attending student, yet tangible learning is very important for mastering optical system construction. An inexpensive learning system that provides the experience of optical experiments helps learners understand easily. We therefore propose a new education system for the construction of optical setups using augmented reality. Using augmented reality, the proposed system can simulate optical system construction under direct hand control. The system only requires an inexpensive web camera, printed markers and a personal computer. Since it does not require a darkroom or expensive optical equipment, learners can study anytime and anywhere they want. In this paper, we developed a system that can teach the optical system construction of the Denisyuk hologram and the 2-step transmission-type hologram. For tangible learning and easy understanding, the proposed system displays CG objects of the optical components on markers which are controlled by the learner's hands. The system displays not only the CG objects but also the light beam as modified by the optical components; by visualizing a beam that is hard to see directly, learners can confirm what is happening through their own manipulation. For the construction of the optical holography setup, we arrange a laser, mirrors, a PBS (polarizing beam splitter), lenses, a polarizer, half-wave plates, spatial filters, an optical power meter and a recording plate. After the construction, the proposed system can check whether the optical setup is correct. Compared with learners who only read a book, learners who use the system can construct the optical holography setup more quickly and correctly.

  8. Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load.

    PubMed

    Küçük, Sevda; Kapakin, Samet; Göktaş, Yüksel

    2016-10-01

    Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the environment using mobile devices. The purpose of this study was to determine the effects of learning anatomy via mAR on medical students' academic achievement and cognitive load. A mixed method was applied in the study. The random sample consisted of 70 second-year undergraduate medical students: 34 in an experimental group and 36 in a control group. An academic achievement test and a cognitive load scale were used as data collection tools. A one-way MANOVA test was used for the analysis. The experimental group, which used mAR applications, reported higher achievement and lower cognitive load. The use of mAR applications in anatomy education contributed to the formation of an effective and productive learning environment. Student cognitive load decreased as abstract information became concrete in printed books via multimedia materials in mAR applications. Additionally, students were able to access the materials in the MagicBook anytime and anywhere they wanted. The mobile learning approach helped students learn better while exerting less cognitive effort. Moreover, the sensory experience and real-time interaction with the environment may provide learning satisfaction and enable students to structure their knowledge to complete the learning tasks. Anat Sci Educ 9: 411-421. © 2016 American Association of Anatomists. PMID:26950521
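
    The study used a one-way MANOVA. As a simpler hand-rolled illustration of comparing two groups on a single outcome, the sketch below computes a pooled-standard-deviation effect size (Cohen's d) on made-up achievement scores, not the study's data:

```python
import math

def cohens_d(group_a, group_b):
    """Pooled-standard-deviation effect size between two independent groups."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical achievement scores: mAR (experimental) vs. control group.
mar_group = [78, 85, 90, 72, 88, 81]
control   = [70, 75, 68, 74, 80, 71]
print(round(cohens_d(mar_group, control), 2))
```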

  9. Exploring the Impact of Varying Levels of Augmented Reality to Teach Probability and Sampling with a Mobile Device

    ERIC Educational Resources Information Center

    Conley, Quincy

    2013-01-01

    Statistics is taught at every level of education, yet teachers often have to assume their students have no knowledge of statistics and start from scratch each time they set out to teach statistics. The motivation for this experimental study comes from interest in exploring educational applications of augmented reality (AR) delivered via mobile…

  10. Using Tele-Coaching to Increase Behavior-Specific Praise Delivered by Secondary Teachers in an Augmented Reality Learning Environment

    ERIC Educational Resources Information Center

    Elford, Martha Denton

    2013-01-01

    This study analyzes the effects of real-time feedback on teacher behavior in an augmented reality simulation environment. Real-time feedback prompts teachers to deliver behavior-specific praise to students in the TeachLivE KU Lab as an evidence-based practice known to decrease disruptive behavior in inclusive classrooms. All educators face the…

  11. An Investigation of University Students' Collaborative Inquiry Learning Behaviors in an Augmented Reality Simulation and a Traditional Simulation

    ERIC Educational Resources Information Center

    Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung

    2014-01-01

    The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty…

  12. Employing Augmented-Reality-Embedded Instruction to Disperse the Imparities of Individual Differences in Earth Science Learning

    ERIC Educational Resources Information Center

    Chen, Cheng-ping; Wang, Chang-Hwa

    2015-01-01

    Studies have proven that merging hands-on and online learning can result in an enhanced experience in learning science. In contrast to traditional online learning, multiple in-classroom activities may be involved in an augmented-reality (AR)-embedded e-learning process and thus could reduce the effects of individual differences. Using a…

  13. The Use of Augmented Reality-Enhanced Reading Books for Vocabulary Acquisition with Students Who Are Diagnosed with Special Needs

    ERIC Educational Resources Information Center

    Fecich, Samantha J.

    2014-01-01

    During this collective case study, I explored the use of augmented reality books on an iPad 2 with students diagnosed with disabilities. Students in this study attended a high school life skills class in a rural school district during the fall 2013 semester. Four students participated in this study, two males and two females. Specifically, the…

  14. Weather Observers: A Manipulative Augmented Reality System for Weather Simulations at Home, in the Classroom, and at a Museum

    ERIC Educational Resources Information Center

    Hsiao, Hsien-Sheng; Chang, Cheng-Sian; Lin, Chien-Yu; Wang, Yau-Zng

    2016-01-01

    This study focused on how to enhance the interactivity and usefulness of augmented reality (AR) by integrating manipulative interactive tools with a real-world environment. A manipulative AR (MAR) system, which included 3D interactive models and manipulative aids, was designed and developed to teach the unit "Understanding Weather" in a…

  15. Delivering Educational Multimedia Contents through an Augmented Reality Application: A Case Study on Its Impact on Knowledge Acquisition and Retention

    ERIC Educational Resources Information Center

    Perez-Lopez, David; Contero, Manuel

    2013-01-01

    This paper presents a study to analyze the use of augmented reality (AR) for delivering multimedia content to support the teaching and learning process of the digestive and circulatory systems at the primary school level, and its impact on knowledge retention. Our AR application combines oral explanations and 3D models and animations of anatomical…

  16. Applying Augmented Reality to a Mobile-Assisted Learning System for Martial Arts Using Kinect Motion Capture

    ERIC Educational Resources Information Center

    Hsu, Wen-Chun; Shih, Ju-Ling

    2016-01-01

    In this study, the routine of Tantui, a branch of martial arts, was taken as the object of research. Fitts' stages of motor learning and augmented reality (AR) were applied to a 3D mobile-assisted learning system for martial arts, which was characterized by free viewing angles. With the new system, learners could rotate the viewing angle of…

  17. Augmented Reality as a Navigation Tool to Employment Opportunities for Postsecondary Education Students with Intellectual Disabilities and Autism

    ERIC Educational Resources Information Center

    McMahon, Don; Cihak, David F.; Wright, Rachel

    2015-01-01

    The purpose of this study was to examine the effects of location-based augmented reality navigation compared to Google Maps and paper maps as navigation aids for students with disabilities. The participants in this single subject study were three college students with intellectual disability and one college student with autism spectrum disorder.…

  18. Development of an IOS App Using Situated Learning, Communities of Practice, and Augmented Reality for Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Clarkson, Jessica

    2014-01-01

    This paper presents the development process and framework used to construct a transportation app that uses situated learning, augmented reality, and communities of practice. Autism spectrum disorder (ASD) is a neurodevelopmental disorder that can cause social impairments as well as limit the potential for the individual to achieve independence…

  19. A Mixed Methods Assessment of Students' Flow Experiences during a Mobile Augmented Reality Science Game

    ERIC Educational Resources Information Center

    Bressler, D. M.; Bodzin, A. M.

    2013-01-01

    Current studies have reported that secondary students are highly engaged while playing mobile augmented reality (AR) learning games. Some researchers have posited that players' engagement may indicate a flow experience, but no research results have confirmed this hypothesis with vision-based AR learning games. This study investigated factors…

  20. Mad City Mystery: Developing Scientific Argumentation Skills with a Place-Based Augmented Reality Game on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt D.; Jan, Mingfong

    2007-01-01

    While the knowledge economy has reshaped the world, schools lag behind in producing appropriate learning for this social change. Science education needs to prepare students for a future world in which multiple representations are the norm and adults are required to "think like scientists." Location-based augmented reality games offer an…

  1. Testing and evaluation of a wearable augmented reality system for natural outdoor environments

    NASA Astrophysics Data System (ADS)

    Roberts, David; Menozzi, Alberico; Cook, James; Sherrill, Todd; Snarski, Stephen; Russler, Pat; Clipp, Brian; Karl, Robert; Wenger, Eric; Bennett, Matthew; Mauger, Jennifer; Church, William; Towles, Herman; MacCabe, Stephen; Webb, Jeffrey; Lupo, Jasper; Frahm, Jan-Michael; Dunn, Enrique; Leslie, Christopher; Welch, Greg

    2013-05-01

    This paper describes performance evaluation of a wearable augmented reality system for natural outdoor environments. Applied Research Associates (ARA), as prime integrator on the DARPA ULTRA-Vis (Urban Leader Tactical, Response, Awareness, and Visualization) program, is developing a soldier-worn system to provide intuitive `heads-up' visualization of tactically-relevant geo-registered icons. Our system combines a novel pose estimation capability, a helmet-mounted see-through display, and a wearable processing unit to accurately overlay geo-registered iconography (e.g., navigation waypoints, sensor points of interest, blue forces, aircraft) on the soldier's view of reality. We achieve accurate pose estimation through fusion of inertial, magnetic, GPS, terrain data, and computer-vision inputs. We leverage a helmet-mounted camera and custom computer vision algorithms to provide terrain-based measurements of absolute orientation (i.e., orientation of the helmet with respect to the earth). These orientation measurements, which leverage mountainous terrain horizon geometry and mission planning landmarks, enable our system to operate robustly in the presence of external and body-worn magnetic disturbances. Current field testing activities across a variety of mountainous environments indicate that we can achieve high icon geo-registration accuracy (<10mrad) using these vision-based methods.
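
The fusion of drift-prone inertial/magnetic heading with absolute vision-based orientation described above can be sketched as a complementary filter. The following single-axis version is an illustration only; the function name and the blend factor `alpha` are assumptions, not part of the ULTRA-Vis system:

```python
import math

def fuse_heading(gyro_heading, vision_heading, alpha=0.98):
    """Blend a drift-prone integrated gyro heading with an absolute
    vision-based heading measurement (complementary filter).
    Angles are in radians; the result is wrapped to [-pi, pi)."""
    # Work on the smallest angular difference to avoid wrap-around bias.
    diff = math.atan2(math.sin(vision_heading - gyro_heading),
                      math.cos(vision_heading - gyro_heading))
    # Trust the gyro short-term (weight alpha), correct slowly toward vision.
    fused = gyro_heading + (1.0 - alpha) * diff
    return math.atan2(math.sin(fused), math.cos(fused))
```

A high `alpha` keeps the short-term smoothness of the inertial estimate while the vision measurement removes long-term drift, which is the role the horizon-based orientation measurements play in the system described above.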

  2. Working alliance inventory applied to virtual and augmented reality (WAI-VAR): psychometrics and therapeutic outcomes.

    PubMed

    Miragall, Marta; Baños, Rosa M; Cebolla, Ausiàs; Botella, Cristina

    2015-01-01

    This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adaptation to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, M age = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman's correlations and multiple regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency and adequate convergent validity. "Not changed" patients scored lower on the WAI-VAR than "improved" and "recovered" patients. Correlation between the WAI-VAR and CSC was moderate. The best fitting model for predicting CSC was a linear combination of the TA with the therapist (WAI-S) and the TA with VR and AR (WAI-VAR), because the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in the therapy. PMID:26500589
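
The best-fitting model described above, a linear combination of two alliance scores predicting CSC, amounts to an ordinary least-squares regression. The sketch below shows the idea with hypothetical variable names; it is not the authors' data or code:

```python
import numpy as np

def fit_csc_model(wai_s, wai_var, csc):
    """Least-squares fit of CSC ~ b0 + b1*WAI-S + b2*WAI-VAR.
    Returns the coefficient vector (intercept, b1, b2)."""
    X = np.column_stack([np.ones(len(wai_s)), wai_s, wai_var])
    coef, *_ = np.linalg.lstsq(X, np.asarray(csc, dtype=float), rcond=None)
    return coef
```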

  4. A Real-Time Augmented Reality System for Industrial Tele-Training

    NASA Astrophysics Data System (ADS)

    Boulanger, Pierre; Georganas, Nicolas D.; Zhong, Xiaowei; Liu, Peiran

    2003-01-01

    Augmented Reality (AR) is a departure from standard virtual reality in the sense that it allows users to see computer-generated virtual objects superimposed over the real world through the use of a see-through head-mounted display. Users of such a system can interact in the real/virtual world using additional information, such as 3D virtual models and instructions on how to perform these tasks in the form of video clips, annotations, speech instructions, and images. In this paper, we describe two prototypes of a collaborative industrial tele-training system. The distributed aspect of this system enables users on remote sites to collaborate on training tasks by sharing the view of the local user equipped with a wearable computer. The users can interactively manipulate virtual objects that substitute for real objects, allowing the trainee to try out and discuss the various tasks that need to be performed. A new technique for identifying real-world objects and estimating their coordinates in 3D space is introduced. The method is based on a computer vision technique capable of identifying and locating the Binary Square Markers that identify each information station. Experimental results are presented.
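
A binary square marker of the kind mentioned above encodes an identifier in the grid of black and white cells inside its border. A minimal decoding sketch follows; the row-major, most-significant-bit-first scheme is an assumption for illustration, not the paper's exact encoding:

```python
def decode_marker(cells):
    """Read a binary square marker: `cells` is an NxN grid of 0/1 values
    (black border already stripped and perspective already rectified).
    Returns the integer ID encoded row-major, MSB first."""
    marker_id = 0
    for row in cells:
        for bit in row:
            marker_id = (marker_id << 1) | (1 if bit else 0)
    return marker_id
```

In a full pipeline, the detected marker corners would additionally feed a pose-estimation step to recover the station's 3D coordinates.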

  5. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    PubMed

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate for the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications. PMID:26394430
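
Position calibration of this general kind, mapping device-reported stylus positions onto externally tracked reference positions, can be approximated by an affine least-squares fit. This sketch is an illustration of that idea under simplifying assumptions, not the authors' procedure (which also handles gimbal errors and temporal alignment):

```python
import numpy as np

def fit_position_calibration(reported, reference):
    """Fit an affine correction x' = A @ x + t mapping device-reported
    positions onto tracked reference positions, via least squares."""
    reported = np.asarray(reported, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Homogeneous design matrix: [x y z 1] per sample point.
    X = np.hstack([reported, np.ones((len(reported), 1))])
    M, *_ = np.linalg.lstsq(X, reference, rcond=None)  # shape (4, 3)
    A, t = M[:3].T, M[3]
    return A, t

def apply_calibration(A, t, p):
    """Apply the fitted correction to a single position."""
    return A @ np.asarray(p, dtype=float) + t
```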

  6. Way-Finding Assistance System for Underground Facilities Using Augmented Reality

    NASA Astrophysics Data System (ADS)

    Yokoi, K.; Yabuki, N.; Fukuda, T.; Michikawa, T.; Motamedi, A.

    2015-05-01

    Way-finding is one of the main challenges for pedestrians in large subterranean spaces with a complex network of connected labyrinths. This problem is caused by the loss of their sense of direction and orientation, since familiar landmarks are occluded by ceilings, walls, and skyscrapers. This paper introduces an assistance system for the way-finding problem in large subterranean spaces using Augmented Reality (AR). It suggests displaying known landmarks that are invisible in indoor environments on tablet/handheld devices to assist users with relative positioning and indoor way-finding. The location and orientation of the users can be estimated by indoor positioning systems and the sensors available in common tablet or smartphone devices. The constructed 3D model of a chosen landmark that is in the field of view of the handheld's camera is augmented on the camera's video feed. A prototype system has been implemented to demonstrate the effectiveness of the proposed system for way-finding.
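
Placing an occluded landmark on screen reduces to comparing the bearing from the user to the landmark with the device heading. A minimal planar sketch follows; the function names and the linear screen mapping are assumptions for illustration:

```python
import math

def landmark_bearing(user_xy, landmark_xy):
    """Compass-style bearing (radians, 0 = +y 'north', clockwise positive)
    from the user to a landmark in a local planar coordinate frame."""
    dx = landmark_xy[0] - user_xy[0]
    dy = landmark_xy[1] - user_xy[1]
    return math.atan2(dx, dy)

def screen_x(bearing, heading, fov, width):
    """Horizontal pixel position of the landmark icon for a camera with
    the given heading and horizontal field of view; None if off-screen."""
    rel = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
    if abs(rel) > fov / 2:
        return None
    return width / 2 + (rel / (fov / 2)) * (width / 2)
```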

  7. Feasibility of Virtual Reality Augmented Cycling for Health Promotion of People Post-Stroke

    PubMed Central

    Deutsch, Judith E; Myslinski, Mary Jane; Kafri, Michal; Ranky, Richard; Sivak, Mark; Mavroidis, Constantinos; Lewis, Jeffrey A

    2013-01-01

    Background and Purpose A virtual reality (VR) augmented cycling kit (VRACK) was developed to address motor control and fitness deficits of individuals with chronic stroke. In this paper we report on the safety, feasibility and efficacy of using the VRACK to train cardio-respiratory (CR) fitness of individuals in the chronic phase poststroke. Methods Four individuals with chronic stroke (47–65 years old and three or more years post-stroke), with residual lower extremity impairments (Fugl Meyer 24–26/34) who were limited community ambulators (gait speed range 0.56 to 1.1 m/s) participated in this study. Safety was defined as the absence of adverse events. Feasibility was measured using attendance, total exercise time, and “involvement” measured with the Presence Questionnaire (PQ). Efficacy of CR fitness was evaluated using a sub-maximal bicycle ergometer test before and after an 8-week training program. Results The intervention was safe and feasible with participants having 1 adverse event, 100% adherence, achieving between 90 and 125 minutes of cycling each week and a mean PQ score of 39 (SD 3.3). There was a statistically significant 13% (p = 0.035) improvement in peak VO2 with a range of 6–24.5 %. Discussion and Conclusion For these individuals post-stroke, VR augmented cycling, using their heart rate to set their avatar’s speed, fostered training of sufficient duration and intensity to promote CR fitness. In addition, there was a transfer of training from the bicycle to walking endurance. VR augmented cycling may be an addition to the therapist’s tools for concurrent training of mobility and health promotion of individuals post-stroke. Video Abstract available (see Video, Supplemental Digital Content 1) for more insights from the authors. PMID:23863828
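
Driving the avatar's speed from heart rate, as in the protocol summarized above, can be sketched as a mapping from the fraction of heart-rate reserve to a speed range. The thresholds and speed limits below are illustrative assumptions, not the study's parameters:

```python
def avatar_speed(hr, hr_rest, hr_max, v_min=1.0, v_max=8.0):
    """Map the current heart rate to an avatar speed (m/s) via the
    fraction of heart-rate reserve, clamped to [v_min, v_max]."""
    frac = (hr - hr_rest) / float(hr_max - hr_rest)
    frac = min(1.0, max(0.0, frac))  # clamp to the valid range
    return v_min + frac * (v_max - v_min)
```

Coupling avatar speed to exercise intensity in this way is what lets a VR-augmented ergometer keep participants within a target cardio-respiratory training zone.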

  8. There's an app for that shirt! Evaluation of augmented reality tracking methods on deformable surfaces for fashion design

    NASA Astrophysics Data System (ADS)

    Ruzanka, Silvia; Chang, Ben; Behar, Katherine

    2013-03-01

    In this paper we present appARel, a creative research project at the intersection of augmented reality, fashion, and performance art. appARel is a mobile augmented reality application that transforms otherwise ordinary garments with 3D animations and modifications. With appARel, entire fashion collections can be uploaded in a smartphone application, and "new looks" can be downloaded in a software update. The project will culminate in a performance art fashion show, scheduled for March 2013. appARel includes textile designs incorporating fiducial markers, garment designs that incorporate multiple markers with the human body, and iOS and Android apps that apply different augments, or "looks", to a garment. We discuss our philosophy for combining computer-generated and physical objects, and share the challenges we encountered in applying fiducial markers to the 3D curvatures of the human body.

  9. Two Innovative Steps for Training on Maintenance: 'VIRMAN' Spanish Project based on Virtual Reality 'STARMATE' European Project based on Augmented Reality

    SciTech Connect

    Gonzalez Anez, Francisco

    2002-07-01

    This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training on maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. The objective is to create a computer tool for maintenance training course elaborations and training delivery based on 3D virtual reality models of complex components. The training delivery includes 3D record displays on maintenance procedures with all complementary information for intervention understanding. Users are requested to perform the maintenance intervention trying to follow up the procedure. Users can be evaluated about the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a forward step in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Societies Technologies'. A consortium of five companies and one research institute shares their expertise in this new technology. STARMATE provides two main functionalities (1) user assistance for achieving assembly/de-assembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, which is a growing area in Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene, generated by a computer, augmenting the reality with additional information. The user interface is see-through goggles, headphones, microphone and an optical tracking system. All these devices are integrated in a helmet connected with two regular computers. The user has his hands free for performing the maintenance intervention and he can navigate in the virtual

  10. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-06-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The case study revealed that, for the given situation, a semi-circle-shaped arrangement is desirable, while the pick-and-place system and the final generated G-code showed maximum deviations of 3.83 mm and 5.8 mm, respectively.
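
The axis-aligned bounding box (AABB) collision detection mentioned above rests on a simple per-axis interval-overlap test: two boxes intersect exactly when their extents overlap on every axis. A minimal sketch:

```python
def aabb_overlap(min_a, max_a, min_b, max_b):
    """3D axis-aligned bounding-box intersection test.
    Each box is given by its minimum and maximum corner coordinates;
    boxes overlap iff their intervals overlap on all three axes."""
    return all(min_a[i] <= max_b[i] and min_b[i] <= max_a[i]
               for i in range(3))
```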

  13. Development of CFD model for augmented core tripropellant rocket engine

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth M.

    1994-10-01

    The Space Shuttle era has made major advances in technology and vehicle design to the point that the concept of a single-stage-to-orbit (SSTO) vehicle appears more feasible. NASA presently is conducting studies into the feasibility of certain advanced concept rocket engines that could be utilized in a SSTO vehicle. One such concept is a tripropellant system which burns kerosene and hydrogen initially and at altitude switches to hydrogen. This system will attain a larger mass fraction because LOX-kerosene engines have a greater average propellant density and greater thrust-to-weight ratio. This report describes the investigation to model the tripropellant augmented core engine. The physical aspects of the engine, the CFD code employed, and results of the numerical model for a single modular thruster are discussed.
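
The claim that LOX-kerosene engines have a greater average propellant density can be checked with the standard bulk-density formula, total mass over total volume for a given oxidizer-to-fuel mass ratio. The sketch below is illustrative; the densities and mixture ratios used in checking it are approximate textbook values, not figures from this report:

```python
def bulk_density(mix_ratio, rho_ox, rho_fuel):
    """Average (bulk) density of a bipropellant combination for a given
    oxidizer-to-fuel mass ratio: (m_ox + m_fuel) / (V_ox + V_fuel),
    normalized to one unit of fuel mass."""
    return (mix_ratio + 1.0) / (mix_ratio / rho_ox + 1.0 / rho_fuel)
```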

  14. Shaping Watersheds Exhibit: An Interactive, Augmented Reality Sandbox for Advancing Earth Science Education

    NASA Astrophysics Data System (ADS)

    Reed, S. E.; Kreylos, O.; Hsi, S.; Kellogg, L. H.; Schladow, G.; Yikilmaz, M. B.; Segale, H.; Silverman, J.; Yalowitz, S.; Sato, E.

    2014-12-01

    One of the challenges involved in learning earth science is the visualization of processes which occur over large spatial and temporal scales. Shaping Watersheds is an interactive 3D exhibit developed with support from the National Science Foundation by a team of scientists, science educators, exhibit designers, and evaluation professionals, in an effort to improve public understanding and stewardship of freshwater ecosystems. The hands-on augmented reality sandbox allows users to create topographic models by shaping real "kinetic" sand. The exhibit is augmented in real time by the projection of a color elevation map and contour lines which exactly match the sand topography, using a closed loop of a Microsoft Kinect 3D camera, simulation and visualization software, and a data projector. When an object (such as a hand) is sensed at a particular height above the sand surface, virtual rain appears as a blue visualization on the surface and a flow simulation (based on a depth-integrated version of the Navier-Stokes equations) moves the water across the landscape. The blueprints and software to build the sandbox are freely available online (http://3dh2o.org/71/) under the GNU General Public License, together with a facilitator's guide and a public forum (with how-to documents and FAQs). Using these resources, many institutions (20 and counting) have built their own exhibits to teach a wide variety of topics (ranging from watershed stewardship, hydrology, geology, topographic map reading, and planetary science) in a variety of venues (such as traveling science exhibits, K-12 schools, university earth science departments, and museums). Additional exhibit extensions and learning modules are planned such as tsunami modeling and prediction. Moreover, a study is underway at the Lawrence Hall of Science to assess how various aspects of the sandbox (such as visualization color scheme and level of interactivity) affect understanding of earth science concepts.
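
The projected color elevation map and contour lines described above can be sketched as a per-cell lookup on the sensed elevation grid. This toy version uses grayscale shading and a fixed contour interval, both simplifying assumptions far removed from the exhibit's actual software:

```python
import numpy as np

def colorize(elev, contour_interval=10.0, tol=0.5):
    """Map an elevation grid to normalized grayscale shading, and mark
    contour cells whose elevation lies within `tol` of a contour level."""
    lo, hi = float(elev.min()), float(elev.max())
    shade = (elev - lo) / (hi - lo if hi > lo else 1.0)
    # Distance (in elevation units) from the nearest contour level.
    near = np.abs(elev / contour_interval - np.round(elev / contour_interval))
    contours = near * contour_interval < tol
    return shade, contours
```

In the real exhibit this lookup runs in a closed loop: the Kinect senses the sand surface, the map is recomputed, and the projector overlays it in real time.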

  15. System for synthetic vision and augmented reality in future flight decks

    NASA Astrophysics Data System (ADS)

    Behringer, Reinhold; Tam, Clement K.; McGee, Joshua H.; Sundareswaran, Venkataraman; Vassiliou, Marius S.

    2000-06-01

    Rockwell Science Center is investigating novel human-computer interface techniques for enhancing the situational awareness in future flight decks. One aspect is to provide intuitive displays which provide the vital information and the spatial awareness by augmenting the real world with an overlay of relevant information registered to the real world. Such Augmented Reality (AR) techniques can be employed during bad weather scenarios to permit flying under Visual Flight Rules (VFR) in conditions which would normally require Instrument Flight Rules (IFR). These systems could easily be implemented on heads-up displays (HUD). The advantage of AR systems vs. purely synthetic vision (SV) systems is that the pilot can relate the information overlay to real objects in the world, whereas SV systems provide a constant virtual view, where inconsistencies can hardly be detected. The development of components for such a system led to a demonstrator implemented on a PC. A camera grabs video images, which are overlaid with registered information. Orientation of the camera is obtained from an inclinometer and a magnetometer; position is acquired from GPS. In a possible implementation in an airplane, the on-board attitude information can be used for obtaining correct registration. If visibility is sufficient, computer vision modules can be used to fine-tune the registration by matching visual cues with database features. Such technology would be especially useful for landing approaches. The current demonstrator provides a frame-rate of 15 fps, using a live video feed as background and an overlay of avionics symbology in the foreground. In addition, terrain rendering from a 1 arc sec. digital elevation model database can be overlaid to provide synthetic vision in case of limited visibility. For true outdoor testing (on ground level), the system has been implemented on a wearable computer.
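
Overlaying a geo-registered symbol on such a display reduces to converting the target's azimuth/elevation, relative to the sensed camera orientation, into pixel coordinates. A minimal linear-projection sketch; the function and its parameters are illustrative assumptions, not the demonstrator's implementation:

```python
import math

def project_symbol(az, el, cam_az, cam_el, fov_h, fov_v, w, h):
    """Place a geo-registered symbol (target azimuth/elevation, radians)
    on a w x h display, given the camera azimuth/elevation and fields of
    view. Returns (x, y) in pixels, or None when outside the view."""
    d_az = math.atan2(math.sin(az - cam_az), math.cos(az - cam_az))
    d_el = el - cam_el
    if abs(d_az) > fov_h / 2 or abs(d_el) > fov_v / 2:
        return None
    x = w / 2 + d_az / (fov_h / 2) * (w / 2)
    y = h / 2 - d_el / (fov_v / 2) * (h / 2)  # screen y grows downward
    return x, y
```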

  16. Stereoscopic augmented reality using ultrasound volume rendering for laparoscopic surgery in children

    NASA Astrophysics Data System (ADS)

    Oh, Jihun; Kang, Xin; Wilson, Emmanuel; Peters, Craig A.; Kane, Timothy D.; Shekhar, Raj

    2014-03-01

    In laparoscopic surgery, live video provides visualization of the exposed organ surfaces in the surgical field, but is unable to show internal structures beneath those surfaces. Laparoscopic ultrasound is often used to visualize the internal structures, but its use is limited to intermittent confirmation because of the need for an extra hand to maneuver the ultrasound probe. Other limitations of using ultrasound are the difficulty of interpretation and the need for an extra port. The size of the ultrasound transducer may also be too large for use in small children. In this paper, we report on an augmented reality (AR) visualization system that features continuous hands-free volumetric ultrasound scanning of the surgical anatomy and video imaging from a stereoscopic laparoscope. The acquisition of the volumetric ultrasound image is realized by precisely controlling a back-and-forth movement of an ultrasound transducer mounted on a linear slider. Furthermore, the ultrasound volume is refreshed several times per minute. This scanner will sit outside of the body in the envisioned use scenario and could even be integrated into the operating table. An overlay of the maximum intensity projection (MIP) of the ultrasound volume on the laparoscopic stereo video through geometric transformations yields an AR visualization system particularly suitable for children, because ultrasound is radiation-free and provides higher-quality images in small patients. The proposed AR representation promises to be better than one using ultrasound slice data.
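
The maximum intensity projection (MIP) used for the overlay keeps, for each display pixel, the brightest voxel along the viewing axis. On a regular voxel grid this is a simple reduction (shown here with NumPy as an illustration):

```python
import numpy as np

def max_intensity_projection(volume, axis=0):
    """Maximum intensity projection of a 3D intensity volume along the
    viewing axis: each output pixel keeps the brightest voxel."""
    return np.asarray(volume).max(axis=axis)
```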

  17. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning

    NASA Astrophysics Data System (ADS)

    Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca

    2009-02-01

    The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative simulation, but its strengths and limitations differ from those of MUVEs. Within a design-based research project, the researchers conducted multiple qualitative case studies across two middle schools (6th and 7th grade) and one high school (10th grade) in the northeastern United States to document the affordances and limitations of AR simulations from the student and teacher perspective. The researchers collected data through formal and informal interviews, direct observations, web site posts, and site documents. Teachers and students reported that the technology-mediated narrative and the interactive, situated, collaborative problem-solving affordances of the AR simulation were highly engaging, especially among students who had previously presented behavioral and academic challenges for their teachers. However, while the AR simulation provided potentially transformative added value, it simultaneously presented unique technological, managerial, and cognitive challenges to teaching and learning.

  18. Matching and reaching depth judgments with real and augmented reality targets.

    PubMed

    Swan, J Edward; Singh, Gurjot; Ellis, Stephen R

    2015-11-01

    Many compelling augmented reality (AR) applications require users to correctly perceive the location of virtual objects, some with accuracies as tight as 1 mm. However, measuring the perceived depth of AR objects at these accuracies has not yet been demonstrated. In this paper, we address this challenge by employing two different depth judgment methods, perceptual matching and blind reaching, in a series of three experiments, where observers judged the depth of real and AR target objects presented at reaching distances. Our experiments found that observers can accurately match the distance of a real target, but when viewing an AR target through collimating optics, their matches systematically overestimate the distance by 0.5 to 4.0 cm. However, these results can be explained by a model where the collimation causes the eyes' vergence angle to rotate outward by a constant angular amount. These findings give error bounds for using collimating AR displays at reaching distances, and suggest that for these applications, AR displays need to provide an adjustable focus. Our experiments further found that observers initially reach ∼4 cm too short, but reaching accuracy improves with both consistent proprioception and corrective visual feedback, and eventually becomes nearly as accurate as matching. PMID:26340777
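
    The paper's explanatory model, collimation rotating each eye's vergence outward by a constant angular amount, can be turned into numbers with basic binocular geometry: the vergence angle for a target at distance d is 2·atan(IPD/2d), and the matched distance is whatever distance corresponds to that angle minus the outward rotation. A sketch of that geometry; the 63 mm inter-pupillary distance and the rotation value in the check are illustrative assumptions, not figures from the study.

```python
import math

IPD = 0.063  # assumed typical inter-pupillary distance, metres

def vergence_angle(d, ipd=IPD):
    """Binocular vergence angle (radians) for a target at distance d."""
    return 2.0 * math.atan(ipd / (2.0 * d))

def matched_distance(true_d, outward_rot_rad, ipd=IPD):
    """Distance an observer would match if collimation rotates the eyes'
    vergence outward by a constant angle: reduce the true vergence angle
    by that amount and invert the geometry."""
    theta = vergence_angle(true_d, ipd) - outward_rot_rad
    return ipd / (2.0 * math.tan(theta / 2.0))
```

    With an illustrative outward rotation of 3 mrad, a target at 0.5 m is matched at roughly 0.51 m, an overestimate on the order of a centimetre, consistent in form with the 0.5 to 4.0 cm range the experiments report.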

  19. Clinical application of navigation surgery using augmented reality in the abdominal field.

    PubMed

    Okamoto, Tomoyoshi; Onda, Shinji; Yanaga, Katsuhiko; Suzuki, Naoki; Hattori, Asaki

    2015-04-01

    This article presents general principles and recent advancements in the clinical application of augmented reality-based navigation surgery (AR-based NS) for abdominal procedures, including a description of our clinical trial and its outcomes; current problems and future prospects are also discussed. The development of AR-based NS in the abdomen lags behind other fields because of intraoperative organ deformation and the existence of established modalities. Although there are only a few reports on the clinical use of AR-based NS in digestive surgery, sophisticated applications have often been reported in urology. The rapid spread of video- and robot-assisted surgeries, however, calls for this technology. We have worked to develop an AR-based NS system for hepatobiliary and pancreatic surgery, and developed a short rigid scope that enables surgeons to obtain a 3D view. We recently focused on pancreatic surgery, because intraoperative organ shifting there is minimal. The position of each organ in the overlaid image corresponded closely with that of the actual organ, with a mean registration error of about 5 mm. The intraoperative information generated by this system provided useful navigation. However, AR-based NS still has several problems to overcome, such as organ deformation, evaluation of utility, portability, and cost. PMID:24898629

  20. A high-accuracy surgical augmented reality system using enhanced integral videography image overlay.

    PubMed

    Zhang, Xinran; Chen, Guowen; Liao, Hongen

    2015-08-01

    Image-guided surgery has been used clinically to improve surgical safety and accuracy. Augmented reality (AR) techniques, which can provide intuitive image guidance, have evolved greatly in recent years. As one promising approach to surgical AR, integral videography (IV) autostereoscopic image overlay has achieved accurate fusion of full-parallax guidance into the surgical scene. This paper describes an image-enhanced, high-accuracy IV overlay system. A flexible optical image enhancement system (IES) is designed to increase the resolution and quality of the IV image. Furthermore, we introduce a novel IV rendering algorithm that improves spatial accuracy by taking into account the distortion introduced by the micro-lens array. Preliminary experiments validated that image accuracy and resolution are improved with the proposed methods. The resolution of the IV image could be improved to 1 mm for a micro-lens array with a pitch of 2.32 mm and an IES magnification of 0.5. The relative accuracy deviations in the depth and lateral directions are -4.68 ± 0.83% and -9.01 ± 0.42%. PMID:26737223

  1. Fast Scene Recognition and Camera Relocalisation for Wide Area Augmented Reality Systems

    PubMed Central

    Guan, Tao; Duan, Liya; Chen, Yongjian; Yu, Junqing

    2010-01-01

    This paper focuses on online scene learning and fast camera relocalisation, two key problems currently limiting the performance of wide-area augmented reality systems. First, we propose using adaptive random trees for online scene learning; the algorithm provides higher recognition rates than traditional methods, especially in large-scale workspaces. Second, we use an enhanced PROSAC algorithm to obtain a fast camera relocalisation method. Compared with traditional algorithms, our method significantly reduces computational complexity, which greatly speeds up online camera relocalisation. Finally, we implement our algorithms in a multithreaded manner using a parallel-computing scheme: camera tracking, scene mapping, scene learning and relocalisation are separated into four threads on a multi-CPU hardware architecture. While providing real-time tracking performance, the resulting system can also track multiple maps simultaneously. Experiments demonstrate the validity of our methods. PMID:22219700
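
    PROSAC's core idea, ordering correspondences by a quality score and drawing early hypothesis samples only from the best-ranked ones so that a good model is usually found far sooner than with uniform RANSAC, can be shown on a toy 2-D line-fitting problem. This is a stripped-down illustration of plain PROSAC, not the paper's enhanced variant; the linear pool-growth schedule and all names are assumptions.

```python
import random

def prosac_line(points_by_quality, tol=0.05, iters=200, seed=0):
    """Minimal PROSAC-style estimator for a 2-D line y = a*x + b.
    `points_by_quality` must be sorted best-first; early iterations sample
    pairs only from the top-ranked points and the pool grows over time."""
    rng = random.Random(seed)
    n = len(points_by_quality)
    best, best_inliers = None, -1
    for t in range(iters):
        # Progressive pool: start with the 2 best points, grow toward n.
        pool = max(2, min(n, 2 + (t * n) // iters))
        (x1, y1), (x2, y2) = rng.sample(points_by_quality[:pool], 2)
        if x1 == x2:
            continue  # vertical pair, skip this hypothesis
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # Score the hypothesis against ALL points, not just the pool.
        inliers = sum(abs(y - (a * x + b)) <= tol
                      for x, y in points_by_quality)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best, best_inliers
```

    Because the first iterations sample from the highest-quality matches, the very first hypothesis here already comes from two inliers; uniform RANSAC would need many more draws on average to hit an all-inlier pair, which is the complexity reduction the abstract refers to.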

  3. Holographic display for see-through augmented reality using mirror-lens holographic optical element.

    PubMed

    Li, Gang; Lee, Dukho; Jeong, Youngmo; Cho, Jaebum; Lee, Byoungho

    2016-06-01

    A holographic display system for realizing three-dimensional optical see-through augmented reality (AR) is proposed. A multi-functional holographic optical element (HOE), which simultaneously performs the optical functions of a mirror and a lens, is adopted in the system. In the proposed method, a mirror that guides the light source into a reflection-type spatial light modulator (SLM) and a lens that serves as the Fourier-transforming optics are recorded on a single holographic recording material using an angular multiplexing technique for volume holograms. The HOE is transparent and performs these optical functions only under the Bragg-matched condition. Therefore, real-world scenes, which are usually distorted by the Fourier lens or the SLM in conventional holographic displays, can be observed without visual disturbance using the proposed mirror-lens HOE (MLHOE). Furthermore, to achieve an optimized optical recording condition for the MLHOE, the optical characteristics of the holographic material are measured. The proposed holographic AR display system is verified experimentally. PMID:27244395

  4. A learning performance study between the conventional approach and augmented reality textbook among secondary school students

    NASA Astrophysics Data System (ADS)

    Gopalan, Valarmathie; Zulkifli, Abdul Nasir; Bakar, Juliana Aida Abu

    2016-08-01

    Malaysia is moving towards becoming a developed nation by 2020, and adequate human resources in science-related fields are one requirement for achieving developed-nation status. Unfortunately, there is a downward trend in the number of students pursuing the science stream at the secondary school level. This paper introduces an enhanced science textbook using Augmented Reality (eSTAR) intended to motivate students' interest in science. The eSTAR was implemented as a supplement to conventional science teaching and learning methods in secondary schools. A learning performance study with a control group was conducted to investigate the effectiveness of the eSTAR for science learning among a sample of 140 Form Two secondary school students. The results indicate that both groups showed a significant difference in mean scores between the pre-test and post-test. Students using the eSTAR scored higher on the post-test, resulting in better learning performance than those exposed to conventional science learning alone. Overall, the results show that the students benefited from the combined use of the conventional and eSTAR learning approaches.

  5. Application of the Augmented Reality in prototyping the educational simulator in sport - the example of judo

    NASA Astrophysics Data System (ADS)

    Cieśliński, Wojciech B.; Sobecki, Janusz; Piepiora, Paweł A.; Piepiora, Zbigniew N.; Witkowski, Kazimierz

    2016-04-01

    Mental training (Galloway, 2011) is one of the measures of psychological preparation in sport. A discipline such as judo particularly requires mental training, because judo is a combat sport involving the direct physical confrontation of two opponents; hence mental preparation should be an essential element of preparing for a fight. The article describes the basics of AR systems and presents selected components: Vuzix see-through glasses, the Kinect sensor and the Multitap interactive floor. Next, scenarios are proposed for using AR in mental training, based on both the Vuzix head-mounted glasses and the Multitap interactive floor. All variants except the last also use the Kinect sensor. In addition, the variants differ in the primary user of the system: the competitor, the coach, or both at the same time. The article concludes with methods for assessing the effectiveness, usefulness and user experience of the proposed prototypes. Three prototype models of an educational training simulator in sport (judo) are presented, with their functionality described in terms of the theory of sports training (the cyclical nature of sports training) and the theory of subtle interactions, enabling an explanation of the effects of sports training using augmented reality technology.

  6. SoftAR: visually manipulating haptic softness perception in spatial augmented reality.

    PubMed

    Punpongsanon, Parinya; Iwai, Daisuke; Sato, Kosuke

    2015-11-01

    We present SoftAR, a novel spatial augmented reality (AR) technique based on a pseudo-haptics mechanism that visually manipulates the sense of softness perceived by a user pushing a soft physical object. Considering the limitations of projection-based approaches that change only the surface appearance of a physical object, we propose two projection visual effects, i.e., surface deformation effect (SDE) and body appearance effect (BAE), on the basis of the observations of humans pushing physical objects. The SDE visualizes a two-dimensional deformation of the object surface with a controlled softness parameter, and BAE changes the color of the pushing hand. Through psychophysical experiments, we confirm that the SDE can manipulate softness perception such that the participant perceives significantly greater softness than the actual softness. Furthermore, fBAE, in which BAE is applied only for the finger area, significantly enhances manipulation of the perception of softness. We create a computational model that estimates perceived softness when SDE+fBAE is applied. We construct a prototype SoftAR system in which two application frameworks are implemented. The softness adjustment allows a user to adjust the softness parameter of a physical object, and the softness transfer allows the user to replace the softness with that of another object. PMID:26340774

  7. An augmented reality home-training system based on the mirror training and imagery approach.

    PubMed

    Trojan, Jörg; Diers, Martin; Fuchs, Xaver; Bach, Felix; Bekrater-Bodmann, Robin; Foell, Jens; Kamping, Sandra; Rance, Mariela; Maaß, Heiko; Flor, Herta

    2014-09-01

    Mirror training and movement imagery have been demonstrated to be effective in treating several clinical conditions, such as phantom limb pain, stroke-induced hemiparesis, and complex regional pain syndrome. This article presents an augmented reality home-training system based on the mirror and imagery treatment approaches for hand training. A head-mounted display equipped with cameras captures one hand held in front of the body, mirrors this hand, and displays it in real time in a set of four different training tasks: (1) flexing fingers in a predefined sequence, (2) moving the hand into a posture fitting into a silhouette template, (3) driving a "Snake" video game with the index finger, and (4) grasping and moving a virtual ball. The system records task performance and transfers these data to a central server via the Internet, allowing monitoring of training progress. We evaluated the system by having 7 healthy participants train with it over the course of ten sessions of 15-min duration. No technical problems emerged during this time. Performance indicators showed that the system achieves a good balance between relatively easy and more challenging tasks and that participants improved significantly over the training sessions. This suggests that the system is well suited to maintain motivation in patients, especially when it is used for a prolonged period of time. PMID:24338625

  8. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research

    NASA Astrophysics Data System (ADS)

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2013-08-01

    Augmented reality (AR) is currently considered to have potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper first identifies two major approaches to utilizing AR technology in science education, named image-based AR and location-based AR, which may afford science learning differently: students' spatial ability, practical skills, and conceptual understanding are often afforded by image-based AR, while location-based AR usually supports inquiry-based scientific activities. After examining what has been done in science learning with AR support, several suggestions for future research are proposed. For example, more research is required to explore the learning experience (e.g., motivation or cognitive load) and learner characteristics (e.g., spatial ability or perceived presence) involved in AR. Mixed methods of investigating the learning process (e.g., content analysis and sequential analysis) and in-depth examination of user experience beyond usability (e.g., affective variables of esthetic pleasure or emotional fulfillment) should be considered. Combining image-based and location-based AR technology may bring new possibilities for supporting science learning. Theories including mental models, spatial cognition, situated cognition, and social constructivist learning are suggested as profitable bases for future AR research in science education.

  9. From Motion to Photons in 80 Microseconds: Towards Minimal Latency for Virtual and Augmented Reality.

    PubMed

    Lincoln, Peter; Blate, Alex; Singh, Montek; Whitted, Turner; State, Andrei; Lastra, Anselmo; Fuchs, Henry

    2016-04-01

    We describe an augmented reality, optical see-through display based on a DMD chip with an extremely fast (16 kHz) binary update rate. We combine the techniques of post-rendering 2-D offsets and just-in-time tracking updates with a novel modulation technique for turning binary pixels into perceived gray scale. These processing elements, implemented in an FPGA, are physically mounted along with the optical display elements in a head tracked rig through which users view synthetic imagery superimposed on their real environment. The combination of mechanical tracking at near-zero latency with reconfigurable display processing has given us a measured average of 80 µs of end-to-end latency (from head motion to change in photons from the display) and also a versatile test platform for extremely-low-latency display systems. We have used it to examine the trade-offs between image quality and cost (i.e. power and logical complexity) and have found that quality can be maintained with a fairly simple display modulation scheme. PMID:26780797
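
    One simple way to obtain perceived gray scale from a fast binary display is temporal dithering: spread each gray level over a burst of binary frames so that the fraction of on-frames matches the target intensity, and let the eye integrate the burst. The following sketch uses a per-pixel error-accumulation scheme; it illustrates the general idea only and is not the paper's modulation technique, whose details differ.

```python
def binary_frame_plan(gray, n_frames=16):
    """Spread a gray level in [0, 1] over n_frames binary frames by
    accumulating error, so the density of on-frames matches the target
    intensity when integrated over the burst."""
    frames, err = [], 0.0
    for _ in range(n_frames):
        err += gray
        if err >= 0.5:
            frames.append(1)   # fire the pixel this frame
            err -= 1.0
        else:
            frames.append(0)
    return frames
```

    At a 16 kHz binary rate, a 16-frame burst still refreshes at 1 kHz, which is why such schemes can trade bit depth for latency so aggressively; the paper's finding is that a fairly simple modulation of this general kind already maintains image quality.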

  10. Stereovision and augmented reality for closed-loop control of grasping in hand prostheses

    NASA Astrophysics Data System (ADS)

    Markovic, Marko; Dosen, Strahinja; Cipriani, Christian; Popovic, Dejan; Farina, Dario

    2014-08-01

    Objective. Technologically advanced assistive devices are nowadays available to restore grasping, but effective and effortless control integrating both feed-forward (commands) and feedback (sensory information) is still missing. The goal of this work was to develop a user friendly interface for the semi-automatic and closed-loop control of grasping and to test its feasibility. Approach. We developed a controller based on stereovision to automatically select grasp type and size and augmented reality (AR) to provide artificial proprioceptive feedback. The system was experimentally tested in healthy subjects using a dexterous hand prosthesis to grasp a set of daily objects. The subjects wore AR glasses with an integrated stereo-camera pair, and triggered the system via a simple myoelectric interface. Main results. The results demonstrated that the subjects got easily acquainted with the semi-autonomous control. The stereovision grasp decoder successfully estimated the grasp type and size in realistic, cluttered environments. When allowed (forced) to correct the automatic system decisions, the subjects successfully utilized the AR feedback and achieved close to ideal system performance. Significance. The new method implements a high level, low effort control of complex functions in addition to the low level closed-loop control. The latter is achieved by providing rich visual feedback, which is integrated into the real life environment. The proposed system is an effective interface applicable with small alterations for many advanced prosthetic and orthotic/therapeutic rehabilitation devices.

  11. Providing IoT Services in Smart Cities through Dynamic Augmented Reality Markers.

    PubMed

    Chaves-Diéguez, David; Pellitero-Rivero, Alexandre; García-Coego, Daniel; González-Castaño, Francisco Javier; Rodríguez-Hernández, Pedro Salvador; Piñeiro-Gómez, Óscar; Gil-Castiñeira, Felipe; Costa-Montenegro, Enrique

    2015-01-01

    Smart cities are expected to improve the quality of life of citizens by relying on new paradigms, such as the Internet of Things (IoT) and its capacity to manage and interconnect thousands of sensors and actuators scattered across the city. At the same time, mobile devices widely assist professional and personal everyday activities. A very good example of the potential of these devices for smart cities is their powerful support for intuitive service interfaces (such as those based on augmented reality (AR)) for non-expert users. In our work, we consider a scenario that combines IoT and AR within a smart city maintenance service to improve the accessibility of sensor and actuator devices in the field, where responsiveness is crucial. In it, depending on the location and needs of each service, data and commands will be transported by an urban communications network or consulted on the spot. Direct AR interaction with urban objects has already been described; it usually relies on 2D visual codes to deliver object identifiers (IDs) to the rendering device to identify object resources. These IDs allow information about the objects to be retrieved from a remote server. In this work, we present a novel solution that replaces static AR markers with dynamic markers based on LED communication, which can be decoded through cameras embedded in smartphones. These dynamic markers can directly deliver sensor information to the rendering device, on top of the object ID, without further network interaction. PMID:26151215
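
    The dynamic-marker idea, an LED blinking a payload that the smartphone camera samples once per frame, reduces at its simplest to thresholding per-frame brightness and packing the bits. The sketch assumes an idealized camera synchronized to the blink rate, with no framing, clock recovery or error correction, all of which a real deployment would need; the names are illustrative.

```python
def decode_led_frames(brightness, threshold=0.5):
    """Decode a stream of per-frame LED brightness samples (one sample
    per camera frame) into bytes: threshold each sample to a bit, then
    pack 8 bits per byte, most significant bit first."""
    bits = [1 if b > threshold else 0 for b in brightness]
    out = []
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)
```

    The payload decoded this way can carry the sensor reading itself on top of the object ID, which is what lets the rendering device skip the round-trip to a remote server.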

  12. Hand gesture guided robot-assisted surgery based on a direct augmented reality interface.

    PubMed

    Wen, Rong; Tay, Wei-Liang; Nguyen, Binh P; Chng, Chin-Boon; Chui, Chee-Kong

    2014-09-01

    Radiofrequency (RF) ablation is a good alternative to hepatic resection for treatment of liver tumors. However, accurate needle insertion requires precise hand-eye coordination and is also affected by the difficulty of RF needle navigation. This paper proposes a cooperative surgical robot system, guided by hand gestures and supported by an augmented reality (AR)-based surgical field, for robot-assisted percutaneous treatment. It establishes a robot-assisted natural AR guidance mechanism that incorporates the advantages of the following three aspects: AR visual guidance information, surgeon's experiences and accuracy of robotic surgery. A projector-based AR environment is directly overlaid on a patient to display preoperative and intraoperative information, while a mobile surgical robot system implements specified RF needle insertion plans. Natural hand gestures are used as an intuitive and robust method to interact with both the AR system and surgical robot. The proposed system was evaluated on a mannequin model. Experimental results demonstrated that hand gesture guidance was able to effectively guide the surgical robot, and the robot-assisted implementation was found to improve the accuracy of needle insertion. This human-robot cooperative mechanism is a promising approach for precise transcutaneous ablation therapy. PMID:24438993

  13. 3D global estimation and augmented reality visualization of intra-operative X-ray dose.

    PubMed

    Rodas, Nicolas Loy; Padoy, Nicolas

    2014-01-01

    The growing use of image-guided minimally-invasive surgical procedures is confronting clinicians and surgical staff with new radiation exposure risks from X-ray imaging devices. The accurate estimation of intra-operative radiation exposure can increase staff awareness of radiation exposure risks and enable the implementation of well-adapted safety measures. The current surgical practice of wearing a single dosimeter at chest level to measure radiation exposure does not provide a sufficiently accurate estimation of radiation absorption throughout the body. In this paper, we propose an approach that combines data from wireless dosimeters with the simulation of radiation propagation in order to provide a global radiation risk map in the area near the X-ray device. We use a multi-camera RGBD system to obtain a 3D point cloud reconstruction of the room. The positions of the table, C-arm and clinician are then used 1) to simulate the propagation of radiation in a real-world setup and 2) to overlay the resulting 3D risk-map onto the scene in an augmented reality manner. By using real-time wireless dosimeters in our system, we can both calibrate the simulation and validate its accuracy at specific locations in real-time. We demonstrate our system in an operating room equipped with a robotised X-ray imaging device and validate the radiation simulation on several X-ray acquisition setups. PMID:25333145
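
    The content of such an overlay, a per-point risk value in the area around the X-ray device, can be approximated with the classic inverse-square falloff plus a color ramp, with a single real dosimeter reading calibrating the scale. This is a deliberate simplification: the paper simulates full radiation propagation, including scatter from the table and staff; the names and thresholds here are assumptions.

```python
import numpy as np

def dose_map(points, source, dose_rate_1m):
    """Estimate relative dose rate at 3-D points from a point-like X-ray
    scatter source using inverse-square falloff.  `points` is (N, 3),
    `source` is (3,), `dose_rate_1m` is a measured rate at 1 m (e.g. from
    a calibrated wireless dosimeter)."""
    diff = np.asarray(points, float) - np.asarray(source, float)
    r2 = np.sum(diff ** 2, axis=1)
    return dose_rate_1m / np.maximum(r2, 1e-6)  # clamp to avoid div-by-zero

def risk_colors(doses, low, high):
    """Map dose rates to a green-to-red risk value in [0, 1] for the
    AR overlay: 0 at or below `low`, 1 at or above `high`."""
    return np.clip((doses - low) / (high - low), 0.0, 1.0)
```

    Evaluating `dose_map` over the RGBD point cloud of the room and coloring each point with `risk_colors` gives the kind of 3-D risk map that can be overlaid onto the scene; the real-time dosimeters then serve to validate the predicted values at specific locations.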

  14. Augmented reality technology for day/night situational awareness for the dismounted Soldier

    NASA Astrophysics Data System (ADS)

    Gans, Eric; Roberts, David; Bennett, Matthew; Towles, Herman; Menozzi, Alberico; Cook, James; Sherrill, Todd

    2015-05-01

    This paper describes Applied Research Associates' (ARA) recent advances in Soldier augmented reality (AR) technology. Our AR technology, called ARC4, delivers heads-up situational awareness to the dismounted warfighter, enabling non-line-of-sight team coordination in distributed operations. ARC4 combines compact head tracking sensors with advanced pose estimation algorithms, network management software, and an intuitive AR visualization interface to overlay tactical iconic information accurately on the user's real-world view. The technology supports heads-up navigation, blue-force tracking, target handoff, image sharing, and tagging of features in the environment. It integrates seamlessly with established network protocols (e.g., Cursor-on-Target) and Command and Control software tools (e.g., Nett Warrior, Android Tactical Assault Kit) and interfaces with a wide range of daytime see-through displays and night vision goggles to deliver real-time actionable intelligence, day or night. We describe our pose estimation framework, which fuses inertial data, magnetometer data, GPS, DTED, and digital imagery to provide measurements of the operator's precise orientation. These measurements leverage mountainous terrain horizon geometry, known landmarks, and sun position, enabling ARC4 to achieve significant improvements in accuracy compared to conventional INS/GPS solutions of similar size, weight, and power. We detail current research and development efforts toward helmet-based and handheld AR systems for operational use cases and describe extensions to immersive training applications.

  15. Composite lay-up process with application of elements of augmented reality

    NASA Astrophysics Data System (ADS)

    Novák-Marcinčin, Jozef; Barna, Jozef; Janák, Miroslav; Fečová, Veronika; Nováková-Marcinčinová, L'udmila

    2012-03-01

    This article deals with virtual tools, based on open-source principles, applied to composite lay-up technology. It describes the virtual software and hardware elements necessary for work in an augmented reality environment. It first focuses on general problems of applying virtual components to the composite lay-up process, and clarifies the fundamental philosophy of the newly created application and the visual-scripting process used for program development. It then provides a closer view of the particular logical sections where the necessary data are gathered and compared with values from virtual arrays. A new device is also described, with an adjustable operating desk that enables detailed control of any realized manufacturing process. This positioning table can determine and set the position of the working plane via commands in the computer interface or manual changes. Information about the exact position of individual layers is obtained in real time thanks to built-in sensors: one tracks changes in the desk position (X, Y, Z), while the other checks rotation around the main axis at the center of the table. The new software consists of four main logical areas, with data packets coming from internal computer components as well as external devices. Finally, the display section can animate the movement of the virtual item (a composite layer) along its trajectory. The article presents a new approach to realizing the composite lay-up process and concludes with possible future improvements and other application possibilities.

  16. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    NASA Astrophysics Data System (ADS)

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-02-01

    Technologies and new tools for educational purposes continue to evolve. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on CT and MRI images, dissections and drawings. For ARBOOK evaluation, a specific questionnaire of three blocks was developed and validated according to the Delphi method. The questionnaire covered motivation and attention tasks, autonomous work, and three-dimensional interpretation tasks. A total of 211 students from 7 public and private Spanish universities were divided into two groups. The control group received standard teaching sessions supported by books and video. The ARBOOK group received the same standard sessions but additionally used the ARBOOK tool. At the end of the training, the students completed a written test on lower-limb anatomy. Significantly better scores for the ARBOOK group were found on attention-motivation, autonomous work and three-dimensional comprehension tasks. Additionally, the ARBOOK group obtained significantly better scores on the written test. The results strongly suggest that the use of AR is suitable for anatomical teaching; in particular, they indicate how this technology helps student motivation, autonomous work and spatial interpretation. Such technologies deserve even closer attention at the present moment, when new technologies are being naturally incorporated into everyday life.

  17. Augmented reality user interface for mobile ground robots with manipulator arms

    NASA Astrophysics Data System (ADS)

    Vozar, Steven; Tilbury, Dawn M.

    2011-01-01

    Augmented Reality (AR) is a technology in which real-world visual data is combined with an overlay of computer graphics, enhancing the original feed. AR is an attractive tool for teleoperated unmanned ground vehicle (UGV) user interfaces (UIs), as it can improve communication between robots and users via an intuitive spatial and visual dialogue, thereby increasing operator situational awareness. The successful operation of UGVs often relies upon both chassis navigation and manipulator arm control, and since existing literature usually focuses on one task or the other, there is a gap in mobile robot UIs that take advantage of AR for both applications. This work describes the development and analysis of an AR UI system for a UGV with an attached manipulator arm. The system supplements a video feed shown to an operator with information about geometric relationships within the robot task space to improve the operator's situational awareness. Previous studies on AR systems and preliminary analyses indicate that such an implementation of AR for a mobile robot with a manipulator arm should improve operator performance. A full user study could determine whether this hypothesis is supported, by performing an analysis of variance on common test metrics associated with UGV teleoperation.

  18. MEDIASSIST: medical assistance for intraoperative skill transfer in minimally invasive surgery using augmented reality

    NASA Astrophysics Data System (ADS)

    Sudra, Gunther; Speidel, Stefanie; Fritz, Dominik; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2007-03-01

    Minimally invasive surgery is a highly complex medical discipline with various risks for surgeon and patient, but it also has numerous advantages on the patient's side. The surgeon has to adopt special operating techniques and deal with difficulties such as complex hand-eye coordination, a limited field of view and restricted mobility. To alleviate these problems, we propose to support the surgeon's spatial cognition by using augmented reality (AR) techniques to directly visualize virtual objects in the surgical site. In order to generate intelligent support, an intraoperative assistance system is necessary that recognizes the surgical skills during the intervention and provides context-aware assistance to the surgeon using AR techniques. With MEDIASSIST we bundle our research activities in the field of intraoperative intelligent support and visualization. Our experimental setup consists of a stereo endoscope, an optical tracking system and a head-mounted display for 3D visualization. The framework will be used as a platform for the development and evaluation of our research in the field of skill recognition and context-aware assistance generation. This includes methods for surgical skill analysis, skill classification and context interpretation, as well as assistive visualization and interaction techniques. In this paper we present the objectives of MEDIASSIST and first results in the fields of skill analysis, visualization and multi-modal interaction. In particular, we present markerless instrument tracking for surgical skill analysis as well as visualization techniques and recognition of interaction gestures in an AR environment.

  19. INPRES (intraoperative presentation of surgical planning and simulation results): augmented reality for craniofacial surgery

    NASA Astrophysics Data System (ADS)

    Salb, Tobias; Brief, Jakob; Welzel, Thomas; Giesler, Bjoern; Hassfeld, Steffan; Muehling, Joachim; Dillmann, Ruediger

    2003-05-01

    In this paper we present recent developments and pre-clinical validation results of our approach for augmented reality (AR, for short) in craniofacial surgery. A commercial Sony Glasstron display is used for optical see-through overlay of surgical planning and simulation results with a patient inside the operation room (OR). For the tracking of the glasses, of the patient and of various medical instruments an NDI Polaris system is used as standard solution. A complementary inside-out navigation approach has been realized with a panoramic camera. This device is mounted on the head of the surgeon for tracking of fiducials placed on the walls of the OR. Further tasks described include the calibration of the head-mounted display (HMD), the registration of virtual objects with the real world and the detection of occlusions in the object overlay with help of two miniature CCD cameras. The evaluation of our work took place in the laboratory environment and showed promising results. Future work will concentrate on the optimization of the technical features of the prototype and on the development of a system for everyday clinical use.

  20. Providing IoT Services in Smart Cities through Dynamic Augmented Reality Markers

    PubMed Central

    Chaves-Diéguez, David; Pellitero-Rivero, Alexandre; García-Coego, Daniel; González-Castaño, Francisco Javier; Rodríguez-Hernández, Pedro Salvador; Piñeiro-Gómez, Óscar; Gil-Castiñeira, Felipe; Costa-Montenegro, Enrique

    2015-01-01

    Smart cities are expected to improve the quality of life of citizens by relying on new paradigms, such as the Internet of Things (IoT) and its capacity to manage and interconnect thousands of sensors and actuators scattered across the city. At the same time, mobile devices widely assist professional and personal everyday activities. A very good example of the potential of these devices for smart cities is their powerful support for intuitive service interfaces (such as those based on augmented reality (AR)) for non-expert users. In our work, we consider a scenario that combines IoT and AR within a smart city maintenance service to improve the accessibility of sensor and actuator devices in the field, where responsiveness is crucial. In it, depending on the location and needs of each service, data and commands will be transported by an urban communications network or consulted on the spot. Direct AR interaction with urban objects has already been described; it usually relies on 2D visual codes to deliver object identifiers (IDs) to the rendering device to identify object resources. These IDs allow information about the objects to be retrieved from a remote server. In this work, we present a novel solution that replaces static AR markers with dynamic markers based on LED communication, which can be decoded through cameras embedded in smartphones. These dynamic markers can directly deliver sensor information to the rendering device, on top of the object ID, without further network interaction. PMID:26151215
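
    The dynamic markers described here deliver sensor readings over LED light captured by a smartphone camera. A minimal sketch of the underlying idea, simple on-off keying decoded from per-frame brightness samples of the marker region (a deliberate simplification; the paper's actual modulation and framing are not reproduced here), might look like:

    ```python
    def decode_ook_frames(samples, threshold):
        """Threshold per-frame brightness samples of the LED region into bits
        (one bit per camera frame: bright = 1, dark = 0)."""
        return [1 if s >= threshold else 0 for s in samples]

    def bits_to_bytes(bits):
        """Pack bits (MSB first) into bytes; a trailing partial byte is dropped."""
        out = bytearray()
        for i in range(0, len(bits) - 7, 8):
            byte = 0
            for b in bits[i:i + 8]:
                byte = (byte << 1) | b
            out.append(byte)
        return bytes(out)
    ```

    A practical decoder would also need frame synchronization and error detection, which this sketch omits.
    
    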

  1. Augmented reality visualization in head and neck surgery: an overview of recent findings in sentinel node biopsy and future perspectives.

    PubMed

    Profeta, Andrea Corrado; Schilling, Clare; McGurk, Mark

    2016-07-01

    "Augmented reality visualisation", in which the site of an operation is merged with computer-generated graphics, provides a way to view the relevant part of the patient's body in better detail. We describe its role in relation to sentinel lymph node biopsy (SLNB), current advancements, and future directions in the excision of tumours in early-stage cancers of the head and neck. PMID:26809235

  2. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: a pilot study

    PubMed Central

    Suenaga, Hideyuki; Hoang Tran, Huy; Liao, Hongen; Masamune, Ken; Dohi, Takeyoshi; Hoshi, Kazuto; Mori, Yoshiyuki; Takato, Tsuyoshi

    2013-01-01

    This pilot study evaluated the feasibility and accuracy of a three-dimensional augmented reality system incorporating integral videography for imaging oral and maxillofacial regions, based on preoperative computed tomography data. Three-dimensional surface models of the jawbones, based on the computed tomography data, were used to create the integral videography images of a subject's maxillofacial area. The three-dimensional augmented reality system (integral videography display, computed tomography, a position tracker and a computer) was used to generate a three-dimensional overlay that was projected on the surgical site via a half-silvered mirror. Thereafter, a feasibility study was performed on a volunteer. The accuracy of this system was verified on a solid model while simulating bone resection. Positional registration was attained by identifying and tracking the position of the patient and the surgical instrument. Thus, integral videography images of jawbones, teeth and the surgical tool were superimposed in the correct position. Stereoscopic images viewed from various angles were accurately displayed. A change in the viewing angle did not negatively affect the surgeon's ability to simultaneously observe the three-dimensional images and the patient without special glasses. The difference in the three-dimensional position of each measuring point between the solid model and the augmented reality navigation was almost negligible (<1 mm), indicating that the system was highly accurate. This augmented reality system was highly accurate and effective for surgical navigation and for overlaying a three-dimensional computed tomography image on a patient's surgical area, enabling the surgeon to understand the positional relationship between the preoperative image and the actual surgical site with the naked eye. PMID:23703710
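
    Positional registration of the kind described, aligning tracked points on the patient with their preoperative CT coordinates, is commonly solved as a least-squares rigid transform. The sketch below uses the standard Kabsch/SVD method; this is an assumption for illustration, as the paper does not specify its registration algorithm:

    ```python
    import numpy as np

    def rigid_register(src, dst):
        """Least-squares rigid transform (R, t) mapping src points onto dst,
        via the Kabsch/SVD method. Points are given as N x d arrays."""
        src = np.asarray(src, float)
        dst = np.asarray(dst, float)
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        # Cross-covariance of the centered point sets.
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        # Flip the smallest singular direction if needed to avoid a reflection.
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        D = np.diag([1.0] * (src.shape[1] - 1) + [d])
        R = Vt.T @ D @ U.T
        t = cd - R @ cs
        return R, t
    ```

    Applying `R @ p + t` to each source point then superimposes the model on the tracked anatomy in a least-squares sense.
    
    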

  3. Mad City Mystery: Developing Scientific Argumentation Skills with a Place-based Augmented Reality Game on Handheld Computers

    NASA Astrophysics Data System (ADS)

    Squire, Kurt D.; Jan, Mingfong

    2007-02-01

    While the knowledge economy has reshaped the world, schools lag behind in producing appropriate learning for this social change. Science education needs to prepare students for a future world in which multiple representations are the norm and adults are required to "think like scientists." Location-based augmented reality games offer an opportunity to create a "post-progressive" pedagogy in which students are not only immersed in authentic scientific inquiry, but also required to perform in adult scientific discourses. This cross-case comparison as a component of a design-based research study investigates three cases (roughly 28 students total) where an Augmented Reality curriculum, Mad City Mystery, was used to support learning in environmental science. We investigate whether augmented reality games on handhelds can be used to engage students in scientific thinking (particularly argumentation), how game structures affect students' thinking, the impact of role playing on learning, and the role of the physical environment in shaping learning. We argue that such games hold potential for engaging students in meaningful scientific argumentation. Through game play, players are required to develop narrative accounts of scientific phenomena, a process that requires them to develop and argue scientific explanations. We argue that specific game features scaffold this thinking process, creating supports for student thinking non-existent in most inquiry-based learning environments.

  4. Fiber optics based jet engine augmenter viewing system

    NASA Astrophysics Data System (ADS)

    Murphy, P. J.; Jones, D. W.; Jones, R. R., III; Lennert, A. E.

    1988-06-01

    An augmenter viewing system employing a coherent fiber-optic array was developed for use in jet engine testing applications at AEDC. Real-time viewing of the test article afterburner was obtained in a severe environment under high temperature and vibration levels. The optical system consisted of a conventional front-end lens assembly coupled with the fiber-optic array, and a solid-state color video camera mounted inside the test cell. The advantages and problems associated with a fiber-optics-based viewing system will be discussed in comparison with more conventional viewing techniques for this application.

  5. Thrust Augmentation Measurements Using a Pulse Detonation Engine Ejector

    NASA Technical Reports Server (NTRS)

    Santoro, Robert J.; Pal, Sibtosh

    2005-01-01

    Results of an experimental effort on pulse detonation driven ejectors are presented and discussed. The experiments were conducted using a pulse detonation engine (PDE)/ejector setup that was specifically designed for the study and operated at frequencies up to 50 Hz. The results of various experiments designed to probe different aspects of the PDE/ejector setup are reported. The baseline PDE was operated using ethylene (C2H4) as the fuel and an oxygen/nitrogen (O2 + N2) mixture at an equivalence ratio of one. The PDE-only experiments included propellant mixture characterization using a laser absorption technique, high-fidelity thrust measurements using an integrated spring-damper system, and shadowgraph imaging of the detonation/shock wave structure emanating from the tube. The baseline PDE thrust measurement results at each desired frequency agree with experimental and modeling results reported in the literature. These PDE setup results were then used as a basis for quantifying thrust augmentation for various PDE/ejector setups with constant-diameter ejector tubes, varying the ejector length, the radius of curvature of the ejector inlet, and the detonation tube/ejector tube overlap distance. For the studied experimental matrix, the results showed a maximum thrust augmentation of 106% at an operational frequency of 30 Hz. The thrust augmentation results are complemented by shadowgraph imaging of the flowfield in the ejector tube inlet area and high-frequency pressure transducer measurements along the length of the ejector tube.
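
    A figure such as the 106% above follows the usual definition of thrust augmentation: the percentage gain of the PDE/ejector configuration over the bare PDE at the same operating condition. A hypothetical helper (the function name and example numbers are illustrative, not the paper's data):

    ```python
    def thrust_augmentation_pct(t_baseline, t_with_ejector):
        """Percentage thrust gain of the PDE/ejector setup over the bare PDE,
        measured at the same operating frequency."""
        if t_baseline <= 0:
            raise ValueError("baseline thrust must be positive")
        return 100.0 * (t_with_ejector - t_baseline) / t_baseline
    ```

    For instance, an ejector configuration producing 2.06 times the baseline thrust corresponds to 106% augmentation.
    
    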

  6. Clinical usefulness of augmented reality using infrared camera based real-time feedback on gait function in cerebral palsy: a case study

    PubMed Central

    Lee, Byoung-Hee

    2016-01-01

    [Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials. PMID:27190489

  7. Clinical usefulness of augmented reality using infrared camera based real-time feedback on gait function in cerebral palsy: a case study.

    PubMed

    Lee, Byoung-Hee

    2016-04-01

    [Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials. PMID:27190489

  8. Valorisation of Cultural Heritage Through Virtual Visit and Augmented Reality: the Case of the Abbey of Epau (france)

    NASA Astrophysics Data System (ADS)

    Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.

    2013-07-01

    Terrestrial laser scanning (TLS), 3-D modeling and Web visualization are the three key steps needed to preserve cultural heritage and grant free, wide access to it, as highlighted by many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey with both a rich history and delicate architecture. Virtuality is considered in two ways: fluid navigation around the abbey in a virtual reality environment, and a game activity using augmented reality. First, the data acquisition consists of a GPS and tacheometry survey, terrestrial laser scanning and photographic acquisition. After data pre-processing, the meshed and textured 3-D model is generated using the 3DReshaper commercial software. The virtual reality visit and the augmented reality animation are then created using the Unity software. This work demonstrates the value of such tools for highlighting regional cultural heritage and making it attractive to the public.

  9. Gas-Generator Augmented Expander Cycle Rocket Engine

    NASA Technical Reports Server (NTRS)

    Greene, William D. (Inventor)

    2011-01-01

    An augmented expander cycle rocket engine includes first and second turbopumps for respectively pumping fuel and oxidizer. A gas-generator receives a first portion of fuel output from the first turbopump and a first portion of oxidizer output from the second turbopump to ignite and discharge heated gas. A heat exchanger close-coupled to the gas-generator receives in a first conduit the discharged heated gas, and transfers heat to an adjacent second conduit carrying fuel exiting the cooling passages of a primary combustion chamber. Heat is transferred to the fuel passing through the cooling passages. The heated fuel enters the second conduit of the heat exchanger to absorb more heat from the first conduit, and then flows to drive a turbine of one or both of the turbopumps. The arrangement prevents the turbopumps' exposure to combusted gas, which could freeze in the turbomachinery and cause catastrophic failure upon attempted engine restart.

  10. FreshAiR and Field Studies—Augmenting Geological Reality with Mobile Devices

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Crompton, H.; Dunleavy, M.

    2014-12-01

    During the last decade, mobile devices have fomented a revolution in geological mapping. President Clinton set the stage for this revolution in the year 2000 when he ordered the discontinuation of Selective Availability, making reliable GPS available for civilian use. Geologists began using personal digital assistants and ruggedized tablet PCs for geolocation and data recording, and the pace of change accelerated with the development of mobile apps such as Google Maps, digital notebooks, and digital compass-clinometers. Despite these changes in map-making technologies, most students continue to learn geology in the field the old-fashioned way, by following a field trip leader as a group and trying to hear and understand lecturettes at the outcrop. In this presentation, we demonstrate the potential of a new Augmented Reality (AR) mobile app called "FreshAiR" to fundamentally change the way content knowledge and learning objectives are delivered to students in the field. FreshAiR, which was developed by co-author and ODU alumnus M.D., triggers content delivery to mobile devices based on proximity. Students holding their mobile devices to the horizon see trigger points superimposed on the field of view of the device's built-in camera. When they walk towards a trigger, information about the location pops up. This can include text, images, movies, and quiz questions (multiple choice and fill-in-the-blank). Students can use the app to reinforce the field trip leader's presentations, or they can visit outcrops individually at different times. This creates the possibility of asynchronous field classes, a concept with profound implications for distance education in the geosciences.
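
    Proximity triggering of the kind described reduces to a geodesic distance test between the user's GPS fix and each trigger point. A minimal sketch using the haversine formula (hypothetical names; not the app's actual code):

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two WGS-84 coordinates,
        using a mean Earth radius of 6371 km."""
        R = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def triggered(user, poi, radius_m=20.0):
        """True when the user's (lat, lon) fix is within the trigger radius
        of a point of interest."""
        return haversine_m(user[0], user[1], poi[0], poi[1]) <= radius_m
    ```

    The trigger radius would in practice be tuned to exceed typical consumer GPS error (several metres).
    
    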

  11. Development and Application of a New Learning Object for Teaching Operative Dentistry Using Augmented Reality.

    PubMed

    Espejo-Trung, Luciana Cardoso; Elian, Silvia Nagib; Luz, Maria Aparecia Alves De Cerqueira

    2015-11-01

    Learning objects (LOs) associated with augmented reality have been used as attractive new technological tools in the educational process. However, the acceptance of new LOs must be verified before such innovations are used in the learning process in general. The aim of this study was to develop a new LO and investigate its acceptance for teaching gold onlay preparation design at a dental school in Brazil. Questionnaires were designed to assess, first, the users' computational ability and knowledge of computers (Q1) and, second, the users' acceptance of the new LO (Q2). For both questionnaires, the internal consistency index was calculated to determine whether the questions were measuring the same construct. The reliability of Q2 was measured with a retest procedure. The LO was tested by dental students (n=28), professors and postgraduate students in dentistry and prosthetics (n=30), and dentists participating in a continuing education or remedial course in dentistry and/or prosthetics (n=19). Analyses of internal consistency (Kappa coefficient and Cronbach's alpha) demonstrated a high degree of confidence in the questionnaires. Simple linear regression tests were conducted between the response variable (Q2) and the following explanatory variables: the Q1 score, age, gender, and group. The results showed wide acceptance regardless of the subjects' computational ability (p=0.99; R2=0), gender (p=0.27; R2=1.6%), age (p=0.27; R2=0.1%), or group (p=0.53; R2=1.9%). The methodology used enabled the development of an LO with a high index of acceptance across all groups. PMID:26522642
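
    Cronbach's alpha, used here to check that the questionnaire items measure the same construct, is computed from the individual item variances and the variance of the total score: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). An illustrative implementation (not the authors' analysis code):

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha for a questionnaire, given one list of scores per
        item (all lists cover the same respondents, in the same order)."""
        k = len(items)
        n = len(items[0])

        def var(xs):
            # Population variance; the sample variant changes alpha only
            # by a constant factor that cancels in the ratio.
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        totals = [sum(col[i] for col in items) for i in range(n)]
        return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
    ```

    Perfectly correlated items yield alpha = 1, while uncorrelated items drive alpha toward 0.
    
    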

  12. Real-time computed tomography-based augmented reality for natural orifice transluminal endoscopic surgery navigation

    PubMed Central

    Azagury, D. E.; Ryou, M.; Shaikh, S. N.; Estépar, R. San José; Lengyel, B. I.; Jagadeesan, J.; Vosburgh, K. G.; Thompson, C. C.

    2013-01-01

    Background Natural orifice transluminal endoscopic surgery (NOTES) is technically challenging owing to endoscopic short-sighted visualization, excessive scope flexibility and lack of adequate instrumentation. Augmented reality may overcome these difficulties. This study tested whether an image registration system for NOTES procedures (IR-NOTES) can facilitate navigation. Methods In three human cadavers 15 intra-abdominal organs were targeted endoscopically with and without IR-NOTES via both transgastric and transcolonic routes, by three endoscopists with different levels of expertise. Ease of navigation was evaluated objectively by kinematic analysis, and navigation complexity was determined by creating an organ access complexity score based on the same data. Results Without IR-NOTES, 21 (11·7 per cent) of 180 targets were not reached (expert endoscopist 3, advanced 7, intermediate 11), compared with one (1 per cent) of 90 with IR-NOTES (intermediate endoscopist) (P = 0·002). Endoscope movements were significantly less complex in eight of the 15 listed organs when using IR-NOTES. The most complex areas to access were the pelvis and left upper quadrant, independently of the access route. The most difficult organs to access were the spleen (5 failed attempts; 3 of 7 kinematic variables significantly improved) and rectum (4 failed attempts; 5 of 7 kinematic variables significantly improved). The time needed to access the rectum through a transgastric approach was 206·3 s without and 54·9 s with IR-NOTES (P = 0·027). Conclusion The IR-NOTES system enhanced both navigation efficacy and ease of intra-abdominal NOTES exploration for operators of all levels. The system rendered some organs accessible to non-expert operators, thereby reducing one impediment to NOTES procedures. PMID:22864885

  13. Enhanced Lighting Techniques and Augmented Reality to Improve Human Task Performance

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles K.; Pace, John W.

    2005-01-01

    One of the most versatile tools designed for use on the International Space Station (ISS) is the Special Purpose Dexterous Manipulator (SPDM) robot. Operators for this system are trained at NASA Johnson Space Center (JSC) using a robotic simulator, the Dexterous Manipulator Trainer (DMT), which performs most SPDM functions under normal static Earth gravitational forces. The SPDM is controlled from a standard Robotic Workstation. A key feature of the SPDM and DMT is the Force/Moment Accommodation (FMA) system, which limits the contact forces and moments acting on the robot components, on its payload, an Orbital Replaceable Unit (ORU), and on the receptacle for the ORU. The FMA system helps to automatically alleviate any binding of the ORU as it is inserted or withdrawn from a receptacle, but it is limited in its correction capability. A successful ORU insertion generally requires that the reference axes of the ORU and receptacle be aligned to within approximately 0.25 inch and 0.5 degree of nominal values. The only guides available for the operator to achieve these alignment tolerances are views from any available video cameras. No special registration markings are provided on the ORU or receptacle, so the operator must use their intrinsic features in the video display to perform the pre-insertion alignment task. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit viewing periods, long times are anticipated for performing some ORU insertion or extraction operations. This study explored the feasibility of using augmented reality (AR) to assist with SPDM operations. Geometric graphical symbols were overlaid on the end effector (EE) camera view to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment.

  14. Augmented reality cues to assist older drivers with gap estimation for left-turns.

    PubMed

    Rusch, Michelle L; Schall, Mark C; Lee, John D; Dawson, Jeffrey D; Rizzo, Matthew

    2014-10-01

    The objective of this study was to assess the effects of augmented reality (AR) cues designed to assist middle-aged and older drivers with a range of UFOV impairments, judging when to make left-turns across oncoming traffic. Previous studies have shown that AR cues can help middle-aged and older drivers respond to potential roadside hazards by increasing hazard detection without interfering with other driving tasks. Intersections pose a critical challenge for cognitively impaired drivers, prone to misjudge time-to-contact with oncoming traffic. We investigated whether AR cues improve or interfere with hazard perception in left-turns across oncoming traffic for drivers with age-related cognitive decline. Sixty-four middle-aged and older drivers with a range of UFOV impairment judged when it would be safe to turn left across oncoming traffic approaching the driver from the opposite direction in a rural stop-sign controlled intersection scenario implemented in a static base driving simulator. Outcome measures used to evaluate the effectiveness of AR cueing included: Time-to-Contact (TTC), Gap Time Variation (GTV), Response Rate, and Gap Response Variation (GRV). All drivers estimated TTCs were shorter in cued than in uncued conditions. In addition, drivers responded more often in cued conditions than in uncued conditions and GRV decreased for all drivers in scenarios that contained AR cues. For both TTC and response rate, drivers also appeared to adjust their behavior to be consistent with the cues, especially drivers with the poorest UFOV scores (matching their behavior to be close to middle-aged drivers). Driver ratings indicated that cueing was not considered to be distracting. Further, various conditions of reliability (e.g., 15% miss rate) did not appear to affect performance or driver ratings. PMID:24950128
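
    Time-to-contact for an oncoming vehicle is conventionally estimated as gap distance divided by closing speed, assuming constant speed; the TTC outcome measure above reflects drivers' judgments of this quantity. A trivial sketch of the kinematic definition (hypothetical helper, not the study's simulator code):

    ```python
    def time_to_contact(gap_m, closing_speed_mps):
        """Seconds until an oncoming vehicle covers the gap, assuming it
        holds a constant closing speed; infinite if it is not closing."""
        if closing_speed_mps <= 0:
            return float("inf")
        return gap_m / closing_speed_mps
    ```

    For example, a 100 m gap closing at 20 m/s leaves 5 s to complete the turn.
    
    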

  15. AUGMENTED REALITY CUES TO ASSIST OLDER DRIVERS WITH GAP ESTIMATION FOR LEFT-TURNS

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Lee, John D.; Dawson, Jeffrey D.; Rizzo, Matthew

    2014-01-01

    The objective of this study was to assess the effects of augmented reality (AR) cues designed to assist middle-aged and older drivers with a range of UFOV impairments, judging when to make left-turns across oncoming traffic. Previous studies have shown that AR cues can help middle-aged and older drivers respond to potential roadside hazards by increasing hazard detection without interfering with other driving tasks. Intersections pose a critical challenge for cognitively impaired drivers, prone to misjudge time-to-contact with oncoming traffic. We investigated whether AR cues improve or interfere with hazard perception in left-turns across oncoming traffic for drivers with age-related cognitive decline. Sixty-four middle-aged and older drivers with a range of UFOV impairment judged when it would be safe to turn left across oncoming traffic approaching the driver from the opposite direction in a rural stop-sign controlled intersection scenario implemented in a static base driving simulator. Outcome measures used to evaluate the effectiveness of AR cueing included: Time-to-Contact (TTC), Gap Time Variation (GTV), Response Rate, and Gap Response Variation (GRV). All drivers estimated TTCs were shorter in cued than in uncued conditions. In addition, drivers responded more often in cued conditions than in uncued conditions and GRV decreased for all drivers in scenarios that contained AR cues. For both TTC and response rate, drivers also appeared to adjust their behavior to be consistent with the cues, especially drivers with the poorest UFOV scores (matching their behavior to be close to middle-aged drivers). Driver ratings indicated that cueing was not considered to be distracting. Further, various conditions of reliability (e.g., 15% miss rate) did not appear to affect performance or driver ratings. PMID:24950128

  16. Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.

    PubMed

    Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K

    2013-05-01

    Augmented reality (AR) is a technology which enables users to see the real world, with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were scanned using a 64-channel multidetector. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group), and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire administered after using the simulator. The group that was trained using the AR simulator was more proficient at IV injection technique using real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education. PMID:23103217

  17. Robust detection and tracking of annotations for outdoor augmented reality browsing

    PubMed Central

    Langlotz, Tobias; Degendorfer, Claus; Mulloni, Alessandro; Schall, Gerhard; Reitmayr, Gerhard; Schmalstieg, Dieter

    2011-01-01

    A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points, which is suitable for current generation mobile phones and can also successfully deal with the wide variety of viewing conditions encountered in real life outdoor use. The approach is based on on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking, while the user is performing only rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For the re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points, while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created, to the current view represented by a new panorama. Once the anchor points are redetected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for our approach using an off-the-shelf smartphone. 
Results show that the re-detection rate is improved by a factor of 2 compared to previous work and reaches almost 90% for a wide variety of test cases while still keeping the ability

  18. Registration of partially overlapping surfaces for range image based augmented reality on mobile devices

    NASA Astrophysics Data System (ADS)

    Kilgus, T.; Franz, A. M.; Seitel, A.; Marz, K.; Bartha, L.; Fangerau, M.; Mersmann, S.; Groch, A.; Meinzer, H.-P.; Maier-Hein, L.

    2012-02-01

    Visualization of anatomical data for disease diagnosis, surgical planning, or orientation during interventional therapy is an integral part of modern health care. However, as anatomical information is typically shown on monitors provided by a radiological work station, the physician has to mentally transfer internal structures shown on the screen to the patient. To address this issue, we recently presented a new approach to on-patient visualization of 3D medical images, which combines the concept of augmented reality (AR) with an intuitive interaction scheme. Our method requires mounting a range imaging device, such as a Time-of-Flight (ToF) camera, to a portable display (e.g. a tablet PC). During the visualization process, the pose of the camera and thus the viewing direction of the user is continuously determined with a surface matching algorithm. By moving the device along the body of the patient, the physician is given the impression of looking directly into the human body. In this paper, we present and evaluate a new method for camera pose estimation based on an anisotropic trimmed variant of the well-known iterative closest point (ICP) algorithm. According to in-silico and in-vivo experiments performed with computed tomography (CT) and ToF data of human faces, knees and abdomens, our new method is better suited for surface registration with ToF data than the established trimmed variant of the ICP, reducing the target registration error (TRE) by more than 60%. The TRE obtained (approx. 4-5 mm) is promising for AR visualization, but clinical applications require maximization of robustness and run-time.
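
    The trimmed ICP variant this abstract builds on can be sketched compactly: each iteration discards the worst-matching fraction of closest-point pairs before solving for the rigid transform, which is what makes the fit tolerant of partial overlap between the ToF surface and the CT surface. A minimal isotropic sketch (the paper's method adds anisotropic weighting; all names are illustrative):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst (Kabsch)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def trimmed_icp(src, dst, trim=0.8, iters=50):
    """Trimmed ICP: each iteration keeps only the best `trim` fraction of
    closest-point pairs, making the fit robust to partial overlap."""
    cur = src.copy()
    R_tot, t_tot = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # brute-force nearest neighbours (fine for small clouds)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = d2.argmin(1)
        dist = d2[np.arange(len(cur)), nn]
        keep = dist.argsort()[: int(trim * len(cur))]   # the trimming step
        R, t = rigid_transform(cur[keep], dst[nn[keep]])
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

In practice a k-d tree replaces the brute-force nearest-neighbour search, and the trim fraction is chosen to match the expected overlap between the two surfaces.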

  19. Theoretical Comparison of Several Methods of Thrust Augmentation for Turbojet Engines

    NASA Technical Reports Server (NTRS)

    Hall, Eldon W; Wilcox, E Clinton

    1950-01-01

    A theoretical investigation of tail-pipe burning, water injection at the compressor inlet, combined tail-pipe burning plus water injection, bleedoff, and rocket-assist methods of thrust augmentation for turbojet engines was made for an engine representative of those in current use. The effect of augmented liquid ratio on augmented thrust ratio and the effects of altitude and flight Mach number on the performance of the various methods were determined. The additional take-off weight entailed by the different thrust-augmentation methods, as well as their effect on the range of a representative aircraft, was also investigated.

  20. Virtual Reality Used to Serve the Glenn Engineering Community

    NASA Technical Reports Server (NTRS)

    Carney, Dorothy V.

    2001-01-01

    There are a variety of innovative new visualization tools available to scientists and engineers for the display and analysis of their models. At the NASA Glenn Research Center, we have an ImmersaDesk, a large, single-panel, semi-immersive display device. This versatile unit can interactively display three-dimensional images in visual stereo. Our challenge is to make this virtual reality platform accessible and useful to researchers. An example of a successful application of this computer technology is the display of blade out simulations. NASA Glenn structural dynamicists, Dr. Kelly Carney and Dr. Charles Lawrence, funded by the Ultra Safe Propulsion Project under Base R&T, are researching blade outs, when turbine engines lose a fan blade during operation. Key objectives of this research include minimizing danger to the aircraft via effective blade containment, predicting destructive loads due to the imbalance following a blade loss, and identifying safe, cost-effective designs and materials for future engines.

  1. Applications of Panoramic Images: from 720° Panorama to Interior 3d Models of Augmented Reality

    NASA Astrophysics Data System (ADS)

    Lee, I.-C.; Tsai, F.

    2015-05-01

    A series of panoramic images are usually used to generate a 720° panorama image. Although panoramic images are typically used for establishing tour guiding systems, in this research, we demonstrate the potential of using panoramic images acquired from multiple sites to create not only 720° panoramas, but also three-dimensional (3D) point clouds and 3D indoor models. Since 3D modeling is one of the goals of this research, the locations of the panoramic sites needed to be carefully planned in order to maintain a robust result for close-range photogrammetry. After the images are acquired, they are processed into 720° panoramas, and these panoramas can be used directly in panorama guiding systems or other applications. In addition to these straightforward applications, interior orientation parameters can also be estimated while generating the 720° panorama. These parameters are the focal length, principal point, and lens radial distortion. The panoramic images can then be processed with close-range photogrammetry procedures to extract the exterior orientation parameters and generate 3D point clouds. In this research, VisualSFM, a structure-from-motion software package, is used to estimate the exterior orientation, and the CMVS toolkit is used to generate 3D point clouds. Next, the 3D point clouds are used as references to create building interior models. In this research, Trimble SketchUp was used to build the model, and the 3D point cloud was used to determine the locations of building objects with a plane-finding procedure. In the texturing process, the panorama images are used as the data source for creating model textures. This 3D indoor model was used as an Augmented Reality model replacing a guide map or a floor plan commonly used in an on-line touring guide system. The 3D indoor model generating procedure has been utilized in two research projects: a cultural heritage site at Kinmen, and Taipei Main Station pedestrian zone guidance and navigation system. The
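
    The interior orientation parameters mentioned (focal length, principal point, lens radial distortion) enter the standard pinhole projection model. A minimal sketch using a common two-coefficient radial distortion term (an assumption for illustration; the authors' exact distortion model is not given in the abstract):

```python
import numpy as np

def project(point_cam, f, cx, cy, k1, k2):
    """Project a 3D point (camera frame) to pixel coordinates using a
    pinhole model with two-term radial lens distortion."""
    x = point_cam[0] / point_cam[2]
    y = point_cam[1] / point_cam[2]
    r2 = x * x + y * y
    d = 1 + k1 * r2 + k2 * r2 * r2        # radial distortion factor
    return np.array([f * d * x + cx, f * d * y + cy])

# A point on the optical axis lands on the principal point (cx, cy).
print(project(np.array([0.0, 0.0, 2.0]), f=1000.0, cx=640.0, cy=360.0,
              k1=-0.1, k2=0.01))
```

Estimating (f, cx, cy, k1, k2) alongside the camera poses is exactly the self-calibration step performed while stitching the 720° panorama.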

  2. Testing Augmented Reality for Cue Exposure in Obese Patients: An Exploratory Study.

    PubMed

    Pallavicini, Federica; Serino, Silvia; Cipresso, Pietro; Pedroli, Elisa; Chicchi Giglioli, Irene Alice; Chirico, Alice; Manzoni, Gian Mauro; Castelnuovo, Gianluca; Molinari, Enrico; Riva, Giuseppe

    2016-02-01

    Binge eating is one of the key behaviors in relation to the etiology and severity of obesity. Cue exposure with response prevention consists of exposing patients to binge foods while actual eating is not allowed. Augmented reality (AR) has the potential to change the way cue exposure is administered, but very few prior studies have been conducted so far. Starting from these premises, this study aimed to (a) investigate whether AR foods elicit emotional responses comparable to those produced by the real stimuli, (b) study differences between obese and control participants in terms of emotional responses to food, and (c) compare emotional responses to different categories of foods. To reach these goals, we assessed in 15 obese (age, 44.6 ± 13 years; body mass index [BMI], 44.2 ± 8.1) and 15 control participants (age, 43.7 ± 12.8 years; BMI, 21.2 ± 1.4) the emotional responses to high-calorie (savory and sweet) and low-calorie food stimuli, presented through different exposure conditions (real, photographic, and AR). The State-Trait Anxiety Inventory was used for the assessment of state anxiety, and it was administered at the beginning and after the exposure to foods, along with the Visual Analog Scale (VAS) for Hunger and Happiness. To assess the perceived pleasantness, the VAS for Palatability was administered after the exposure to food stimuli. Heart rate, skin conductance response, and facial corrugator supercilii muscle activation were recorded. Although preliminary, the results showed that (a) AR food stimuli were perceived to be as palatable as real stimuli, and they also triggered a similar arousal response; (b) obese individuals showed lower happiness after the exposure to food compared to control participants, with regard to both psychological and physiological responses; and (c) high-calorie savory (vs. low-calorie) food stimuli were perceived by all the participants to be more palatable, and they triggered a greater arousal response

  3. Three-Dimensional Path Planning and Guidance of Leg Vascular Based on Improved Ant Colony Algorithm in Augmented Reality.

    PubMed

    Gao, Ming-ke; Chen, Yi-min; Liu, Quan; Huang, Chen; Li, Ze-yu; Zhang, Dian-hua

    2015-11-01

    Preoperative path planning plays a critical role in vascular access surgery. Vascular access surgery is particularly difficult and requires long training periods as well as precise operation. Because doctors vary in skill level, blood vessels of large diameter are usually chosen for surgery, and other potentially optimal paths are not considered. Moreover, patients and surgeons suffer from X-ray radiation during the surgical procedure. This study proposed an improved ant colony algorithm to plan an optimal three-dimensional vascular path with overall consideration of factors such as catheter diameter, vascular length and diameter, and the curvature and torsion of the vessel. To protect the doctor and patient from long-term X-ray exposure, the study adopted augmented reality technology to register the reconstructed vascular model with the physical model, located the catheter with an electromagnetic tracking system, and used a head-mounted display to show the planned path in real time and monitor the catheter push procedure. The experiment demonstrates the reasonableness of the preoperative path planning and proves the reliability of the algorithm. The augmented reality experiment accurately displays the vascular phantom model, planned path, and catheter trajectory in real time, proving the feasibility of this method. The paper presented a useful and feasible surgical scheme based on the improved ant colony algorithm to plan a three-dimensional vascular path in augmented reality. The study possesses practical guiding significance for preoperative path planning, intraoperative catheter guidance, and surgical training, providing a theoretical method of path planning for vascular access surgery. It is a safe and reliable path planning approach with practical reference value. PMID:26319273
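
    The ant colony mechanism this abstract improves on can be sketched generically: ants sample paths with probability weighted by pheromone and inverse edge cost, pheromone evaporates, and good paths are reinforced. A minimal sketch on an abstract weighted graph (the vessel-specific cost combining catheter diameter, length, curvature, and torsion is collapsed into a single edge weight here; all names are illustrative):

```python
import random

def aco_shortest_path(graph, start, goal, ants=20, iters=30,
                      rho=0.5, q=1.0, alpha=1.0, beta=2.0):
    """Basic ant colony optimisation on a weighted directed graph.
    graph: {node: {neighbour: cost}}; the cost could combine vessel
    length, diameter, curvature, and torsion as in the paper."""
    phero = {u: {v: 1.0 for v in nbrs} for u, nbrs in graph.items()}
    best, best_cost = None, float("inf")
    for _ in range(iters):
        paths = []
        for _ in range(ants):
            node, path, visited = start, [start], {start}
            while node != goal:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:          # dead end: discard this ant
                    path = None
                    break
                w = [phero[node][v] ** alpha * (1 / graph[node][v]) ** beta
                     for v in choices]
                node = random.choices(choices, weights=w)[0]
                path.append(node)
                visited.add(node)
            if path:
                cost = sum(graph[a][b] for a, b in zip(path, path[1:]))
                paths.append((path, cost))
                if cost < best_cost:
                    best, best_cost = path, cost
        # evaporation, then deposit proportional to path quality
        for u in phero:
            for v in phero[u]:
                phero[u][v] *= (1 - rho)
        for path, cost in paths:
            for a, b in zip(path, path[1:]):
                phero[a][b] += q / cost
    return best, best_cost
```

On a toy graph with edges A→B:1, A→C:4, B→C:1, B→D:5, C→D:1, the sampled best path converges to A-B-C-D with cost 3.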

  4. Interactive augmented reality using Scratch 2.0 to improve physical activities for children with developmental disabilities.

    PubMed

    Lin, Chien-Yu; Chang, Yu-Ming

    2015-02-01

    This study uses a body motion interactive game developed in Scratch 2.0 to enhance the body strength of children with disabilities. Scratch 2.0, using an augmented-reality function on a program platform, creates real-world and virtual-reality displays at the same time. This study uses webcam integration that tracks movements and allows participants to interact physically with the project, to enhance the motivation of children with developmental disabilities to perform physical activities. This study follows a single-case research design with an ABAB structure, in which A is the baseline and B is the intervention. The experimental period was 2 months. The experimental results demonstrated that the scores of 3 children with developmental disabilities increased considerably during the intervention phases. The developmental applications of these results are also discussed. PMID:25460214

  5. Combining physical and virtual contexts through augmented reality: design and evaluation of a prototype using a drug box as a marker for antibiotic training

    PubMed Central

    Tomson, Tanja; Zary, Nabil

    2014-01-01

    Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that, through the use of Augmented Reality technology, could make use of the real physical context and thereby enrich the educational process of antibiotics prescription. The objective is to investigate which type of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research Methodology, composed of the following main steps: problem analysis, investigation of the information that should be visualized for the training session, and finally the involvement of the end users in the development and evaluation of the prototype. Results. Two of the most important aspects of the antibiotic prescription process to represent in an augmented reality application are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context, such as drug boxes, and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on the decrease of antibiotic resistance. PMID:25548733

  6. Combining physical and virtual contexts through augmented reality: design and evaluation of a prototype using a drug box as a marker for antibiotic training.

    PubMed

    Nifakos, Sokratis; Tomson, Tanja; Zary, Nabil

    2014-01-01

    Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that, through the use of Augmented Reality technology, could make use of the real physical context and thereby enrich the educational process of antibiotics prescription. The objective is to investigate which type of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research Methodology, composed of the following main steps: problem analysis, investigation of the information that should be visualized for the training session, and finally the involvement of the end users in the development and evaluation of the prototype. Results. Two of the most important aspects of the antibiotic prescription process to represent in an augmented reality application are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context, such as drug boxes, and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on the decrease of antibiotic resistance. PMID:25548733

  7. Development and human factors analysis of an augmented reality interface for multi-robot tele-operation and control

    NASA Astrophysics Data System (ADS)

    Lee, Sam; Lucas, Nathan P.; Ellis, R. Darin; Pandya, Abhilash

    2012-06-01

    This paper presents a seamlessly controlled human multi-robot system comprised of ground and aerial robots of semiautonomous nature for source localization tasks. The system combines augmented reality interface capabilities with a human supervisor's ability to control multiple robots. The role of this human multi-robot interface is to allow an operator to control groups of heterogeneous robots in real time in a collaborative manner. It uses advanced path planning algorithms to ensure that obstacles are avoided and that the operators are free for higher-level tasks. Each robot knows the environment and obstacles and can automatically generate a collision-free path to any user-selected target. Sensor information from each individual robot is displayed directly on the robot in the video view. In addition, a sensor-data-fused AR view is displayed, which helps the users pinpoint source information or helps the operator with the goals of the mission. The paper presents a preliminary human factors evaluation of this system in which several interface conditions were tested for source detection tasks. Results show that the novel augmented reality multi-robot control modes (Point-and-Go and Path Planning) reduced mission completion times compared to traditional joystick control for target detection missions. Usability tests and operator workload analysis are also presented.

  8. Shifting the paradigm of music instruction: implications of embodiment stemming from an augmented reality guitar learning system.

    PubMed

    Keebler, Joseph R; Wiltshire, Travis J; Smith, Dustin C; Fiore, Stephen M; Bedwell, Jeffrey S

    2014-01-01

    Musical instruction often includes materials that can act as a barrier to learning. New technologies using augmented reality may aid in reducing the initial difficulties involved in learning music by lowering the barriers characteristic of traditional instructional materials. Therefore, this set of studies examined a novel augmented reality guitar learning system (i.e., the Fretlight® guitar) with regard to current theories of embodied music cognition. Specifically, we examined the effects of using this system in comparison to a standard instructional material (i.e., diagrams). First, we review major theories related to musical embodiment and specify a niche within this research space we call embodied music technology for learning. Following, we explicate two parallel experiments that were conducted to address the learning effects of this system. Experiment 1 examined short-term learning effects within one experimental session, while Experiment 2 examined both short-term and long-term effects across two sessions spaced at a 2-week interval. Analyses demonstrated that, for many of our dependent variables, all participants increased in performance across time. Further, the Fretlight® condition consistently led to significantly better outcomes via interactive effects, including significantly better long-term retention of the learned information across a 2-week time interval. These results are discussed in the context of embodied cognition theory as it relates to music. Potential limitations and avenues for future research are described. PMID:24999334

  9. Shifting the paradigm of music instruction: implications of embodiment stemming from an augmented reality guitar learning system

    PubMed Central

    Keebler, Joseph R.; Wiltshire, Travis J.; Smith, Dustin C.; Fiore, Stephen M.; Bedwell, Jeffrey S.

    2014-01-01

    Musical instruction often includes materials that can act as a barrier to learning. New technologies using augmented reality may aid in reducing the initial difficulties involved in learning music by lowering the barriers characteristic of traditional instructional materials. Therefore, this set of studies examined a novel augmented reality guitar learning system (i.e., the Fretlight® guitar) with regard to current theories of embodied music cognition. Specifically, we examined the effects of using this system in comparison to a standard instructional material (i.e., diagrams). First, we review major theories related to musical embodiment and specify a niche within this research space we call embodied music technology for learning. Following, we explicate two parallel experiments that were conducted to address the learning effects of this system. Experiment 1 examined short-term learning effects within one experimental session, while Experiment 2 examined both short-term and long-term effects across two sessions spaced at a 2-week interval. Analyses demonstrated that, for many of our dependent variables, all participants increased in performance across time. Further, the Fretlight® condition consistently led to significantly better outcomes via interactive effects, including significantly better long-term retention of the learned information across a 2-week time interval. These results are discussed in the context of embodied cognition theory as it relates to music. Potential limitations and avenues for future research are described. PMID:24999334

  10. A stochastic analysis of the calibration problem for Augmented Reality systems with see-through head-mounted displays

    NASA Astrophysics Data System (ADS)

    Leebmann, Johannes

    This paper presents a closed stochastic solution for the calibration of see-through head-mounted displays (STHMD) for Augmented Reality. An Augmented Reality system (ARS) is based on several components that are affected by stochastic and random errors. One important component is the tracking system. The Flock of Birds (FOB) tracking system was tested for consistency in position and orientation outputs by establishing constraints that the system was required to meet. The tests for position and orientation were separated to derive uncorrelated quality measures. The tests are self-controlling and do not require any other measuring device. In addition, the image coordinate accuracy also had to be determined to complete the stochastic description of the calibration problem. Based on this stochastic model, different mathematical models were tested to determine whether or not they fit the stochastic model. An overview of different calibration approaches for optical see-through displays is given and a quantitative comparison of the different models is made based on the derived accuracy information.
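
    Calibration schemes of this kind optimize a candidate pose (R, T) so that projected 3D reference points match their observed image points. The objective has roughly this shape (a sketch with an Euler-angle parameterization and a bare pinhole projection; names and the parameterization are illustrative, not the paper's exact formulation):

```python
import numpy as np

def rot(rx, ry, rz):
    """Rotation matrix from Euler angles (applied in x, y, z order)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def reprojection_error(params, pts3d, pts2d, f):
    """Sum of squared distances between observed image points and the
    projections of known 3D points under a candidate pose.
    params = (rx, ry, rz, tx, ty, tz)."""
    R, T = rot(*params[:3]), np.asarray(params[3:])
    cam = pts3d @ R.T + T                 # reference points in camera frame
    proj = f * cam[:, :2] / cam[:, 2:3]   # pinhole projection
    return float(((proj - pts2d) ** 2).sum())
```

A standard nonlinear least-squares optimizer would then minimize this objective over the six pose parameters; the paper's contribution is the stochastic model that weights such residuals.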

  11. Thrust Augmentation Measurements Using a Pulse Detonation Engine Ejector

    NASA Technical Reports Server (NTRS)

    Santoro, Robert J.; Pal, Sibtosh

    2003-01-01

    The present NASA GRC-funded three-year research project is focused on studying PDE driven ejectors applicable to a hybrid Pulse Detonation/Turbofan Engine. The objective of the study is to characterize the PDE-ejector thrust augmentation. A PDE-ejector system has been designed to provide critical experimental data for assessing the performance enhancements possible with this technology. Completed tasks include demonstration of a thrust stand for measuring average thrust for detonation tube multi-cycle operation, and design of a 72-in.-long, 2.25-in.-diameter (ID) detonation tube and modular ejector assembly. This assembly will allow testing of both straight and contoured ejector geometries. Initial ejectors that have been fabricated are 72-in.-long-constant-diameter tubes (4-, 5-, and 6-in.-diameter) instrumented with high-frequency pressure transducers. The assembly has been designed such that the detonation tube exit can be positioned at various locations within the ejector tube. PDE-ejector system experiments with gaseous ethylene/ nitrogen/oxygen propellants will commence in the very near future. The program benefits from collaborations with Prof. Merkle of University of Tennessee whose PDE-ejector analysis helps guide the experiments. The present research effort will increase the TRL of PDE-ejectors from its current level of 2 to a level of 3.

  12. Theoretical analysis of various thrust-augmentation cycles for turbojet engines

    NASA Technical Reports Server (NTRS)

    Lundin, Bruce T

    1950-01-01

    The results of analytical studies of tail-pipe-burning, water-injection, and bleedoff methods of thrust augmentation are presented; they provide insight into the operating characteristics of these augmentation methods and summarize the performance that may be obtained when they are applied to a typical turbojet engine. A brief description of the principles of operation of each augmentation method is given, together with curves that illustrate the effects of the principal design and operating variables of the augmentation system on the thrust and the liquid consumption of the engine. The necessity of designing tail-pipe burners with a low burner-inlet velocity, a low burner drag, and a high diffuser efficiency in order to obtain high thrust augmentation and to minimize the loss in engine performance during nonburning operation is illustrated.

  13. Augmented Reality in the Science Museum: Lessons Learned in Scaffolding for Conceptual and Cognitive Learning

    ERIC Educational Resources Information Center

    Yoon, Susan A.; Elinich, Karen; Wang, Joyce; Van Schooneveld, Jacqueline G.

    2012-01-01

    This research follows on previous studies that investigated how digitally augmented devices and knowledge scaffolds enhance learning in a science museum. We investigated what combination of scaffolds could be used in conjunction with the unique characteristics of informal participation to increase conceptual and cognitive outcomes. 307 students…

  14. Development of marker-based tracking methods for augmented reality applied to NPP maintenance work support and its experimental evaluation

    SciTech Connect

    Ishii, H.; Fujino, H.; Bian, Z.; Sekiyama, T.; Shimoda, H.; Yoshikawa, H.

    2006-07-01

    In this study, two types of marker-based tracking methods for Augmented Reality have been developed. One is a method which employs line-shaped markers and the other is a method which employs circular-shaped markers. These two methods recognize the markers by means of image processing and calculate the relative position and orientation between the markers and the camera in real time. The line-shaped markers are suitable for buildings such as NPPs, where many pipes and tanks exist. The circular-shaped markers are suitable for cases in which there are many obstacles and it is difficult to use line-shaped markers because the obstacles hide part of them. Both methods can extend the maximum distance between the markers and the camera compared to legacy marker-based tracking methods. (authors)

  15. 3D optical see-through head-mounted display based augmented reality system and its application

    NASA Astrophysics Data System (ADS)

    Zhang, Zhenliang; Weng, Dongdong; Liu, Yue; Xiang, Li

    2015-07-01

    The combination of health and entertainment becomes possible due to the development of wearable augmented reality equipment and corresponding application software. In this paper, we implemented a fast calibration method, extended from SPAAM (Single Point Active Alignment Method), for an optical see-through head-mounted display (OSTHMD) which was made in our lab. During the calibration, tracking and recognition techniques based on natural targets were used, and the spatial corresponding points were set in dispersed and well-distributed positions. We evaluated the precision of this calibration for view angles ranging from 0 to 70 degrees. Relying on these results, we calculated the position of the human eyes relative to the world coordinate system and rendered 3D objects of arbitrary complexity in real time on the OSTHMD, accurately matched to the real world. Finally, we assessed user satisfaction with our device for combining entertainment with the prevention of cervical vertebra disease through user feedback.

  16. A Low-Cost iPhone-Assisted Augmented Reality Solution for the Localization of Intracranial Lesions

    PubMed Central

    Zhu, RuYuan; Chen, XiaoLei; Zhang, Jun

    2016-01-01

    Background Precise location of intracranial lesions before surgery is important, but occasionally difficult. Modern navigation systems are very helpful, but expensive. A low-cost solution that could locate brain lesions and their surface projections in augmented reality would be beneficial. We used an iPhone to partially achieve this goal, and evaluated its accuracy and feasibility in a clinical neurosurgery setting. Methodology/Principal Findings We located brain lesions in 35 patients, and using an iPhone, we depicted the lesion’s surface projection onto the skin of the head. To assess the accuracy of this method, we pasted computed tomography (CT) markers surrounding the depicted lesion boundaries on the skin onto 15 patients. CT scans were then performed with or without contrast enhancement. The deviations (D) between the CT markers and the actual lesion boundaries were measured. We found that 97.7% of the markers displayed a high accuracy level (D ≤ 5mm). In the remaining 20 patients, we compared our iPhone-based method with a frameless neuronavigation system. Four check points were chosen on the skin surrounding the depicted lesion boundaries, to assess the deviations between the two methods. The integrated offset was calculated according to the deviations at the four check points. We found that for the supratentorial lesions, the medial offset between these two methods was 2.90 mm and the maximum offset was 4.2 mm. Conclusions/Significance This low-cost, image-based, iPhone-assisted, augmented reality solution is technically feasible, and helpful for the localization of some intracranial lesions, especially shallow supratentorial intracranial lesions of moderate size. PMID:27454518

  17. SU-E-J-134: An Augmented-Reality Optical Imaging System for Accurate Breast Positioning During Radiotherapy

    SciTech Connect

    Nazareth, D; Malhotra, H; French, S; Hoffmann, K; Merrow, C

    2014-06-01

    Purpose: Breast radiotherapy, particularly electronic compensation, may involve large dose gradients and difficult patient positioning problems. We have developed a simple self-calibrating augmented-reality system, which assists in accurately and reproducibly positioning the patient, by displaying her live image from a single camera superimposed on the correct perspective projection of her 3D CT data. Our method requires only a standard digital camera capable of live-view mode, installed in the treatment suite at an approximately-known orientation and position (rotation R; translation T). Methods: A 10-sphere calibration jig was constructed and CT imaged to provide a 3D model. The (R,T) relating the camera to the CT coordinate system were determined by acquiring a photograph of the jig and optimizing an objective function, which compares the true image points to points calculated with a given candidate R and T geometry. Using this geometric information, 3D CT patient data, viewed from the camera's perspective, is plotted using a Matlab routine. This image data is superimposed onto the real-time patient image, acquired by the camera, and displayed using standard live-view software. This enables the therapists to view both the patient's current and desired positions, and guide the patient into assuming the correct position. The method was evaluated using an in-house developed bolus-like breast phantom, mounted on a supporting platform, which could be tilted at various angles to simulate treatment-like geometries. Results: Our system allowed breast phantom alignment, with an accuracy of about 0.5 cm and 1 ± 0.5 degree. Better resolution could be possible using a camera with higher-zoom capabilities. Conclusion: We have developed an augmented-reality system, which combines a perspective projection of a CT image with a patient's real-time optical image. This system has the potential to improve patient setup accuracy during breast radiotherapy, and could possibly be
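
The self-calibration step described above, recovering the camera's (R, T) by optimizing an objective function that compares photographed jig points with their computed projections, is a standard reprojection-error minimization. A minimal sketch of that idea in Python/NumPy (the pinhole parameters, function names, and Gauss-Newton solver are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def rodrigues(rvec):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def project(pts, params, f=1000.0, c=(320.0, 240.0)):
    """Pinhole projection of CT-space points for a candidate (R, T)."""
    R, t = rodrigues(params[:3]), params[3:]
    cam = pts @ R.T + t                       # CT frame -> camera frame
    return f * cam[:, :2] / cam[:, 2:3] + np.asarray(c)

def calibrate(pts3d, pts2d, x0=None, iters=50):
    """Gauss-Newton fit of the 6 pose parameters minimising reprojection error."""
    x = np.zeros(6) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        r = (project(pts3d, x) - pts2d).ravel()
        J = np.empty((r.size, 6))
        for j in range(6):                    # finite-difference Jacobian
            dx = np.zeros(6); dx[j] = 1e-6
            J[:, j] = ((project(pts3d, x + dx) - pts2d).ravel() - r) / 1e-6
        step = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

Given the fitted pose, the CT data can be rendered from the camera's perspective and overlaid on the live view, as the abstract describes.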

  18. Projector-Based Augmented Reality for Intuitive Intraoperative Guidance in Image-Guided 3D Interstitial Brachytherapy

    SciTech Connect

    Krempien, Robert; Hoppe, Harald; Kahrs, Lueder; Daeuber, Sascha; Schorr, Oliver; Eggers, Georg; Bischof, Marc; Munter, Marc W.; Debus, Juergen; Harms, Wolfgang

    2008-03-01

    Purpose: The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow an intuitive real-time intraoperative orientation. Methods and Materials: The developed system consists of a common video projector, two high-resolution charge coupled device cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by means of stereoscopically tracking passive markers attached to the patient. Results: In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm by moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. Mean deviation of all needles of an implant was 1.4 mm (range, 0.3-2.7 mm). Conclusions: The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation.
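
Coded-light registration of this kind commonly projects binary-reflected Gray-code stripe patterns, so that the projector column illuminating each camera pixel can be decoded from a short image sequence, with adjacent columns differing in only one bit. A hedged sketch of that encoding (illustrative only; the paper does not specify its exact pattern scheme):

```python
def gray_encode(n: int) -> int:
    """Binary-reflected Gray code of n (adjacent codes differ in one bit)."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Invert the Gray code by cascading XORs."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def column_patterns(width: int, bits: int):
    """One stripe image per bit: pattern[b][x] is the b-th bit (most
    significant first) of the Gray code of projector column x."""
    return [[(gray_encode(x) >> (bits - 1 - b)) & 1 for x in range(width)]
            for b in range(bits)]
```

A camera observing the projected sequence reassembles the per-pixel bits and decodes them back to a column index, yielding dense projector-camera correspondences for registration.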

  19. An Augmented Reality-Based Mobile Learning System to Improve Students' Learning Achievements and Motivations in Natural Science Inquiry Activities

    ERIC Educational Resources Information Center

    Chiang, Tosti H. C.; Yang, Stephen J. H.; Hwang, Gwo-Jen

    2014-01-01

    In this study, an augmented reality-based mobile learning system is proposed for conducting inquiry-based learning activities. An experiment has been conducted to examine the effectiveness of the proposed approach in terms of learning achievements and motivations. The subjects were 57 fourth graders from two classes taught by the same teacher in…

  20. The Interaction of Child-Parent Shared Reading with an Augmented Reality (AR) Picture Book and Parents' Conceptions of AR Learning

    ERIC Educational Resources Information Center

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2016-01-01

    Following a previous study (Cheng & Tsai, 2014. "Computers & Education"), this study aimed to probe the interaction of child-parent shared reading with the augmented reality (AR) picture book in more depth. A series of sequential analyses were thus conducted to infer the behavioral transition diagrams and visualize the continuity…

  1. Alien Contact. Examining the Influence of Teacher Mathematics Knowledge for Teaching on Their Implementation of a Mathematical, Augmented Reality Curricular Unit

    ERIC Educational Resources Information Center

    Mitchell, Rebecca Noelle

    2009-01-01

    This paper reports on findings from a five-teacher, exploratory case study, critically observing their implementation of a technology-intensive, augmented reality (AR) mathematics curriculum unit, along with its paper-based control. The unit itself was intended to promote multiple proportional reasoning strategies to urban, public middle school…

  2. Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display.

    PubMed

    Chen, Xiaojun; Xu, Lu; Wang, Yiping; Wang, Huixiang; Wang, Fang; Zeng, Xiangsen; Wang, Qiugen; Egger, Jan

    2015-06-01

    The surgical navigation system has experienced tremendous development over the past decades for minimizing the risks and improving the precision of surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In an AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) is developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of surgery. With the use of this system, comprising the calibration of instruments, registration, and the calibration of the HMD, the 3D virtual critical anatomical structures in the head-mounted display are aligned with the actual structures of the patient in the real-world scenario during the intra-operative motion-tracking process. The accuracy verification experiment demonstrated that the mean distance and angular errors were respectively 0.809 ± 0.05 mm and 1.038° ± 0.05°, which is sufficient to meet clinical requirements. PMID:25882923
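
The registration step in such a navigation system, aligning the virtual anatomy with tracked fiducials on the patient, is commonly posed as a least-squares rigid transform and solved in closed form (Kabsch/Umeyama). A minimal sketch under that assumption (not the paper's actual algorithm):

```python
import numpy as np

def register_rigid(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t.

    Kabsch/Umeyama without scaling: centre both point sets, take the SVD
    of the cross-covariance, and correct the sign to avoid a reflection.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # proper rotation only
    t = mu_d - R @ mu_s
    return R, t
```

The residual distances after applying (R, t) to the fiducials correspond to the kind of mean distance error the abstract reports.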

  3. Hybrid Reality Lab Capabilities - Video 2

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2016-01-01

    Our Hybrid Reality and Advanced Operations Lab is developing incredibly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), then the information used to create the virtual scenes can be old (i.e. visualized long after physical objects have moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g. camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, by creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, the individuals begin to interact with all the objects in the scene as if they were real-life objects.
The ability to physically touch and interact with digitally created

  4. 3D Survey and Augmented Reality for Cultural Heritage. The Case Study of Aurelian Wall at Castra Praetoria in Rome

    NASA Astrophysics Data System (ADS)

    Canciani, M.; Conigliaro, E.; Del Grasso, M.; Papalini, P.; Saccone, M.

    2016-06-01

    The development of close-range photogrammetry has opened up many new possibilities for studying cultural heritage. 3D data acquired with conventional and low-cost cameras can be used to document and investigate the full appearance, materials and conservation status of an object, to support the restoration process, and to identify intervention priorities. At the same time, although 3D survey yields a large amount of three-dimensional data for researchers to collect and analyse, there are very few options for 3D output. Augmented reality is one such output: a very low-cost technology with very interesting results. Using simple mobile technology (for iPad and Android tablets) and shareware software ("Augment", in the case presented), it is possible to share and visualize a large number of 3D models on one's own device. The case study presented is part of an architecture graduate thesis carried out in Rome at the Department of Architecture of Roma Tre University. We developed a photogrammetric survey to study the Aurelian Wall at Castra Praetoria in Rome. The survey of 8000 square meters of surface allowed us to identify the stratigraphy and construction phases of a complex portion of the Aurelian Wall, especially around the Northern door of the Castra. During this study, the data coming out of the 3D survey (photogrammetric and topographic) were stored and used to create a reverse 3D model, or virtual reconstruction, of the Northern door of the Castra. This virtual reconstruction shows the door in the Tiberian period; nowadays it is totally hidden by a curtain wall, but small and significant architectural details reveal its original features. The 3D model of the ancient walls was mapped with the exact type of bricks and mortar, oriented and scaled according to the existing wall, for use in augmented reality. Finally, two kinds of application have been developed; with the on-site one, you can see the virtual reconstruction superimposed on the existing walls using image recognition. 
On the other hand

  5. Virtual-reality-based educational laboratories in fiber optic engineering

    NASA Astrophysics Data System (ADS)

    Hayes, Dana; Turczynski, Craig; Rice, Jonny; Kozhevnikov, Michael

    2014-07-01

    Researchers and educators have observed great potential in virtual reality (VR) technology as an educational tool due to its ability to engage and spark interest in students, thus providing them with a deeper form of knowledge about a subject. The focus of this project is to develop an interactive VR educational module, Laser Diode Characteristics and Coupling to Fibers, to integrate into a fiber optics laboratory course. The developed module features a virtual laboratory populated with realistic models of optical devices in which students can set up and perform an optical experiment dealing with laser diode characteristics and fiber coupling. The module contains three increasingly complex levels for students to navigate through, with a short built-in quiz after each level to measure the student's understanding of the subject. Seventeen undergraduate students learned fiber coupling concepts using the designed computer simulation in a non-immersive desktop virtual environment (VE) condition. The analysis of students' responses on the updated pre- and post-tests shows a statistically significant improvement in the scores for the post-test as compared to the pre-test. In addition, the students' survey responses suggest that they found the module very useful and engaging. The conducted study clearly demonstrated the feasibility of the proposed instructional technology for engineering education, where both the model of instruction and the enabling technology are equally important, in providing a better learning environment to improve students' conceptual understanding as compared to other instructional approaches.
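
Pre-/post-test gains like those reported here are typically checked with a paired t-test on per-student score differences. A small illustrative sketch (the helper name and any numbers are made up, not the study's data):

```python
import numpy as np

def paired_t(pre, post):
    """Paired t statistic for matched pre/post scores.

    t = mean(d) / (sd(d) / sqrt(n)), where d are the per-student score
    differences (post - pre); a positive t favours the post-test.
    """
    d = np.asarray(post, float) - np.asarray(pre, float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
```

The statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the significance level the abstract refers to.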

  6. Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds

    PubMed Central

    Wright, W. Geoffrey

    2014-01-01

    Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually-depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not, remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed. PMID:24782724

  7. Augmented paper maps: Exploring the design space of a mixed reality system

    NASA Astrophysics Data System (ADS)

    Paelke, Volker; Sester, Monika

    Paper maps and mobile electronic devices have complementary strengths and shortcomings in outdoor use. In many scenarios, like small craft sailing or cross-country trekking, a complete replacement of maps is neither useful nor desirable. Paper maps are fail-safe, relatively cheap, offer superior resolution and provide large scale overview. In uses like open-water sailing it is therefore mandatory to carry adequate maps/charts. GPS based mobile devices, on the other hand, offer useful features like automatic positioning and plotting, real-time information update and dynamic adaptation to user requirements. While paper maps are now commonly used in combination with mobile GPS devices, there is no meaningful integration between the two, and the combined use leads to a number of interaction problems and potential safety issues. In this paper we explore the design space of augmented paper maps in which maps are augmented with additional functionality through a mobile device to achieve a meaningful integration between device and map that combines their respective strengths.

  8. Ghostman: Augmented Reality Application for Telerehabilitation and Remote Instruction of a Novel Motor Skill

    PubMed Central

    Chinthammit, Winyu; Visentin, Denis

    2014-01-01

    This paper describes a pilot study using a prototype telerehabilitation system (Ghostman). Ghostman is a visual augmentation system designed to allow a physical therapist and patient to inhabit each other's viewpoint in an augmented real-world environment. This allows the therapist to deliver instruction remotely and observe performance of a motor skill through the patient's point of view. In a pilot study, we investigated the efficacy of Ghostman by using it to teach participants to use chopsticks. Participants were randomized to a single training session, receiving either Ghostman or face-to-face instructions by the same skilled instructor. Learning was assessed by measuring retention of skills at 24-hour and 7-day post instruction. As hypothesised, there were no differences in reduction of error or time to completion between participants using Ghostman compared to those receiving face-to-face instruction. These initial results in a healthy population are promising and demonstrate the potential application of this technology to patients requiring learning or relearning of motor skills as may be required following a stroke or brain injury. PMID:24829910

  9. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illuminate entertainment computing which ventures to embed the natural physical world seamlessly with a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile-gaming that emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human-physical movements. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer graphics fantasy-reality provided by using the wearable computers on them. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

  10. The use of virtual reality-based therapy to augment poststroke upper limb recovery

    PubMed Central

    Samuel, Geoffrey S; Choo, Min; Chan, Wai Yin; Kok, Stanley; Ng, Yee Sien

    2015-01-01

    Stroke remains one of the major causes of disability worldwide. This case report illustrates the complementary use of biomechanical and kinematic in-game markers, in addition to standard clinical outcomes, to comprehensively assess and track a patient’s disabilities. A 65-year-old patient was admitted for right-sided weakness and clinically diagnosed with acute ischaemic stroke. She participated in a short trial of standard stroke occupational therapy and physiotherapy with additional daily virtual reality (VR)-based therapy. Outcomes were tracked using kinematic data and conventional clinical assessments. Her Functional Independence Measure score improved from 87 to 113 and Fugl-Meyer motor score improved from 56 to 62, denoting clinically significant improvement. Corresponding kinematic analysis revealed improved hand path ratios and a decrease in velocity peaks. Further research is being undertaken to elucidate the optimal type, timing, setting and duration of VR-based therapy, as well as the use of neuropharmacological adjuncts. PMID:26243983
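
The kinematic markers mentioned, hand path ratio and number of velocity peaks, are standard smoothness measures: the first divides the distance actually travelled by the straight-line distance between start and end, and the second counts local maxima in the speed profile. A hedged sketch of how such markers might be computed from sampled hand positions (illustrative, not the clinical system's code):

```python
import numpy as np

def hand_path_ratio(traj):
    """Path length travelled divided by the straight-line distance
    (1.0 would be a perfectly direct reach)."""
    steps = np.diff(traj, axis=0)
    path = np.linalg.norm(steps, axis=1).sum()
    direct = np.linalg.norm(traj[-1] - traj[0])
    return path / direct

def count_velocity_peaks(traj, dt=0.01):
    """Number of strict local maxima in speed; fewer peaks indicate a
    smoother, less fragmented movement."""
    speed = np.linalg.norm(np.diff(traj, axis=0), axis=1) / dt
    return int(np.sum((speed[1:-1] > speed[:-2]) & (speed[1:-1] > speed[2:])))
```

Improvement over therapy would then show up as a path ratio approaching 1.0 and a decreasing peak count, matching the trends the report describes.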

  11. Machine learning-based augmented reality for improved surgical scene understanding.

    PubMed

    Pauly, Olivier; Diotte, Benoit; Fallavollita, Pascal; Weidert, Simon; Euler, Ekkehard; Navab, Nassir

    2015-04-01

    In orthopedic and trauma surgery, AR technology can support surgeons in the challenging task of understanding the spatial relationships between the anatomy, the implants and their tools. In this context, we propose a novel augmented visualization of the surgical scene that mixes intelligently the different sources of information provided by a mobile C-arm combined with a Kinect RGB-Depth sensor. Therefore, we introduce a learning-based paradigm that aims at (1) identifying the relevant objects or anatomy in both Kinect and X-ray data, and (2) creating an object-specific pixel-wise alpha map that permits relevance-based fusion of the video and the X-ray images within one single view. In 12 simulated surgeries, we show very promising results aiming at providing for surgeons a better surgical scene understanding as well as an improved depth perception. PMID:24998759
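
The object-specific pixel-wise alpha map described above amounts to a per-pixel convex blend of the video and X-ray images within one view. A minimal sketch of that fusion step (array shapes and names are assumptions):

```python
import numpy as np

def fuse(video, xray, alpha):
    """Relevance-based per-pixel blend: alpha=1 shows the X-ray,
    alpha=0 shows the video, intermediate values mix the two."""
    a = alpha[..., None]                  # broadcast over colour channels
    return a * xray + (1.0 - a) * video
```

In the learning-based paradigm the abstract outlines, the alpha map itself would come from a classifier identifying relevant objects or anatomy, rather than being hand-set.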

  12. The analysis of visual variables for use in the cartographic design of point symbols for mobile Augmented Reality applications

    NASA Astrophysics Data System (ADS)

    Halik, Łukasz

    2012-11-01

    The objective of the present deliberations was to systematise our knowledge of the static visual variables used to create cartographic symbols, and also to analyse the possibility of their utilisation in Augmented Reality (AR) applications on smartphone-type mobile devices. This was accomplished by combining the visual variables listed over the years by different researchers. The research approach was to determine the level of usefulness of particular characteristics of visual variables, such as selectivity, associativity, quantity and order. An attempt was made to provide an overview of static visual variables and to describe the AR system, which is a new paradigm of user interface. The change in the approach to presenting point objects results from applying a different perspective in the observation of objects (egocentric view) than is done on traditional analogue maps (geocentric view). The topics presented refer to a fast-developing field of cartography, namely mobile cartography. Particular emphasis is put on smartphone-type mobile devices and their applicability in the process of designing cartographic symbols. [Translated from Polish:] The aim of the article was to systematise knowledge about static visual variables, which are the key components from which cartographic symbols are built. An attempt was made to compile the visual variables distinguished by cartographers over the last fifty years, starting from the classification presented by J. Bertin. The usefulness of the individual graphic variables was analysed with respect to their use in designing point symbols for mobile applications built with Augmented Reality technology. The variables were analysed in terms of four characteristics: selectivity, associativity, expression of quantity, and order. The article draws attention to the different use of perspective between traditional analogue maps (geocentric) and

  13. Automatic localization of endoscope in intraoperative CT image: A simple approach to augmented reality guidance in laparoscopic surgery.

    PubMed

    Bernhardt, Sylvain; Nicolau, Stéphane A; Agnus, Vincent; Soler, Luc; Doignon, Christophe; Marescaux, Jacques

    2016-05-01

    The use of augmented reality in minimally invasive surgery has been the subject of much research for more than a decade. The endoscopic view of the surgical scene is typically augmented with a 3D model extracted from a preoperative acquisition. However, the organs of interest often present major changes in shape and location because of the pneumoperitoneum and patient displacement. There have been numerous attempts to compensate for this distortion between the pre- and intraoperative states. Some have attempted to recover the visible surface of the organ through image analysis and register it to the preoperative data, but this has proven insufficiently robust and may be problematic with large organs. A second approach is to introduce an intraoperative 3D imaging system as a transition. Hybrid operating rooms are becoming more and more popular, so this seems to be a viable solution, but current techniques require yet another external and constraining piece of apparatus such as an optical tracking system to determine the relationship between the intraoperative images and the endoscopic view. In this article, we propose a new approach to automatically register the reconstruction from an intraoperative CT acquisition with the static endoscopic view, by locating the endoscope tip in the volume data. We first describe our method to localize the endoscope orientation in the intraoperative image using standard image processing algorithms. Secondly, we highlight that the axis of the endoscope needs a specific calibration process to ensure proper registration accuracy. In the last section, we present quantitative and qualitative results proving the feasibility and the clinical potential of our approach. PMID:26925804

  14. An augmented reality environment for image-guidance of off-pump mitral valve implantation

    NASA Astrophysics Data System (ADS)

    Linte, Christian; Wiles, Andrew D.; Hill, Nick; Moore, John; Wedlake, Chris; Guiraudon, Gerard; Jones, Doug; Bainbridge, Daniel; Peters, Terry M.

    2007-03-01

    Clinical research has been rapidly evolving towards the development of less invasive surgical procedures. We recently embarked on a project to improve intracardiac beating heart interventions. Our novel approach employs new surgical technologies and support from image-guidance via pre-operative and intra-operative imaging (i.e. two-dimensional echocardiography) to substitute for direct vision. Our goal was to develop a versatile system that allowed for safe cardiac port access, and provide sufficient image-guidance with the aid of a virtual reality environment to substitute for the absence of direct vision, while delivering quality therapy to the target. Specific targets included the repair and replacement of heart valves and the repair of septal defects. The ultimate objective was to duplicate the success rate of conventional open-heart surgery, but to do so via a small incision, and to evaluate the efficacy of the procedure as it is performed. This paper describes the software and hardware components, along with the methodology for performing mitral valve replacement as one example of this approach, using ultrasound and virtual tool models to position and fasten the valve in place.

  15. From Survey to Education: How Augmented Reality Can Contribute to the Study and Dissemination of Archaeo-astronomy

    NASA Astrophysics Data System (ADS)

    Schiavottiello, N.

    2009-08-01

    The study and practice of archaeo-astronomy comprises disciplines such as archaeology, positional astronomy, history and the study of local mythology, as well as technical survey theory and practice. The research often starts with an archaeological survey, in order to record possible structural orientations of a particular monument towards specific cardinal directions. In a second stage, theories about the visible orientations and possible alignments of a specific structure, or part of a structure, are drawn up, often with the use of in-house tools. These tools sometimes remain too ``esoteric'' and are not always user-friendly, especially if they later have to be used for education purposes. Moreover, they are borrowed from other disciplines, such as astronomical, image-processing and architectural software, resulting in a complicated process of trying to merge data that should instead have originated in the same environment in the first place. Virtual realities have long entered our daily life in research, education and entertainment; they can serve as natural models because of the inherently 3D way they represent data. However, on a visual-interpretation level, what they often present are displaced models of reality, whether viewed on personal computers or with ``immersive'' techniques. These can be very useful at a research stage, or to show concepts that require a specific point of view, but they often engage few of our senses beyond vision. A possible solution could be to simply visit the studied site; however, when visiting a particular place, it is hard to visualize all previously pursued analyses in one simple application environment. This is necessary in order to discover the meaning of a specific structure and to propose new theories. Augmented reality, in this sense, could bridge the gap that exists when looking at this particular problem. 
This can be achieved with the creation

  16. Interactive Near-Field Illumination for Photorealistic Augmented Reality with Varying Materials on Mobile Devices.

    PubMed

    Rohmer, Kai; Buschel, Wolfgang; Dachselt, Raimund; Grosch, Thorsten

    2015-12-01

    At present, photorealistic augmentation is not yet possible since the computational power of mobile devices is insufficient. Even streaming solutions from stationary PCs cause a latency that affects user interactions considerably. Therefore, we introduce a differential rendering method that allows for a consistent illumination of the inserted virtual objects on mobile devices, avoiding delays. The computation effort is shared between a stationary PC and the mobile devices to make use of the capacities available on both sides. The method is designed such that only a minimum amount of data has to be transferred asynchronously between the participants. This allows for an interactive illumination of virtual objects with a consistent appearance under both temporally and spatially varying real illumination conditions. To describe the complex near-field illumination in an indoor scenario, HDR video cameras are used to capture the illumination from multiple directions. In this way, sources of illumination can be considered that are not directly visible to the mobile device because of occlusions and the limited field of view. While our method focuses on Lambertian materials, we also provide some initial approaches to approximate non-diffuse virtual objects and thereby allow for a wider field of application at nearly the same cost. PMID:26529458
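
Differential rendering, which this method builds on, adds to the camera image the difference between renderings of the scene with and without the virtual objects, so that shadows and other light interactions carry over, while the objects themselves are drawn directly. A hedged NumPy sketch of the classic formulation (not the paper's distributed PC/mobile implementation):

```python
import numpy as np

def differential_render(camera, with_obj, without_obj, obj_mask):
    """Differential rendering composite.

    camera:      live camera image of the real scene
    with_obj:    rendering of the modelled scene including virtual objects
    without_obj: rendering of the modelled scene without them
    obj_mask:    boolean mask of pixels covered by virtual objects
    """
    out = camera + (with_obj - without_obj)   # transfer shadows, bleeding
    out[obj_mask] = with_obj[obj_mask]        # draw the objects themselves
    return np.clip(out, 0.0, 1.0)
```

Because only the rendered difference has to be kept consistent with the captured illumination, the heavy lighting computation can live on the stationary PC while the composite stays interactive on the mobile device.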

  17. Lbs Augmented Reality Assistive System for Utilities Infrastructure Management Through Galileo and Egnos

    NASA Astrophysics Data System (ADS)

    Stylianidis, E.; Valaria, E.; Smagas, K.; Pagani, A.; Henriques, J.; García, A.; Jimeno, E.; Carrillo, I.; Patias, P.; Georgiadis, C.; Kounoudes, A.; Michail, K.

    2016-06-01

    There is a continuous and increasing demand for solutions, both software and hardware-based, that are able to productively handle underground utilities geospatial data. Innovative approaches that are based on the use of the European GNSS, Galileo and EGNOS, sensor technologies and LBS, are able to monitor, document and manage utility infrastructures' data with an intuitive 3D augmented visualisation and navigation/positioning technology. A software and hardware-based system called LARA, currently under development through an H2020 co-funded project, aims at meeting that demand. The concept of LARA is to integrate the different innovative components of existing technologies in order to design and develop an integrated navigation/positioning and information system which coordinates GNSS, AR, 3D GIS and geodatabases on a mobile platform for monitoring, documenting and managing utility infrastructures on-site. The LARA system will guide utility field workers to locate the working area by helping them see beneath the ground, rendering the complexity of the 3D models of the underground grid such as water, gas and electricity. The capacity and benefits of LARA are scheduled to be tested in two case studies located in Greece and the United Kingdom with various underground utilities. The paper aspires to present the first results from this initiative. The project leading to this application has received funding from the European GNSS Agency under the European Union's Horizon 2020 research and innovation programme under grant agreement No 641460.

  18. Content Delivery Using Augmented Reality to Enhance Students' Performance in a Building Design and Assembly Project

    ERIC Educational Resources Information Center

    Shirazi, Arezoo; Behzadan, Amir H.

    2015-01-01

    Recent studies suggest that the number of students pursuing science, technology, engineering, and mathematics (STEM) degrees has been generally decreasing. An extensive body of research cites the lack of motivation and engagement in the learning process as a major underlying reason of this decline. It has been discussed that if properly…

  19. BodyWindows: enhancing a mannequin with projective augmented reality for exploring anatomy, physiology and medical procedures.

    PubMed

    Samosky, Joseph T; Wang, Bo; Nelson, Douglas A; Bregman, Russell; Hosmer, Andrew; Weaver, Robert A

    2012-01-01

    Augmented reality offers the potential to radically extend and enhance the capabilities of physical medical simulators such as full-body mannequin trainers. We have developed a system that transforms the surface of a mannequin simulator into both a display screen and an input device. The BodyWindows system enables a user to open, size, and reposition multiple viewports onto the simulator body. We demonstrate a dynamic viewport that displays a beating heart. Similar viewports could be used to display real-time physiological responses to interventions the user applies to the mannequin, such as injection of a simulated drug. Viewport windows can be overlapping and show anatomy at different depths, creating the illusion of "cutting" multiple windows into the body to reveal structures at different depths from the surface. The developed low-cost interface employs an IR light pen and the Nintendo Wiimote. We also report experiments using the Microsoft Kinect computer vision sensor to provide a completely hand-gesture based interface. PMID:22357032

  20. Automotive technicians' training as a community-of-practice: implications for the design of an augmented reality teaching aid.

    PubMed

    Anastassova, Margarita; Burkhardt, Jean-Marie

    2009-07-01

    The paper presents an ergonomic analysis carried out in the early phases of an R&D project. The purpose was to investigate the functioning of today's Automotive Service Technicians (ASTs) training in order to inform the design of an Augmented Reality (AR) teaching aid. The first part of the paper presents a literature review of some major problems encountered by ASTs today. The benefits of AR as technological aid are also introduced. Then, the methodology and the results of two case studies are presented. The first study is based on interviews with trainers and trainees; the second one on observations in real training settings. The results support the assumption that today's ASTs' training could be regarded as a community-of-practice (CoP). Therefore, AR could be useful as a collaboration tool, offering a shared virtual representation of real vehicle's parts, which are normally invisible unless dismantled (e.g. the parts of a hydraulic automatic transmission). We conclude on the methods and the technologies to support the automotive CoP. PMID:18703179

  1. Current and future possibilities of V2V and I2V technologies: an analysis directed toward Augmented Reality systems

    NASA Astrophysics Data System (ADS)

    Betancur, J. A.; Osorio-Gómez, Gilberto; Arnedo, Aida; Yarce Botero, Andrés.

    2014-06-01

    Nowadays, it is very important to explore the qualitative characteristics of autonomous mobility systems in automobiles, especially disruptive technologies like Vehicle to Vehicle (V2V) and Infrastructure to Vehicle (I2V), in order to comprehend how the next generation of automobiles will be developed. In this sense, this research covers a general review of active safety in automobiles where V2V and I2V systems have been implemented, identifying the more realistic possibilities related to V2V and I2V technology and analyzing current applications, some systems in the development process, and some future conceptual proposals. Notably, combining V2V and I2V systems shows strong potential for increasing the driver's attention; therefore, a configuration that pairs these two technologies with augmented reality systems for automobiles (Head-Up Display and Head-Down Display) is proposed. This kind of configuration has substantial implementation potential once the relevant regulations and the roadmap for its development are widely established.

  2. Role of Cranial and Spinal Virtual and Augmented Reality Simulation Using Immersive Touch Modules in Neurosurgical Training

    PubMed Central

    Alaraj, Ali; Charbel, Fady T.; Birk, Daniel; Tobin, Mathew; Luciano, Cristian; Banerjee, Pat P.; Rizzi, Silvio; Sorenson, Jeff; Foley, Kevin; Slavin, Konstantin; Roitberg, Ben

    2013-01-01

    Recent studies have shown that mental script-based rehearsal and simulation-based training improves the transfer of surgical skills in various medical disciplines. Despite significant advances in technology and intraoperative techniques over the last several decades, surgical skills training on neurosurgical operations still carries significant risk of serious morbidity or mortality. Potentially avoidable technical errors are well recognized as contributing to poor surgical outcome. Surgical education is undergoing overwhelming change, with reduction of working hours and current trends to focus on patient safety and linking reimbursement with clinical outcomes, and there is a need for adjunctive means of neurosurgical training; one such means is the recent advancement in simulation technology. ImmersiveTouch (IT) is an augmented reality (AR) system that integrates a haptic device and a high-resolution stereoscopic display. This simulation platform utilizes multiple sensory modalities, recreating many of the environmental cues experienced during an actual procedure. Modules available include ventriculostomy, bone drilling, and percutaneous trigeminal rhizotomy, in addition to simulated spinal modules such as pedicle screw placement, vertebroplasty, and lumbar puncture. We present our experience with development of such AR neurosurgical modules and the feedback from neurosurgical residents. PMID:23254799

  3. Employing Augmented-Reality-Embedded Instruction to Disperse the Imparities of Individual Differences in Earth Science Learning

    NASA Astrophysics Data System (ADS)

    Chen, Cheng-ping; Wang, Chang-Hwa

    2015-12-01

    Studies have proven that merging hands-on and online learning can result in an enhanced experience in learning science. In contrast to traditional online learning, multiple in-classroom activities may be involved in an augmented-reality (AR)-embedded e-learning process and thus could reduce the effects of individual differences. Using a three-stage AR-embedded instructional process, we conducted an experiment to investigate the influences of individual differences on learning the earth science phenomena of "day, night, and seasons" for junior high students. The mixed-methods sequential explanatory design was employed. In the quantitative phase, factors of learning styles and ICT competences were examined alongside the overall learning achievement. Independent t tests and ANCOVAs were employed to compute inferential statistics. The results showed that overall learning achievement was significant for the AR-embedded instruction. Nevertheless, neither of the two learner factors exhibited a significant effect on learning achievement. In the qualitative phase, we analyzed student interview records, and a wide variation in students' preferred instructional stages was revealed. These findings could provide an alternative rationale for developing ICT-supported instruction, as our three-stage AR-embedded comprehensive e-learning scheme could enhance instruction adaptiveness to disperse the imparities of individual differences between learners.

  4. Towards intelligent environments: an augmented reality-brain-machine interface operated with a see-through head-mount display.

    PubMed

    Takano, Kouji; Hata, Naoki; Kansaku, Kenji

    2011-01-01

    The brain-machine interface (BMI) or brain-computer interface is a new interface technology that uses neurophysiological signals from the brain to control external machines or computers. This technology is expected to support daily activities, especially for persons with disabilities. To expand the range of activities enabled by this type of interface, here, we added augmented reality (AR) to a P300-based BMI. In this new system, we used a see-through head-mount display (HMD) to create control panels with flicker visual stimuli to support the user in areas close to controllable devices. When the attached camera detects an AR marker, the position and orientation of the marker are calculated, and the control panel for the pre-assigned appliance is created by the AR system and superimposed on the HMD. The participants were required to control system-compatible devices, and they successfully operated them without significant training. Online performance with the HMD was not different from that using an LCD monitor. Posterior and lateral (right or left) channel selections contributed to operation of the AR-BMI with both the HMD and LCD monitor. Our results indicate that AR-BMI systems operated with a see-through HMD may be useful in building advanced intelligent environments. PMID:21541307

  5. Smart multi-level tool for remote patient monitoring based on a wireless sensor network and mobile augmented reality.

    PubMed

    González, Fernando Cornelio Jiménez; Villegas, Osslan Osiris Vergara; Ramírez, Dulce Esperanza Torres; Sánchez, Vianey Guadalupe Cruz; Domínguez, Humberto Ochoa

    2014-01-01

    Technological innovations in the field of disease prevention and maintenance of patient health have enabled the evolution of fields such as monitoring systems. One of the main advances is the development of real-time monitors that use intelligent and wireless communication technology. In this paper, a system is presented for the remote monitoring of the body temperature and heart rate of a patient by means of a wireless sensor network (WSN) and mobile augmented reality (MAR). The combination of a WSN and MAR provides a novel alternative to remotely measure body temperature and heart rate in real time during patient care. The system is composed of (1) hardware such as Arduino microcontrollers (in the patient nodes), personal computers (for the nurse server), smartphones (for the mobile nurse monitor and the virtual patient file) and sensors (to measure body temperature and heart rate), (2) a network layer using WiFly technology, and (3) software such as LabView, Android SDK, and DroidAR. The results obtained from tests show that the system can perform effectively within a range of 20 m and requires ten minutes to stabilize the temperature sensor to detect hyperthermia, hypothermia or normal body temperature conditions. Additionally, the heart rate sensor can detect conditions of tachycardia and bradycardia. PMID:25230306
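The condition detection the system performs (hyperthermia/hypothermia from the temperature sensor, tachycardia/bradycardia from the heart rate sensor) amounts to simple threshold classification, which can be sketched as below. The cut-off values shown are common clinical defaults and are assumptions; the paper does not state the exact thresholds used.

```python
def classify_vitals(temp_c, heart_rate_bpm):
    """Classify body temperature (deg C) and heart rate (bpm) into the
    conditions the monitoring system reports. Thresholds are illustrative
    clinical defaults, not the paper's stated values."""
    if temp_c > 38.0:
        temp_status = "hyperthermia"
    elif temp_c < 35.0:
        temp_status = "hypothermia"
    else:
        temp_status = "normal"

    if heart_rate_bpm > 100:
        hr_status = "tachycardia"
    elif heart_rate_bpm < 60:
        hr_status = "bradycardia"
    else:
        hr_status = "normal"

    return temp_status, hr_status
```

In the described architecture, a check like this would run against readings relayed from the Arduino patient nodes, with the resulting status overlaid on the nurse's smartphone via the MAR layer.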

  7. Pico Lantern: Surface reconstruction and augmented reality in laparoscopic surgery using a pick-up laser projector.

    PubMed

    Edgcumbe, Philip; Pratt, Philip; Yang, Guang-Zhong; Nguan, Christopher; Rohling, Robert

    2015-10-01

    The Pico Lantern is a miniature projector developed for structured light surface reconstruction, augmented reality and guidance in laparoscopic surgery. During surgery it will be dropped into the patient and picked up by a laparoscopic tool. While inside the patient it projects a known coded pattern and images onto the surface of the tissue. The Pico Lantern is visually tracked in the laparoscope's field of view for the purpose of stereo triangulation between it and the laparoscope. In this paper, the first application is surface reconstruction. Using a stereo laparoscope and an untracked Pico Lantern, the absolute error for surface reconstruction for a plane, cylinder and ex vivo kidney, is 2.0 mm, 3.0 mm and 5.6 mm, respectively. Using a mono laparoscope and a tracked Pico Lantern for the same plane, cylinder and kidney the absolute error is 1.4 mm, 1.5 mm and 1.5 mm, respectively. These results confirm the benefit of the wider baseline produced by tracking the Pico Lantern. Virtual viewpoint images are generated from the kidney surface data and an in vivo proof-of-concept porcine trial is reported. Surface reconstruction of the neck of a volunteer shows that the pulsatile motion of the tissue overlying a major blood vessel can be detected and displayed in vivo. Future work will integrate the Pico Lantern into standard and robot-assisted laparoscopic surgery. PMID:26024818
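The stereo triangulation between the laparoscope and the tracked Pico Lantern can be sketched as a classic two-ray midpoint computation: each device defines a ray toward the same tissue point, and the reconstructed point is the one closest to both rays. This is an illustrative sketch of the geometric principle, not the authors' implementation; the function name and conventions are assumptions.

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint triangulation of a point seen from two viewpoints
    (e.g. laparoscope and tracked projector). c1, c2 are the ray
    origins; d1, d2 are unit direction vectors. Illustrative only."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
    # Solve for ray parameters s, t minimising |(c1 + s*d1) - (c2 + t*d2)|
    A = np.stack([d1, -d2], axis=1)                  # 3 x 2 system
    s, t = np.linalg.lstsq(A, c2 - c1, rcond=None)[0]
    # Return the midpoint of the closest points on the two rays.
    return 0.5 * ((c1 + s * d1) + (c2 + t * d2))
```

The paper's accuracy results reflect the baseline between the two rays: tracking the Pico Lantern widens that baseline beyond the fixed stereo-laparoscope separation, which is why the tracked configuration reduced the reconstruction error.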

  8. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using the augmented reality (AR) technique. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.

  9. Augmented reality system using lidar point cloud data for displaying dimensional information of objects on mobile phones

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Lohani, B.

    2014-05-01

    Mobile augmented reality is a next-generation technology for visualising the 3D real world intelligently. The technology is expanding at a fast pace to upgrade the status of a smart phone to an intelligent device. The research problem identified and presented in the current work is to view actual dimensions of various objects that are captured by a smart phone in real time. The methodology proposed first establishes correspondence between the LiDAR point cloud, which is stored on a server, and the image that is captured by the mobile. This correspondence is established using the exterior and interior orientation parameters of the mobile camera and the coordinates of the LiDAR data points which lie in the viewshed of the mobile camera. A pseudo-intensity image is generated using the LiDAR points and their intensity. The mobile image and pseudo-intensity image are then registered using the SIFT image registration method, thereby generating a pipeline to locate the point in the point cloud corresponding to a point (pixel) on the mobile image. The second part of the method uses the point cloud data to compute dimensional information corresponding to pairs of points selected on the mobile image and displays the dimensions on top of the image. This paper describes all steps of the proposed method. The paper uses an experimental setup to mimic the mobile phone and server system and presents some initial but encouraging results.
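The geometric core of the pipeline, projecting LiDAR points into the mobile image using the camera's interior and exterior orientation, and measuring the distance between the 3D points matched to two selected pixels, can be sketched as follows. This is an illustrative pinhole-camera sketch (lens distortion ignored); the function names and matrix conventions are assumptions, not taken from the paper.

```python
import numpy as np

def project_points(points_world, R, t, K):
    """Project LiDAR points (N x 3, world frame) into the image using the
    camera's exterior orientation (rotation R, translation t) and interior
    orientation matrix K. Simple pinhole model, distortion ignored."""
    cam = R @ points_world.T + t.reshape(3, 1)   # world -> camera frame
    uv = K @ cam                                 # camera -> homogeneous pixels
    return (uv[:2] / uv[2]).T                    # N x 2 pixel coordinates

def measure_dimension(p1, p2):
    """Euclidean distance between the two LiDAR points matched to the
    pixels the user selected on the mobile image."""
    return float(np.linalg.norm(np.asarray(p1, float) - np.asarray(p2, float)))
```

Once SIFT registration against the pseudo-intensity image links a selected pixel to its LiDAR point, `measure_dimension` gives the real-world length that is then overlaid on the phone's display.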

  10. An Investigation of University Students' Collaborative Inquiry Learning Behaviors in an Augmented Reality Simulation and a Traditional Simulation

    NASA Astrophysics Data System (ADS)

    Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung

    2014-10-01

    The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty university students were divided into dyads and then randomly assigned into an AR group and a traditional 2D group to collaboratively conduct an inquiry task about elastic collision. The results of the content analysis and LSA indicated that both systems supported students' collaborative inquiry learning. Particularly, students showed high frequencies of higher-level inquiry behaviors, such as interpreting experimental data or making conclusions, when using these two simulations. By comparing the behavioral patterns, similarities and differences between the two groups were revealed. The AR simulation engaged the students more thoroughly in the inquiry process. Moreover, students in both groups adopted the same approaches to design experiments. As this line of AR research is in its initial stage, suggestions for future studies were proposed.

  11. The GOSTT concept and hybrid mixed/virtual/augmented reality environment radioguided surgery.

    PubMed

    Valdés Olmos, R A; Vidal-Sicart, S; Giammarile, F; Zaknun, J J; Van Leeuwen, F W; Mariani, G

    2014-06-01

    The popularity gained by the sentinel lymph node (SLN) procedure in the last two decades did increase the interest of the surgical disciplines for other applications of radioguided surgery. An example is the gamma-probe guided localization of occult or difficult to locate neoplastic lesions. Such guidance can be achieved by intralesional delivery (ultrasound, stereotaxis or CT) of a radiolabelled agent that remains accumulated at the site of the injection. Another possibility rested on the use of systemic administration of a tumour-seeking radiopharmaceutical with favourable tumour accumulation and retention. On the other hand, new intraoperative imaging devices for radioguided surgery in complex anatomical areas became available. All this a few years ago led to the delineation of the concept Guided intraOperative Scintigraphic Tumour Targeting (GOSTT) to include the whole spectrum of basic and advanced nuclear medicine procedures required for providing a roadmap that would optimise surgery. The introduction of allied signatures using, e.g. hybrid tracers for simultaneous detection of the radioactive and fluorescent signals did amplify the GOSTT concept. It was now possible to combine perioperative nuclear medicine imaging with the superior resolution of additional optical guidance in the operating room. This hybrid approach is currently in progress and probably will become an important model to follow in the coming years. A cornerstone in the GOSTT concept is constituted by diagnostic imaging technologies like SPECT/CT. SPECT/CT was introduced halfway through the past decade and was immediately incorporated into the SLN procedure. Important reasons contributing to the success of SPECT/CT were its combination with lymphoscintigraphy, and the ability to display SLNs in an anatomical environment. This latter aspect has significantly been improved in the new generation of SPECT/CT cameras and provides the base for the novel mixed reality protocols of image-guided surgery. In

  12. Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy

    PubMed Central

    McKendrick, Ryan; Parasuraman, Raja; Murtza, Rabia; Formwalt, Alice; Baccus, Wendy; Paczynski, Martin; Ayaz, Hasan

    2016-01-01

    Highly mobile computing devices promise to improve quality of life, productivity, and performance. Increased situation awareness and reduced mental workload are two potential means by which this can be accomplished. However, it is difficult to measure these concepts in the “wild”. We employed ultra-portable battery operated and wireless functional near infrared spectroscopy (fNIRS) to non-invasively measure hemodynamic changes in the brain’s Prefrontal cortex (PFC). Measurements were taken during navigation of a college campus with either a hand-held display, or an Augmented reality wearable display (ARWD). Hemodynamic measures were also paired with secondary tasks of visual perception and auditory working memory to provide behavioral assessment of situation awareness and mental workload. Navigating with an augmented reality wearable display produced the least workload during the auditory working memory task, and a trend for improved situation awareness in our measures of prefrontal hemodynamics. The hemodynamics associated with errors were also different between the two devices. Errors with an augmented reality wearable display were associated with increased prefrontal activity and the opposite was observed for the hand-held display. This suggests that the cognitive mechanisms underlying errors between the two devices differ. These findings show fNIRS is a valuable tool for assessing new technology in ecologically valid settings and that ARWDs offer benefits with regards to mental workload while navigating, and potentially superior situation awareness with improved display design. PMID:27242480

  14. Learning retention of thoracic pedicle screw placement using a high-resolution augmented reality simulator with haptic feedback

    PubMed Central

    Luciano, Cristian J.; Banerjee, P. Pat; Bellotte, Brad; Lemole, G. Michael; Oh, Michael; Charbel, Fady T.; Roitberg, Ben

    2011-01-01

    Background We evaluated the use of a part-task simulator with 3D and haptic feedback as a training tool for a common neurosurgical procedure – placement of thoracic pedicle screws. Objective To evaluate the learning retention of thoracic pedicle screw placement on a high-performance augmented reality and haptic technology workstation. Methods Fifty-one fellows and residents performed thoracic pedicle screw placement on the simulator. The virtual screws were drilled into a virtual patient’s thoracic spine derived from a computed tomography data set of a real patient. Results With a 12.5% failure rate, a two-proportion z-test yielded P = 0.08. For performance accuracy, an aggregate Euclidean distance deviation from entry landmark on the pedicle and a similar deviation from the target landmark in the vertebral body yielded P = 0.04 from a two-sample t-test in which the rejected null hypothesis assumes no improvement in performance accuracy from the practice to the test sessions, and the alternative hypothesis assumes an improvement. Conclusion The performance accuracy on the simulator was comparable to the accuracy reported in literature on recent retrospective evaluation of such placements. The failure rates indicated a minor drop from practice to test sessions, and also indicated a trend (P = 0.08) towards learning retention resulting in improvement from practice to test sessions. The performance accuracy showed a 15% mean score improvement and over 50% reduction in standard deviation from practice to test. It showed evidence (P = 0.04) of performance accuracy improvement from practice to test session. PMID:21471846

  15. Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation.

    PubMed

    Wang, Junchen; Suenaga, Hideyuki; Liao, Hongen; Hoshi, Kazuto; Yang, Liangjing; Kobayashi, Etsuko; Sakuma, Ichiro

    2015-03-01

    Autostereoscopic 3D image overlay for augmented reality (AR) based surgical navigation has been studied and reported many times. For the purpose of surgical overlay, the 3D image is expected to have the same geometric shape as the original organ, and can be transformed to a specified location for image overlay. However, how to generate a 3D image with high geometric fidelity and quantitative evaluation of 3D image's geometric accuracy have not been addressed. This paper proposes a graphics processing unit (GPU) based computer-generated integral imaging pipeline for real-time autostereoscopic 3D display, and an automatic closed-loop 3D image calibration paradigm for displaying undistorted 3D images. Based on the proposed methods, a novel AR device for 3D image surgical overlay is presented, which mainly consists of a 3D display, an AR window, a stereo camera for 3D measurement, and a workstation for information processing. The evaluation on the 3D image rendering performance with 2560×1600 elemental image resolution shows the rendering speeds of 50-60 frames per second (fps) for surface models, and 5-8 fps for large medical volumes. The evaluation of the undistorted 3D image after the calibration yields sub-millimeter geometric accuracy. A phantom experiment simulating oral and maxillofacial surgery was also performed to evaluate the proposed AR overlay device in terms of the image registration accuracy, 3D image overlay accuracy, and the visual effects of the overlay. The experimental results show satisfactory image registration and image overlay accuracy, and confirm the system usability. PMID:25465067

  16. Virtual reality applications in T and D engineering

    SciTech Connect

    Breen, P.T. Jr.; Scott, W.G.

    1995-12-31

    Visualization Technology (VT), the authors' more meaningful term for Virtual Reality, is a commercial reality. Visualization technology can provide a realistic model of the real world, place a user within the synthetic space and allow him or her to interact within that space through head-mounted displays, CRTs, data gloves, and 3D mice. Existing commercial applications of VT include the emulation of power plant control room panels, 3D models of commercial and industrial buildings and virtual models of transportation systems to train the handicapped. The authors believe that VT can greatly reduce the costs and increase the productivity of training T and D personnel, especially for hazardous assignments such as live-line maintenance. VT can also reduce the costs of design, construction and maintenance of major facilities such as power plants, substations, vaults, transmission lines and underground facilities.

  17. Studies of Operating Frequency Effects On Ejector-based Thrust Augmentation in a Pulse Detonation Engine

    NASA Technical Reports Server (NTRS)

    Landry, K.

    2005-01-01

    Studies were performed in order to characterize the thrust augmentation potential of an ejector in a Pulse Detonation Engine application. A 49-mm diameter tube of 0.914-m length was constructed with one open end and one closed end. Ethylene, oxygen, and nitrogen were introduced into the tube at the closed end through the implementation of a fast mixing injector. The tube was completely filled with a stoichiometric mixture containing a one to one molar ratio of nitrogen to oxygen. Ethylene was selected as the fuel due to its detonation sensitivity and the molar ratio of the oxidizer was chosen for heat transfer purposes. Detonations were initiated in the tube through the use of a spark ignition system. The PDE was operated in a multi-cycle mode at frequencies ranging from 20-Hz to 50-Hz. Baseline thrust measurements with no ejector present were performed while operating the engine at various frequencies and compared to theoretical estimates. The baseline values were observed to agree with the theoretical model at low operating frequencies and proved to be increasingly lower than the predicted values as the operating frequency was increased. The baseline thrust measurements were observed to agree within 15 percent of the model for all operating frequencies. A straight 152-mm diameter ejector was installed and thrust augmentation percentages were measured. The length of the ejector was varied while the overlap percentage (percent of the ejector length which overlapped the tube) was maintained at 25 percent for all tests. In addition, the effect of ejector inlet geometry was investigated by comparing results with a straight inlet to those of a 38-mm inlet diameter. The thrust augmentation of the straight inlet ejector proved to be independent of engine operating frequency, augmenting thrust by 40 percent for the 0.914-m length ejector. In contrast, the rounded lip ejector of the same length seemed to be highly dependent on the engine operating frequency. An optimum

  18. Space Shuttle Main Engine fuel preburner augmented spark igniter shutdown detonations

    NASA Technical Reports Server (NTRS)

    Dexter, C. E.; Mccay, T. D.

    1986-01-01

    Detonations were experienced in the Space Shuttle Main Engine fuel preburner (FPB) augmented spark igniter (ASI) during engine cutoff. Several of these resulted in over pressures sufficient to damage the FPB ASI oxidizer system. The detonations initiated in the FPB ASI oxidizer line when residual oxidizer (oxygen) in the line mixed with backflowing fuel (hydrogen) and detonated. This paper reviews the damage history to the FPB ASI oxidizer system, an engineering assessment of the problem cause, a verification of the mechanisms, the hazards associated with the detonations, and the solution implemented.

  19. V-TIME: a treadmill training program augmented by virtual reality to decrease fall risk in older adults: study design of a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Recent work has demonstrated that fall risk can be attributed to cognitive as well as motor deficits. Indeed, everyday walking in complex environments utilizes executive function, dual tasking, planning and scanning, all while walking forward. Pilot studies suggest that a multi-modal intervention that combines treadmill training to target motor function and a virtual reality obstacle course to address the cognitive components of fall risk may be used to successfully address the motor-cognitive interactions that are fundamental for fall risk reduction. The proposed randomized controlled trial will evaluate the effects of treadmill training augmented with virtual reality on fall risk. Methods/Design Three hundred older adults with a history of falls will be recruited to participate in this study. This will include older adults (n=100), patients with mild cognitive impairment (n=100), and patients with Parkinson’s disease (n=100). These three sub-groups will be recruited in order to evaluate the effects of the intervention in people with a range of motor and cognitive deficits. Subjects will be randomly assigned to the intervention group (treadmill training with virtual reality) or to the active-control group (treadmill training without virtual reality). Each person will participate in a training program set in an outpatient setting 3 times per week for 6 weeks. Assessments will take place before, after, and 1 month and 6 months after the completion of the training. A falls calendar will be kept by each participant for 6 months after completing the training to assess fall incidence (i.e., the number of falls, multiple falls and falls rate). In addition, we will measure gait under usual and dual task conditions, balance, community mobility, health related quality of life, user satisfaction and cognitive function. Discussion This randomized controlled trial will demonstrate the extent to which an intervention that combines treadmill training augmented

  20. [Research progress in peri-implant soft tissue engineering augmentation method].

    PubMed

    Pei, T T; Yu, H Q; Wen, C J; Guo, T Q; Zhou, Y M; Peng, H M

    2016-05-01

    The sufficiency of hard and soft tissue at the implant site is the guarantee of long-term function, health, and appearance of an implant denture. The problem of soft-tissue recession at the implant site has long troubled dentists. Traditional methods for soft-tissue augmentation, such as gingival grafting, suffer from instability of the augmented soft tissue and greater trauma. Recently, methods based on tissue engineering to augment the soft tissue at peri-implant sites have drawn great attention. This review focuses on current tissue-engineering methods for peri-implant restoration, covering seed cells, biological scaffolds, and cytokines. PMID:27220393

  1. A CT-ultrasound-coregistered augmented reality enhanced image-guided surgery system and its preliminary study on brain-shift estimation

    NASA Astrophysics Data System (ADS)

    Huang, C. H.; Hsieh, C. H.; Lee, J. D.; Huang, W. C.; Lee, S. T.; Wu, C. T.; Sun, Y. N.; Wu, Y. T.

    2012-08-01

    With a combined view of the physical space and medical imaging data, augmented reality (AR) visualization can provide perceptual advantages during image-guided surgery (IGS). However, the imaging data are usually captured before surgery and may differ from the up-to-date anatomy due to natural shift of soft tissues. This study presents an AR-enhanced IGS system that is capable of correcting the movement of soft tissues in the pre-operative CT images by using intra-operative ultrasound images. First, after reconstructing 2-D free-hand ultrasound images into a 3-D volume, the system applies a mutual-information-based registration algorithm to estimate the deformation between pre-operative and intra-operative ultrasound images. The estimated deformation transform describes the movement of soft tissues and is then applied to the pre-operative CT images, which provide high-resolution anatomical information. As a result, the system displays the fusion of the corrected CT images, or the real-time 2-D ultrasound images, with the patient in physical space through a head-mounted display device, providing an immersive augmented-reality environment. To validate the performance of the proposed system, a brain phantom was used to simulate a brain-shift scenario. Experimental results reveal that for shifts of an artificial tumor from 5 mm to 12 mm, the correction rates can be improved from 32%-45% to 87%-95% by using the proposed system.
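
    The mutual-information similarity measure behind the registration step can be sketched generically; this is the textbook joint-histogram formulation, not the authors' implementation:

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 32) -> float:
    """Mutual information between two equally-shaped images via a joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of img_b
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# An image shares maximal information with itself and little with independent noise,
# which is why MI peaks when two volumes are correctly aligned.
rng = np.random.default_rng(0)
a = rng.random((64, 64))
print(mutual_information(a, a) > mutual_information(a, rng.random((64, 64))))  # -> True
```

A registration algorithm would wrap this metric in an optimizer that searches over deformation parameters for the transform maximizing MI.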

  2. Effects of Mobile Augmented Reality Learning Compared to Textbook Learning on Medical Students: Randomized Controlled Pilot Study

    PubMed Central

    2013-01-01

    Background: By adding new levels of experience, mobile augmented reality (mAR) can significantly increase the attractiveness of mobile learning applications in medical education. Objective: To compare the impact of the heightened realism of a self-developed mAR blended learning environment (mARble) with that of textbook material, especially for ethically sensitive subjects such as forensic medicine, while taking into account basic psychological aspects (usability and a higher level of emotional involvement) as well as learning outcomes (increased learning efficiency). Methods: A prestudy was conducted on a convenience sample of 10 third-year medical students. The initial emotional status was captured using the "Profile of Mood States" questionnaire (POMS, German variant); previous knowledge of forensic medicine was determined using a 10-item single-choice (SC) test. For the 30-minute learning period, the students were randomized into two groups: the first consisted of pairs of students, each pair equipped with one iPhone with a preinstalled copy of mARble, while the second was provided with textbook material. Afterward, both groups again completed the POMS questionnaire and the SC test to measure changes in emotional state and knowledge gain. Usability as well as the pragmatic and hedonic qualities of the learning material were captured using AttrakDiff2 questionnaires. Data evaluation was conducted anonymously. Descriptive statistics for the total score and the subgroups were calculated before and after the intervention. The scores of the two groups were tested against each other using paired and unpaired signed-rank tests. An item analysis was performed for the SC test to objectify difficulty and selectivity. Results: The mARble group (6/10) showed statistically significantly greater knowledge gain than the control group (4/10) (Wilcoxon z=2.232, P=.03). The item analysis of the SC test showed a difficulty of P=0.768 (s=0.09) and a
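
    The item analysis mentioned in the Methods (difficulty and selectivity of each single-choice item) is standard psychometrics: difficulty is the fraction of students answering an item correctly, and selectivity can be estimated as the corrected item-total correlation. A sketch with hypothetical response data:

```python
import numpy as np

def item_difficulty(correct: np.ndarray) -> float:
    """Difficulty P of one item: fraction of students answering correctly (0/1 vector)."""
    return float(correct.mean())

def item_selectivity(correct: np.ndarray, rest_of_test_score: np.ndarray) -> float:
    """Corrected item-total correlation: item responses vs. rest-of-test score."""
    return float(np.corrcoef(correct, rest_of_test_score)[0, 1])

# Hypothetical 0/1 responses from 8 students on one item, plus rest-of-test scores.
item = np.array([1, 1, 0, 1, 1, 0, 1, 1])
rest = np.array([8, 7, 3, 9, 6, 2, 7, 8])
print(round(item_difficulty(item), 3))   # -> 0.75
print(item_selectivity(item, rest) > 0)  # -> True
```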

  3. NACA Conference on Turbojet-Engine Thrust Augmentation Research: A Compilation of the Papers Presented by NACA Staff Members

    NASA Technical Reports Server (NTRS)

    1948-01-01

    The conference on Turbojet-Engine Thrust-Augmentation Research was organized by the NACA to present, in summarized form, the results of the latest experimental and analytical investigations conducted at the Lewis Flight Propulsion Laboratory on methods of augmenting the thrust of turbojet engines. The technical discussions are reproduced herewith in the same form in which they were presented. The original presentations in this record are considered complementary to, rather than substitutes for, the committee's system of complete and formal reports.

  4. Study of ejector geometry on thrust augmentation for pulse detonation engine ejector systems

    NASA Astrophysics Data System (ADS)

    Shehadeh, Ra'fat

    Pulse detonation engine (PDE) technology is a novel form of propulsion that offers the potential of high-efficiency combustion with reduced hardware complexity. Although the primary interest of research in the pulse detonation engine field is directed towards overcoming the problems associated with operating a pure PDE system, there are other worthy options to be considered for these engines. The PDE-driven ejector concept is one such option, where the system would be part of a hybrid PD/turbofan engine. This system offers the promise of replacing the high-pressure turbine sections of the core of a high-bypass turbofan engine. The purpose of the current research is to investigate the thrust augmentation capabilities of a PDE-driven ejector and provide experimental data that would assist in understanding the behavior of such a system. The major potential advantages of the PDE-ejector include reduced costs due to reduced engine weight, along with the improved specific fuel consumption and specific power inherent in the incorporation of a PDE component. To achieve the goal of this research, the thrust augmentation of a PDE-driven ejector was characterized for a set of configurations. Two separate PDEs were utilized in this study. The first PDE was capable of operating only at a constant frequency of 10 Hz due to flow rate limitations; a second PDE was built with an operational frequency range of 10 Hz-70 Hz to test the effect of operational frequency on PDE-ejector systems. Optical diagnostics were employed at specific positions of interest to understand the physical behavior of the flow. Baseline experimental results helped define and understand the operational characteristics of the PDEs utilized in this study. Thrust measurements were then made for PDE-driven ejector configurations. The parameters that were independently varied were the inlet geometry of a constant-diameter ejector and the overlap distance between the PDE tube exit and ejector tube inlet

  5. Engine Yaw Augmentation for Hybrid-Wing-Body Aircraft via Optimal Control Allocation Techniques

    NASA Technical Reports Server (NTRS)

    Taylor, Brian R.; Yoo, Seung-Yeun

    2011-01-01

    Asymmetric engine thrust was implemented in a hybrid-wing-body non-linear simulation to reduce the amount of aerodynamic surface deflection required for yaw stability and control. Hybrid-wing-body aircraft are especially susceptible to yaw surface deflection due to their decreased bare airframe yaw stability resulting from the lack of a large vertical tail aft of the center of gravity. Reduced surface deflection, especially for trim during cruise flight, could reduce the fuel consumption of future aircraft. Designed as an add-on, optimal control allocation techniques were used to create a control law that tracks total thrust and yaw moment commands with an emphasis on not degrading the baseline system. Implementation of engine yaw augmentation is shown and feasibility is demonstrated in simulation with a potential drag reduction of 2 to 4 percent. Future flight tests are planned to demonstrate feasibility in a flight environment.
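
    The control-allocation idea described above, tracking total-thrust and yaw-moment commands while staying close to a preferred (symmetric) engine setting, can be sketched as a minimum-deviation least-squares problem. The two-engine effectiveness matrix and all numbers below are illustrative, not the flight control law:

```python
import numpy as np

def allocate(cmd: np.ndarray, B: np.ndarray, u_pref: np.ndarray) -> np.ndarray:
    """Minimum-deviation allocation: the u closest to u_pref satisfying B @ u == cmd."""
    return u_pref + np.linalg.pinv(B) @ (cmd - B @ u_pref)

# Two engines at lateral offsets -l and +l from the centerline (illustrative layout).
l = 10.0                                  # moment arm, m
B = np.array([[1.0, 1.0],                 # row 0: total thrust
              [-l,  l]])                  # row 1: yaw moment
u_pref = np.array([50.0, 50.0])           # symmetric baseline thrust, kN
cmd = np.array([100.0, 40.0])             # hold total thrust, command +40 kN*m yaw moment
u = allocate(cmd, B, u_pref)
print(np.round(u, 3))                     # -> [48. 52.]
```

Differential thrust of +/-2 kN here produces the commanded yaw moment without changing total thrust, which is what lets the aerodynamic yaw surfaces deflect less.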

  6. Development and Initial Validation of a Virtual Reality Haptically Augmented Surgical Knot-Tying Trainer for the Autosuture™ ENDOSTITCH™ Instrument

    PubMed Central

    KURENOV, Sergei; PUNAK, Sukitti; PETERS, Jorg; LEE, Constance; CENDAN, Juan

    2014-01-01

    The Autosuture™ Endostitch™ device (Covidien, CT) is difficult to learn. In particular, the handle requires the use of a toggle which is unique to this instrument. We have developed a virtual reality trainer for the device that uses the actual instrument handle while rendering a visible virtual instrument tip, complete with virtual needle and suture, on a monitor. This report describes the development and initial validation experiments for the device. PMID:19377135

  7. STS-55 pad abort: Engine 2011 oxidizer preburner augmented spark igniter check valve leak

    NASA Astrophysics Data System (ADS)

    1993-03-01

    The STS-55 initial launch attempt of Columbia (OV-102) was terminated on KSC launch pad A on March 22, 1993 at 9:51 AM EST due to an ME-3 (Engine 2011) Launch Commit Criteria (LCC) limit exceedance. The event description and timeline are summarized. Propellant loading was initiated on 22 March 1993 at 1:15 AM EST. All SSME chill parameters and LCC were nominal. At engine start plus 1.44 seconds, a Failure Identification (FID) was posted against Engine 2011 for exceeding the 50 psia Oxidizer Preburner (OPB) purge pressure redline. The engine was shut down at 1.50 seconds, followed by Engines 2034 and 2030. All shutdown sequences were nominal and the mission was safely aborted. The OPB purge pressure redline violation and the abort profile/overlay for all three engines are depicted. SSME avionics hardware and software performed nominally during the incident. A review of vehicle data table (VDT) data and controller software logic revealed no failure indications other than the single FID 013-414, OPB purge pressure redline exceeded. Software logic was executed according to requirements and there was no anomalous controller software operation. Immediately following the abort, a Rocketdyne/NASA failure investigation team was assembled. The team isolated the failure cause to the oxidizer preburner augmented spark igniter purge check valve not being fully closed due to contamination. The source of the contaminant was traced to a cut segment from a rubber O-ring which was used in a fine-clean tool during valve production prior to 1992. The valve was apparently contaminated during its fabrication in 1985. The valve had performed acceptably on four previous flights of the engine, and SSME flight history shows 780 combined check valve flights without failure. The failure of an Engine 3 (SSME No. 2011) check valve to close was sensed by onboard engine instruments even though all other engine operations were normal. This

  8. STS-55 pad abort: Engine 2011 oxidizer preburner augmented spark igniter check valve leak

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The STS-55 initial launch attempt of Columbia (OV-102) was terminated on KSC launch pad A on March 22, 1993 at 9:51 AM EST due to an ME-3 (Engine 2011) Launch Commit Criteria (LCC) limit exceedance. The event description and timeline are summarized. Propellant loading was initiated on 22 March 1993 at 1:15 AM EST. All SSME chill parameters and LCC were nominal. At engine start plus 1.44 seconds, a Failure Identification (FID) was posted against Engine 2011 for exceeding the 50 psia Oxidizer Preburner (OPB) purge pressure redline. The engine was shut down at 1.50 seconds, followed by Engines 2034 and 2030. All shutdown sequences were nominal and the mission was safely aborted. The OPB purge pressure redline violation and the abort profile/overlay for all three engines are depicted. SSME avionics hardware and software performed nominally during the incident. A review of vehicle data table (VDT) data and controller software logic revealed no failure indications other than the single FID 013-414, OPB purge pressure redline exceeded. Software logic was executed according to requirements and there was no anomalous controller software operation. Immediately following the abort, a Rocketdyne/NASA failure investigation team was assembled. The team isolated the failure cause to the oxidizer preburner augmented spark igniter purge check valve not being fully closed due to contamination. The source of the contaminant was traced to a cut segment from a rubber O-ring which was used in a fine-clean tool during valve production prior to 1992. The valve was apparently contaminated during its fabrication in 1985. The valve had performed acceptably on four previous flights of the engine, and SSME flight history shows 780 combined check valve flights without failure. The failure of an Engine 3 (SSME No. 2011) check valve to close was sensed by onboard engine instruments even though all other engine operations were normal. This

  9. Engineered Nasal Cartilage by Cell Homing: A Model for Augmentative and Reconstructive Rhinoplasty

    PubMed Central

    Mendelson, Avital; Ahn, Jeffrey M.; Paluch, Kamila; Embree, Mildred C.; Mao, Jeremy J.

    2014-01-01

    Current augmentative and reconstructive rhinoplasty surgeries utilize autologous tissue grafts or synthetic bioinert materials to repair nasal trauma or attain an aesthetic shape. Autologous grafts are associated with donor-site trauma and morbidity. Synthetic materials are widely used but often yield an unnatural appearance and are prone to infection or dislocation. There is an acute clinical need for the generation of native tissues to serve as rhinoplasty grafts without the undesirable features associated with autologous grafts or current synthetic materials. Here, we developed a bioactive scaffold that not only recruited cells in the nasal dorsum in vivo, but also induced chondrogenesis of the recruited cells. Bilayered scaffolds were fabricated with alginate containing gelatin microspheres encapsulating cytokines atop a porous poly(lactic-co-glycolic acid) (PLGA) base. Gelatin microspheres were fabricated to contain recombinant human TGFβ3 at doses of 200, 500, or 1000 ng, with PBS-loaded microspheres as a control. We first created a rat model of augmentation rhinoplasty by implanting bilayered scaffolds atop the native nasal cartilage surface, which was scored to induce cell migration. Tissue formation and chondrogenesis in PLGA scaffolds were evaluated by image analysis and histological staining with hematoxylin and eosin, toluidine blue, and Verhoeff-Van Gieson stains, and by aggrecan immunohistochemistry. Sustained release of increasing doses of TGFβ3 for up to the 10 weeks tested promoted orthotopic cartilage-like tissue formation in a dose-dependent manner. These findings represent the first attempt to engineer cartilage tissue by cell homing for rhinoplasty, and they combine the advantage of autologous tissue formation induced by cytotactic factors embedded in a biomaterial scaffold that could potentially serve as an alternative material for augmentative and reconstructive rhinoplasty. PMID:24867716

  10. Impact of distributed virtual reality on engineering knowledge retention and student engagement

    NASA Astrophysics Data System (ADS)

    Sulbaran, Tulio Alberto

    Engineering education faces many problems, one of which is poor knowledge retention among engineering students. This problem affects the Architecture, Engineering, and Construction (A/E/C) industry, because students arrive unprepared for many necessary job skills. Poor knowledge retention has many causes, one of which is the mismatch between student learning preferences and the media used to teach engineering. The purpose of this research is to assess the impact of Distributed Virtual Reality (DVR) as an engineering teaching tool. The implementation of DVR addresses the issue of poor knowledge retention by targeting the mismatch between learning and teaching styles on the visual-versus-verbal spectrum. Using three knowledge domain areas as a point of departure (learning and instruction, distributed virtual reality, and crane selection as part of crane lift planning), a DVR engineering teaching tool was developed, deployed, and assessed in engineering classrooms. The statistical analysis of the data indicates that: (1) most engineering students are visual learners; (2) most students would like more classes using DVR; (3) engineering students find DVR more engaging than traditional learning methods; (4) most students find the responsiveness of the DVR environments to be good or very good; (5) all students were able to interact with DVR, and most found it easy or very easy to navigate without previous formal training in how to use DVR; (6) students' knowledge of the subject (crane selection) was higher after the experiment; and (7) students using different instructional media did not demonstrate a statistically significant difference in knowledge retained after the experiment. This inter-disciplinary research offers opportunities for direct and immediate application in education, research, and industry, because the instructional module developed (on crane selection as part of construction crane lift planning) can be

  11. Treatment of phantom limb pain (PLP) based on augmented reality and gaming controlled by myoelectric pattern recognition: a case study of a chronic PLP patient

    PubMed Central

    Ortiz-Catalan, Max; Sander, Nichlas; Kristoffersen, Morten B.; Håkansson, Bo; Brånemark, Rickard

    2014-01-01

    A variety of treatments have historically been used to alleviate phantom limb pain (PLP) with varying efficacy. Recently, virtual reality (VR) has been employed as a more sophisticated mirror therapy. Despite the advantages of VR over a conventional mirror, this approach has retained the use of the contralateral limb and is therefore restricted to unilateral amputees. Moreover, this strategy disregards the actual effort made by the patient to produce phantom motions. In this work, we investigate a treatment in which the virtual limb responds directly to myoelectric activity at the stump, while the illusion of a restored limb is enhanced through augmented reality (AR). Further, phantom motions are facilitated and encouraged through gaming. The proposed set of technologies was administered to a chronic PLP patient who had shown resistance to a variety of treatments (including mirror therapy) for 48 years. Individual and simultaneous phantom movements were predicted using myoelectric pattern recognition and were then used as input for VR and AR environments, as well as for a racing game. The sustained level of pain reported by the patient was gradually reduced to complete pain-free periods. The phantom posture initially reported as a strongly closed fist was gradually relaxed, interestingly resembling the neutral posture displayed by the virtual limb. The patient acquired the ability to freely move his phantom limb, and a telescopic effect was observed where the position of the phantom hand was restored to the anatomically correct distance. More importantly, the effect of the interventions was positively and noticeably perceived by the patient and his relatives. Despite the limitation of a single case study, the successful results of the proposed system in a patient for whom other medical and non-medical treatments have been ineffective justify and motivate further investigation in a wider study. PMID:24616655
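
    The myoelectric pattern recognition step, predicting phantom movements from stump EMG, can be illustrated with a deliberately simplified sketch: mean-absolute-value features and a nearest-centroid classifier on synthetic signals. The feature set, classifier, and data here are illustrative; the study's actual pipeline is not reproduced:

```python
import numpy as np

def mav_features(emg_window: np.ndarray) -> np.ndarray:
    """Mean absolute value per channel for one window of raw EMG (channels x samples)."""
    return np.mean(np.abs(emg_window), axis=1)

class NearestCentroid:
    """Toy classifier: one centroid of MAV features per phantom movement class."""
    def fit(self, windows, labels):
        feats = np.array([mav_features(w) for w in windows])
        labels = np.array(labels)
        self.classes_ = np.unique(labels)
        self.centroids_ = np.array([feats[labels == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, window):
        d = np.linalg.norm(self.centroids_ - mav_features(window), axis=1)
        return self.classes_[int(np.argmin(d))]

# Synthetic 4-channel EMG: low-amplitude windows labeled "open", high-amplitude "close".
rng = np.random.default_rng(1)
open_w  = [rng.normal(0, 0.2, (4, 200)) for _ in range(5)]
close_w = [rng.normal(0, 1.0, (4, 200)) for _ in range(5)]
clf = NearestCentroid().fit(open_w + close_w, ["open"] * 5 + ["close"] * 5)
print(clf.predict(rng.normal(0, 1.0, (4, 200))))  # -> close
```

In the study, the predicted movement class drives the virtual limb in the VR/AR environments and the racing game in real time.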

  12. Treatment of phantom limb pain (PLP) based on augmented reality and gaming controlled by myoelectric pattern recognition: a case study of a chronic PLP patient.

    PubMed

    Ortiz-Catalan, Max; Sander, Nichlas; Kristoffersen, Morten B; Håkansson, Bo; Brånemark, Rickard

    2014-01-01

    A variety of treatments have historically been used to alleviate phantom limb pain (PLP) with varying efficacy. Recently, virtual reality (VR) has been employed as a more sophisticated mirror therapy. Despite the advantages of VR over a conventional mirror, this approach has retained the use of the contralateral limb and is therefore restricted to unilateral amputees. Moreover, this strategy disregards the actual effort made by the patient to produce phantom motions. In this work, we investigate a treatment in which the virtual limb responds directly to myoelectric activity at the stump, while the illusion of a restored limb is enhanced through augmented reality (AR). Further, phantom motions are facilitated and encouraged through gaming. The proposed set of technologies was administered to a chronic PLP patient who had shown resistance to a variety of treatments (including mirror therapy) for 48 years. Individual and simultaneous phantom movements were predicted using myoelectric pattern recognition and were then used as input for VR and AR environments, as well as for a racing game. The sustained level of pain reported by the patient was gradually reduced to complete pain-free periods. The phantom posture initially reported as a strongly closed fist was gradually relaxed, interestingly resembling the neutral posture displayed by the virtual limb. The patient acquired the ability to freely move his phantom limb, and a telescopic effect was observed where the position of the phantom hand was restored to the anatomically correct distance. More importantly, the effect of the interventions was positively and noticeably perceived by the patient and his relatives. Despite the limitation of a single case study, the successful results of the proposed system in a patient for whom other medical and non-medical treatments have been ineffective justify and motivate further investigation in a wider study. PMID:24616655

  13. Utility of a Three-Dimensional Interactive Augmented Reality Program for Balance and Mobility Rehabilitation in the Elderly: A Feasibility Study

    PubMed Central

    Im, Dal Jae; Ku, Jeunghun; Kim, Yeun Joon; Cho, Sangwoo; Cho, Yun Kyung; Lim, Teo; Lee, Hye Sun; Kim, Hyun Jung

    2015-01-01

    Objective To improve lower extremity function and balance in elderly persons, we developed a novel, three-dimensional interactive augmented reality system (3D ARS). In this feasibility study, we assessed clinical and kinematic improvements, user participation, and the side effects of our system. Methods Eighteen participants (age, 56-76 years) capable of walking independently and standing on one leg were recruited. The participants received 3D ARS training during 10 sessions (30-minute duration each) for 4 weeks. Berg Balance Scale (BBS) and the Timed Up and Go (TUG) scores were obtained before and after the exercises. Outcome performance variables, including response time and success rate, and kinematic variables, such as hip and knee joint angle, were evaluated after each session. Results Participants exhibited significant clinical improvements in lower extremity balance and mobility following the intervention, as shown by improved BBS and TUG scores (p<0.001). Consistent kinematic improvements in the maximum joint angles of the hip and knee were observed across sessions. Outcome performance variables, such as success rate and response time, improved gradually across sessions, for each exercise. The level of participant interest also increased across sessions (p<0.001). All participants completed the program without experiencing any adverse effects. Conclusion Substantial clinical and kinematic improvements were observed after applying a novel 3D ARS training program, suggesting that this system can enhance lower extremity function and facilitate assessments of lower extremity kinematic capacity. PMID:26161353

  14. Informatics in radiology: Intuitive user interface for 3D image manipulation using augmented reality and a smartphone as a remote control.

    PubMed

    Nakata, Norio; Suzuki, Naoki; Hattori, Asaki; Hirai, Naoya; Miyamoto, Yukio; Fukuda, Kunihiko

    2012-01-01

    Although widely used as a pointing device on personal computers (PCs), the mouse was originally designed for control of two-dimensional (2D) cursor movement and is not suited to complex three-dimensional (3D) image manipulation. Augmented reality (AR) is a field of computer science that involves combining the physical world and an interactive 3D virtual world; it represents a new 3D user interface (UI) paradigm. A system for 3D and four-dimensional (4D) image manipulation has been developed that uses optical tracking AR integrated with a smartphone remote control. The smartphone is placed in a hard case (jacket) with a 2D printed fiducial marker for AR on the back. It is connected to a conventional PC with an embedded Web camera by means of WiFi. The touch screen UI of the smartphone is then used as a remote control for 3D and 4D image manipulation. Using this system, the radiologist can easily manipulate 3D and 4D images from computed tomography and magnetic resonance imaging in an AR environment with high-quality image resolution. Pilot assessment of this system suggests that radiologists will be able to manipulate 3D and 4D images in the reading room in the near future. Supplemental material available at http://radiographics.rsna.org/lookup/suppl/doi:10.1148/rg.324115086/-/DC1. PMID:22556316
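
    The smartphone-as-remote-control idea amounts to serializing touch events over the WiFi link and mapping them to 3D manipulation commands on the PC. A minimal sketch, where the JSON message format and the gain are hypothetical (the paper does not publish its wire protocol):

```python
import json

def touch_to_rotation(msg: str, degrees_per_pixel: float = 0.25) -> dict:
    """Map one swipe event (dx, dy in screen pixels) to 3D-volume rotation angles (degrees)."""
    event = json.loads(msg)
    return {
        "yaw":   event["dx"] * degrees_per_pixel,   # horizontal swipe spins the volume
        "pitch": event["dy"] * degrees_per_pixel,   # vertical swipe tilts it
    }

# A 120-pixel rightward swipe from the phone, sent as JSON over the WiFi link.
packet = json.dumps({"type": "swipe", "dx": 120, "dy": -40})
print(touch_to_rotation(packet))  # -> {'yaw': 30.0, 'pitch': -10.0}
```

In the actual system these rotation commands would be combined with the AR pose estimated from the fiducial marker on the phone's jacket.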

  15. Heard on The Street: GIS-Guided Immersive 3D Models as an Augmented Reality for Team Collaboration

    NASA Astrophysics Data System (ADS)

    Quinn, B. B.

    2007-12-01

    ] (x,y,z meters). Visiting this model amounts to going inside a map, seeing oneself there, and seeing, gesturing with, and speaking to other visitors in the same space. The Second Life viewer client is free, and the model is hosted on a vast publicly accessible grid that, as of 1 September 2007, was simulating 846 square kilometers and frequently had over 40,000 simultaneous users. For reference, this work uses less than 1/200,000 of the total grid. For cost savings, GIS was used to construct the model at 1/3 scale, so that some 35,000 square meters of Berkeley were modeled in less than 4,000 square meters of simulator space. Our work in Second Life "Gualala" has shown that it is feasible to use real-life GIS data to guide the construction of a spatially accurate model that reflects the built surface and underground environment. Groups of visitors may position their proxy bodies, or avatars, on a street corner and converse, greatly augmenting the experience of a conference call. Since each visitor controls their own camera position in real time, the model considerably augments a video conference call and can permit individuals to manipulate 3D objects as part of a demonstration or discussion.
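
    The 1/3 linear scale quoted above implies a 1/9 area ratio, which is consistent with 35,000 square meters of city fitting in under 4,000 square meters of simulator space:

```python
linear_scale = 1 / 3
real_area_m2 = 35_000
model_area_m2 = real_area_m2 * linear_scale**2   # area scales with the square of linear scale
print(round(model_area_m2, 1))  # -> 3888.9
```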

  16. Development Education and Engineering: A Framework for Incorporating Reality of Developing Countries into Engineering Studies

    ERIC Educational Resources Information Center

    Perez-Foguet, A.; Oliete-Josa, S.; Saz-Carranza, A.

    2005-01-01

    Purpose: To show the key points of a development education program for engineering studies fitted within the framework of the human development paradigm. Design/methodology/approach: The bases of the concept of technology for human development are presented, and the relationship with development education analysed. Special attention is dedicated…

  17. New weather depiction technology for night vision goggle (NVG) training: 3D virtual/augmented reality scene-weather-atmosphere-target simulation

    NASA Astrophysics Data System (ADS)

    Folaron, Michelle; Deacutis, Martin; Hegarty, Jennifer; Vollmerhausen, Richard; Schroeder, John; Colby, Frank P.

    2007-04-01

    US Navy and Marine Corps pilots receive Night Vision Goggle (NVG) training as part of their overall training to maintain the superiority of our forces. This training must incorporate realistic targets, backgrounds, and representative atmospheric and weather effects they may encounter under operational conditions. One approach for pilot NVG training is to use the Night Imaging and Threat Evaluation Laboratory (NITE Lab) concept. The NITE Labs utilize a 10' by 10' static terrain model equipped with both natural and cultural lighting that is used to demonstrate various illumination conditions and visual phenomena which might be experienced when utilizing night vision goggles. With this technology, the military can safely, systematically, and reliably expose pilots to the large number of potentially dangerous environmental conditions that will be experienced in their NVG training flights. A previous SPIE presentation described our work for NAVAIR to add realistic atmospheric and weather effects to the NVG NITE Lab training facility using the NVG-WDT (Weather Depiction Technology) system (Colby, et al.). NVG-WDT consists of a high-end multiprocessor server with weather simulation software, and several fixed and goggle-mounted heads-up displays (HUDs). Atmospheric and weather effects are simulated using state-of-the-art computer codes such as the WRF (Weather Research and Forecasting) model and the US Air Force Research Laboratory MODTRAN radiative transport model. Imagery for a variety of natural and man-made obscurations (e.g., rain, clouds, snow, dust, smoke, chemical releases) is calculated and injected into the scene observed through the NVG via the fixed and goggle-mounted HUDs. This paper expands on the work described in the previous presentation and describes the 3D Virtual/Augmented Reality Scene-Weather-Atmosphere-Target Simulation part of the NVG-WDT. The 3D virtual reality software is a complete simulation system to generate realistic

  18. Biologically inspired robotic inspectors: the engineering reality and future outlook (Keynote address)

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph

    2005-04-01

    Human errors have long been recognized as a major factor in the reliability of nondestructive evaluation results. To minimize such errors, there is an increasing reliance on automatic inspection tools that allow faster and more consistent tests. Crawlers and various manipulation devices are commonly used to perform a variety of inspection procedures, including C-scans with contour-following capability to rapidly inspect complex structures. The emergence of robots has been the result of the need to deal with parts that are too complex to handle by a simple automatic system. Economic factors continue to hamper the wide use of robotics for inspection applications; however, technology advances are increasingly changing this paradigm. Autonomous robots, which may look like humans, can potentially address the need to inspect structures with configurations that are not predetermined. The operation of such robots that mimic biology may take place in harsh or hazardous environments that are too dangerous for human presence. Biomimetic technologies such as artificial intelligence, artificial muscles, artificial vision, and numerous others are increasingly becoming common engineering tools. Inspired by science fiction, making biomimetic robots is increasingly becoming an engineering reality, and in this paper the state of the art will be reviewed and the outlook for the future will be discussed.

  19. Wireless Augmented Reality Communication System

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2015-01-01

    A portable unit provides video communication to a selected user name in a user name network. A transceiver wirelessly accesses a communication network through a wireless connection to a general-purpose node coupled to the communication network. A user interface can receive user input to log on to a user name network through the communication network. The user name network has a plurality of user names, at least one of which is associated with a remote portable unit that is logged on to the user name network and available for video communication.

  20. Wireless Augmented Reality Communication System

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2014-01-01

    The system of the present invention is a highly integrated radio communication system with a multimedia co-processor which allows true two-way multimedia (video, audio, data) access as well as real-time biomedical monitoring in a pager-sized portable access unit. The system is integrated in a network structure including one or more general purpose nodes for providing a wireless-to-wired interface. The network architecture allows video, audio and data (including biomedical data) streams to be connected directly to external users and devices. The portable access units may also be mated to various non-personal devices such as cameras or environmental sensors for providing a method for setting up wireless sensor nets from which reported data may be accessed through the portable access unit. The reported data may alternatively be automatically logged at a remote computer for access and viewing through a portable access unit, including the user's own.

  1. Wireless augmented reality communication system

    NASA Technical Reports Server (NTRS)

    Devereaux, Ann (Inventor); Jedrey, Thomas (Inventor); Agan, Martin (Inventor)

    2006-01-01

    The system of the present invention is a highly integrated radio communication system with a multimedia co-processor which allows true two-way multimedia (video, audio, data) access as well as real-time biomedical monitoring in a pager-sized portable access unit. The system is integrated in a network structure including one or more general purpose nodes for providing a wireless-to-wired interface. The network architecture allows video, audio and data (including biomedical data) streams to be connected directly to external users and devices. The portable access units may also be mated to various non-personal devices such as cameras or environmental sensors for providing a method for setting up wireless sensor nets from which reported data may be accessed through the portable access unit. The reported data may alternatively be automatically logged at a remote computer for access and viewing through a portable access unit, including the user's own.

  2. Design and implementation of a 3D ocean virtual reality and visualization engine

    NASA Astrophysics Data System (ADS)

    Chen, Ge; Li, Bo; Tian, Fenglin; Ji, Pengbo; Li, Wenqing

    2012-12-01

    In this study, a 3D virtual reality and visualization engine for rendering the ocean, named VV-Ocean, is designed for marine applications. The design goals of VV-Ocean are high-fidelity simulation of the ocean environment, visualization of massive and multidimensional marine data, and imitation of marine life. VV-Ocean is composed of five modules: a memory management module, a resources management module, a scene management module, a rendering process management module and an interaction management module. VV-Ocean has three core functions: reconstructing vivid virtual ocean scenes, visualizing real data dynamically in real time, and imitating and simulating marine life intuitively. Based on VV-Ocean, we establish a sea-land integration platform which can reproduce the drift and diffusion of an oil spill from the sea bottom to the surface. Environmental factors such as the ocean current and the wind field are considered in this simulation. On this platform the oil-spill process is abstracted as the movement of many oil particles. The results show that the oil particles blend well with the water and that the platform meets the requirements for real-time, interactive rendering. VV-Ocean can be widely used in ocean applications such as demonstrating marine operations, facilitating maritime communications, developing ocean games, reducing marine hazards, forecasting weather over the oceans, serving marine tourism, and so on. Finally, further technological improvements of VV-Ocean are discussed.
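The abstract above models the spill as the movement of many oil particles driven by the current and wind. A minimal sketch of such a particle-advection step is shown below; the function name, the surface wind-drift factor, and the diffusion coefficient are all illustrative assumptions, not details from the VV-Ocean paper.

```python
import random

def advect_particles(particles, current, wind, dt, wind_drift=0.03, diffusion=0.5):
    """Advance oil particles one time step.

    Each particle drifts with the ocean current plus a small fraction of the
    wind velocity (a common surface-drift approximation) and spreads by a
    random-walk diffusion term. Coefficients are illustrative only.
    """
    new_particles = []
    for (x, y) in particles:
        u = current[0] + wind_drift * wind[0]   # effective drift velocity (m/s)
        v = current[1] + wind_drift * wind[1]
        nx = x + u * dt + random.gauss(0.0, diffusion) * dt ** 0.5
        ny = y + v * dt + random.gauss(0.0, diffusion) * dt ** 0.5
        new_particles.append((nx, ny))
    return new_particles

# Seed a point release and advect it for one hour in 60 s steps.
particles = [(0.0, 0.0)] * 1000
for _ in range(60):
    particles = advect_particles(particles, current=(0.2, 0.0),
                                 wind=(5.0, 2.0), dt=60.0)
```

With the current and wind chosen above, the particle cloud drifts downstream while slowly spreading, which is the qualitative behavior the platform's real-time rendering has to keep up with.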

  3. Study on using a digital ride quality augmentation system to trim an engine-out in a Cessna 402B

    NASA Technical Reports Server (NTRS)

    Donaldson, K. E.

    1986-01-01

    A linear model of the Cessna 402B was used to determine whether the control power available to a Ride Quality Augmentation System was adequate to trim an engine-out condition. Two simulations were completed: one using a steady-state model, and the other using a state matrix model. The amount of rudder available was not sufficient in all cases to completely trim the airplane, but it was enough to give the pilot valuable reaction time. The system would add a measure of safety for only a relatively small development effort.

  4. Thrust Augmentation of a Turbojet Engine at Simulated Flight Conditions by Introduction of a Water-Alcohol Mixture into the Compressor

    NASA Technical Reports Server (NTRS)

    Useller, James W.; Auble, Carmon M.; Harvey, Ray W., Sr.

    1952-01-01

    An investigation was conducted at simulated high-altitude flight conditions to evaluate the use of compressor evaporative cooling as a means of turbojet-engine thrust augmentation. The performance of the engine was compared with water-alcohol injection at the compressor inlet, at the sixth stage of the compressor, and at the sixth and ninth stages. From consideration of the thrust increases achieved, interstage injection of the coolant was preferred over the combined sixth- and ninth-stage injection because of its relative simplicity. A maximum augmented net-thrust ratio of 1.106 and a maximum augmented jet-thrust ratio of 1.062 were obtained at an augmented liquid ratio of 2.98 and an engine-inlet temperature of 80 F. At lower inlet temperatures (-40 to 40 F), the maximum augmented net-thrust ratios ranged from 1.040 to 1.076 and the maximum augmented jet-thrust ratios ranged from 1.027 to 1.048, depending upon the inlet temperature. The relatively small increase in performance at the lower inlet-air temperatures can be partially attributed to the inadequate evaporation of the water-alcohol mixture, but the more significant limitation was believed to be the negative influence of the liquid coolant on engine-component performance. In general, it is concluded that the effectiveness of coolant injection into the compressor as a means of thrust augmentation is considerably influenced by the design characteristics of the components of the engine being used.

  5. Engineering a transformation of human-machine interaction to an augmented cognitive relationship.

    SciTech Connect

    Speed, Ann Elizabeth; Bernard, Michael Lewis; Abbott, Robert G.; Forsythe, James Chris; Xavier, Patrick Gordon; Brannon, Nathan Gregory

    2003-02-01

    This project is being conducted by Sandia National Laboratories in support of the DARPA Augmented Cognition program. Work commenced in April of 2002. The objective for the DARPA program is to 'extend, by an order of magnitude or more, the information management capacity of the human-computer warfighter.' Initially, emphasis has been placed on detection of an operator's cognitive state so that systems may adapt accordingly (e.g., adjust information throughput to the operator in response to workload). Work conducted by Sandia focuses on development of technologies to infer an operator's ongoing cognitive processes, with specific emphasis on detecting discrepancies between machine state and an operator's ongoing interpretation of events.

  6. Sensoria Patterns: Augmenting Service Engineering with Formal Analysis, Transformation and Dynamicity

    NASA Astrophysics Data System (ADS)

    Wirsing, Martin; Hölzl, Matthias; Acciai, Lucia; Banti, Federico; Clark, Allan; Fantechi, Alessandro; Gilmore, Stephen; Gnesi, Stefania; Gönczy, László; Koch, Nora; Lapadula, Alessandro; Mayer, Philip; Mazzanti, Franco; Pugliese, Rosario; Schroeder, Andreas; Tiezzi, Francesco; Tribastone, Mirco; Varró, Dániel

    The IST-FET Integrated Project Sensoria is developing a novel comprehensive approach to the engineering of service-oriented software systems where foundational theories, techniques and methods are fully integrated into pragmatic software engineering processes. The techniques and tools of Sensoria encompass the whole software development cycle, from business and architectural design, to quantitative and qualitative analysis of system properties, and to transformation and code generation. The Sensoria approach also takes into account reconfiguration of service-oriented architectures (SOAs) and re-engineering of legacy systems.

  7. Vicher: A Virtual Reality Based Educational Module for Chemical Reaction Engineering.

    ERIC Educational Resources Information Center

    Bell, John T.; Fogler, H. Scott

    1996-01-01

    A virtual reality application for undergraduate chemical kinetics and reactor design education, Vicher (Virtual Chemical Reaction Model), was originally designed to simulate a portion of a modern chemical plant. Vicher now consists of two programs: Vicher I, which models catalyst deactivation, and Vicher II, which models nonisothermal effects in…

  8. Turbofan forced mixer lobe flow modeling. Part 3: Application to augment engines

    NASA Technical Reports Server (NTRS)

    Barber, T.; Moore, G. C.; Blatt, J. R.

    1988-01-01

    Military engines frequently need large quantities of thrust for short periods of time. The addition of an augmentor can provide such thrust increases but with a penalty of increased duct length and engine weight. The addition of a forced mixer to the augmentor improves performance and reduces the penalty, as well as providing a method for siting the required flame holders. In this report two augmentor concepts are investigated: a swirl-mixer augmentor and a mixer-flameholder augmentor. Several designs for each concept are included and an experimental assessment of one of the swirl-mixer augmentors is presented.

  9. Greening the curriculum: augmenting engineering and technology courses with sustainability topics

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Duties of engineers and technologists often entail designing and implementing solutions to problems. It is their responsibility to be cognizant of the impacts of their designs on, and thus their accountability to, society in general. They must also be aware of subsequent effects upon the environment....

  10. Neuromuscular Junction Formation in Tissue-Engineered Skeletal Muscle Augments Contractile Function and Improves Cytoskeletal Organization

    PubMed Central

    Martin, Neil R.W.; Passey, Samantha L.; Player, Darren J.; Mudera, Vivek; Baar, Keith; Greensmith, Linda

    2015-01-01

    Neuromuscular and neurodegenerative diseases are conditions that affect both motor neurons and the underlying skeletal muscle tissue. At present, the majority of neuromuscular research utilizes animal models and there is a growing need to develop novel methodologies that can be used to help understand and develop treatments for these diseases. Skeletal muscle tissue-engineered constructs exhibit many of the characteristics of the native tissue such as accurate fascicular structure and generation of active contractions. However, to date, there has been little consideration toward the integration of engineered skeletal muscle with motor neurons with the aim of neuromuscular junction (NMJ) formation, which would provide a model to investigate neuromuscular diseases and basic biology. In the present work we isolated primary embryonic motor neurons and neonatal myoblasts from Sprague-Dawley rats, and cocultured the two cell types in three-dimensional tissue-engineered fibrin hydrogels with the aim of NMJ formation. Immunohistochemistry revealed myotube formation in a fascicular arrangement and neurite outgrowth from motor neuron cell bodies toward the aligned myotubes. Furthermore, colocalization of pre- and postsynaptic proteins and chemical inhibition of spontaneous myotube twitch indicated the presence of NMJs in the innervated constructs. When electrical field stimulation was employed to evoke isometric contractions, maximal twitch and tetanic force were higher in the constructs cocultured with motor neurons, which may, in part, be explained by improved myotube cytoskeletal organization in these constructs. The fabrication of such constructs may be useful tools for investigating neuromuscular pharmaceuticals and improving the understanding of neuromuscular pathologies. PMID:26166548

  11. Means and method for reducing differential pressure loading in an augmented gas turbine engine

    SciTech Connect

    Mayer, J.C.

    1991-12-10

    This patent describes a gas turbine engine having a fan, a core engine for generating combustion exhaust gases, a core outlet for discharging the exhaust gases, a bypass duct for channeling cooling bypass airflow from the fan and over the core engine, a duct outlet for discharging the bypass airflow, an augmentor including an annular combustion liner for receiving therein the exhaust gases and a first portion of the bypass airflow discharged from the duct outlet, and an annular plenum surrounding the liner and having an inlet spaced axially from the bypass duct outlet for receiving from the bypass duct outlet a second portion of the bypass airflow for cooling the augmentor liner. It comprises accelerating the bypass airflow discharged from the bypass duct outlet to a velocity greater than Mach 1 for providing accelerated bypass airflow to the augmentor; and decelerating the accelerated bypass airflow to a velocity less than Mach 1 at the plenum duct inlet for creating pressure losses due to shock waves to the bypass airflow second portion channeled to the plenum for reducing differential pressure acting across the liner.

  12. Performance Engineering Research Center and RECOVERY. Performance Engineering Research Institution SciDAC-e Augmentation. Performance enhancement

    SciTech Connect

    Hollingsworth, Jeffrey K.

    2015-10-12

    This project concentrated on various ways to improve the measurement and tuning of large-scale parallel applications. This project was a supplement to project DE-FC0206ER25763 (“Performance Engineering Research Center”). The research conducted during this project is summarized in this report. The complete details of the work are available in the ten publications listed at the end of the report. The project also supported the Ph.D. studies of three students and one research scientist.

  13. Computer Imagery and Neurological Rehabilitation: On the Use of Augmented Reality in Sensorimotor Training to Step Up Naturally Occurring Cortical Reorganization in Patients Following Stroke.

    PubMed

    Correa-Agudelo, Esteban; Ferrin, Carlos; Velez, Paulo; Gomez, Juan D

    2016-01-01

    This work promotes the use of computer-generated imagery, as visual illusions, to speed up motor learning in rehabilitation. In support of this, we adhere to the principles of experience-dependent neuroplasticity and the positive impact of virtual reality (VR) thereof. Specifically, post-stroke patients undergo motor therapy with a surrogate virtual limb that stands in for the paralyzed limb. Along these lines, their motor intentions match the visual evidence, which fosters physiological, functional and structural changes over time, for recovery of lost function in an injured brain. How we create such an illusion using computer graphics is central to this paper. PMID:27046556

  14. Virtual Reality: The New Era of Rehabilitation Engineering [From the Regional Editor].

    PubMed

    Tong, Shanbao

    2016-01-01

    Rehabilitation engineering refers to the development and application of techniques, devices, and protocols for restoring function following disability. Although in most cases the concept relates to motor functions (e.g., training after a stroke or the use of limb prosthetics), mental rehabilitation engineering is also an emerging area. PMID:27187532

  15. Learning from Experience: The Realities of Developing Mathematics Courses for an Online Engineering Programme

    ERIC Educational Resources Information Center

    Quinn, Diana; Albrecht, Amie; Webby, Brian; White, Kevin

    2015-01-01

    Rarely do university departments of mathematics redesign their basic mathematics courses. Through developing an online version of our associate degree in engineering in collaboration with Open Universities Australia, we redesigned the first in a sequence of five engineering mathematics courses. The online cohort proved different to our…

  16. Research Design Becomes Research Reality: Colorado School of Mines Implements Research Methodology for the Center for the Advancement of Engineering Education. Research Brief

    ERIC Educational Resources Information Center

    Loshbaugh, Heidi; Streveler, Ruth; Breaux, Kimberley

    2007-01-01

    The Center for the Advancement of Engineering Education was founded in 2003 with five collaborating institutions. A multi-institutional, multi-year grant offers many opportunities for the demands of reality to interfere with design goals. In particular, at Colorado School of Mines (CSM) student demographics required adjustment of the original APS…

  17. Synthetic Bone Substitute Engineered with Amniotic Epithelial Cells Enhances Bone Regeneration after Maxillary Sinus Augmentation

    PubMed Central

    Barboni, Barbara; Mangano, Carlo; Valbonetti, Luca; Marruchella, Giuseppe; Berardinelli, Paolo; Martelli, Alessandra; Muttini, Aurelio; Mauro, Annunziata; Bedini, Rossella; Turriani, Maura; Pecci, Raffaella; Nardinocchi, Delia; Zizzari, Vincenzo Luca; Tetè, Stefano; Piattelli, Adriano; Mattioli, Mauro

    2013-01-01

    Background: Evidence has been provided that a cell-based therapy combined with the use of bioactive materials may significantly improve bone regeneration prior to dental implant, although the identification of an ideal source of progenitor/stem cells remains to be determined. Aim: In the present research, the bone-regenerative property of an emerging source of progenitor cells, the amniotic epithelial cells (AEC), loaded on a calcium-phosphate synthetic bone substitute made by a direct rapid prototyping (rPT) technique, was evaluated in an animal study. Materials and Methods: Two blocks of synthetic bone substitute (∼0.14 cm^3), alone or engineered with 1×10^6 ovine AEC (oAEC), were grafted bilaterally into the maxillary sinuses of six adult sheep, an animal model chosen for its high translational value in dentistry. The sheep were then randomly divided into two groups and sacrificed at 45 and 90 days post implantation (p.i.). Tissue regeneration was evaluated in the sinus explants by micro-computed tomography (micro-CT) and by morphological, morphometric and biochemical analyses. Results and Conclusions: The data obtained suggest that scaffold integration and bone deposition are positively influenced by allotransplanted oAEC. Sinus explants derived from sheep grafted with oAEC-engineered scaffolds displayed a reduced fibrotic reaction, a limited inflammatory response and an accelerated process of angiogenesis. In addition, the presence of oAEC significantly stimulated osteogenesis either by enhancing bone deposition or by extending the foci of bone nucleation. Besides the modulatory role played by oAEC in the crucial events successfully guiding tissue regeneration (angiogenesis, vascular endothelial growth factor expression and inflammation), the data provided herein show that oAEC were also able to directly participate in the process of bone deposition, as suggested by the presence of oAEC entrapped within the newly deposited osteoid matrix and by their ability to switch

  18. Augmented anti-tumor effect of dendritic cells genetically engineered by interleukin-12 plasmid DNA.

    PubMed

    Yoshida, Masataka; Jo, Jun-Ichiro; Tabata, Yasuhiko

    2010-01-01

    The objective of this study was to genetically engineer dendritic cells (DC) for biological activation and to evaluate their anti-tumor activity in a tumor-bearing mouse model. Mouse DC were incubated on the surface of culture dishes which had been coated with complexes of a cationized dextran and luciferase plasmid DNA, plus a cell-adhesion protein, Pronectin, for gene transfection (reverse transfection). When compared with conventional transfection, where DC are transfected in medium containing the complexes, the level of gene expression by the reverse method was significantly higher and the time period of gene expression was prolonged. Following the reverse transfection of DC with a plasmid DNA of mouse interleukin-12 (mIL-12) complexed with the cationized dextran, the mIL-12 protein was secreted in larger amounts for a longer time period. When injected intratumorally into mice carrying a mass of B16 tumor cells, the genetically activated DC showed significant anti-tumor activity. PMID:20338099

  19. Experimental Investigation of Augmented Spark Ignition of a LO2/LCH4 Reaction Control Engine at Altitude Conditions

    NASA Technical Reports Server (NTRS)

    Kleinhenz, Julie; Sarmiento, Charles; Marshall, William

    2012-01-01

    The use of nontoxic propellants in future exploration vehicles would enable safer, more cost-effective mission scenarios. One promising green alternative to existing hypergols is liquid methane (LCH4) with liquid oxygen (LO2). A 100 lbf LO2/LCH4 engine was developed under the NASA Propulsion and Cryogenic Advanced Development project and tested at the NASA Glenn Research Center Altitude Combustion Stand in a low-pressure environment. High ignition energy is a perceived drawback of this propellant combination, so this ignition margin test program examined ignition performance versus delivered spark energy. Sensitivity of ignition to spark timing and repetition rate was also explored. Three different exciter units were used with the engine's augmented (torch) igniter. Captured waveforms indicated that spark behavior in hot-fire conditions was inconsistent compared to the well-behaved dry sparks. This suggests that rising pressure and flow rate increase spark impedance and may at some point compromise an exciter's ability to complete each spark. The reduced spark energies of such quenched deliveries resulted in more erratic ignitions, decreasing ignition probability. The timing of the sparks relative to the pressure/flow conditions also impacted the probability of ignition. Sparks occurring early in the flow could trigger ignition with energies as low as 1 to 6 mJ, though multiple, similarly timed sparks of 55 to 75 mJ were required for reliable ignition. Delayed spark application and reduced spark repetition rate both correlated with late and occasional failed ignitions. An optimum time interval for spark application and ignition therefore coincides with propellant introduction to the igniter.

  20. Myths and realities: Defining re-engineering for a large organization

    NASA Technical Reports Server (NTRS)

    Yin, Sandra; Mccreary, Julia

    1992-01-01

    This paper describes the background and results of three studies concerning software reverse engineering, re-engineering, and reuse (R3) hosted by the Internal Revenue Service in 1991 and 1992. The situation at the IRS--aging, piecemeal computer systems and outdated technology maintained by a large staff--is familiar to many institutions, especially in management information systems. The IRS is distinctive for the sheer magnitude and diversity of its problems; the country's tax records are processed using assembly language and COBOL and are spread across tape and network DBMS files. How do we proceed with replacing legacy systems? The three software re-engineering studies looked at methods and CASE tool support, and performed a prototype project using re-engineering methods and tools. During the course of these projects, we discovered critical issues broader than the mechanical definitions of methods and tool technology.

  1. Digital integrated control of a Mach 2.5 mixed-compression supersonic inlet and an augmented mixed-flow turbofan engine

    NASA Technical Reports Server (NTRS)

    Batterton, P. G.; Arpasi, D. J.; Baumbick, R. J.

    1974-01-01

    A digitally implemented integrated inlet-engine control system was designed and tested on a mixed-compression, axisymmetric, Mach 2.5, supersonic inlet with 45 percent internal supersonic area contraction and a TF30-P-3 augmented turbofan engine. The control matched engine airflow to available inlet airflow. By monitoring inlet terminal shock position and overboard bypass door command, the control adjusted engine speed so that in steady state, the shock would be at the desired location and the overboard bypass doors would be closed. During engine-induced transients, such as augmentor light-off and cutoff, the inlet operating point was momentarily changed to a more supercritical point to minimize unstarts. The digital control also provided automatic inlet restart. A variable inlet throat bleed control, based on throat Mach number, provided additional inlet stability margin.
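The control scheme described above trims engine speed so that, in steady state, the terminal shock settles at its desired station. A toy closed-loop sketch of that idea is below: the plant dynamics, gains, and time scales are invented for illustration and are not the TF30-P-3 control laws.

```python
def simulate_inlet_control(steps=200, dt=0.05):
    """Closed-loop sketch: integral control of terminal-shock position.

    Toy first-order plant: the shock position tracks the engine-speed
    trim command. The controller integrates the shock-position error so
    that in steady state the shock sits at its target station (and, by
    analogy with the abstract, the bypass doors could stay closed).
    All dynamics and gains here are illustrative assumptions.
    """
    shock_target = 0.5   # desired nondimensional shock station
    speed = 0.0          # engine-speed trim command
    shock = 0.2          # initial (too-subcritical) shock position
    ki = 2.0             # integral gain
    for _ in range(steps):
        shock += dt * (speed - shock)              # toy plant response
        speed += dt * ki * (shock_target - shock)  # integral control law
    return shock

final = simulate_inlet_control()
```

The integral action is what makes the steady-state error vanish: the speed command keeps changing until the shock-position error is zero, mirroring the abstract's "shock at the desired location with doors closed" steady-state condition.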

  2. Academic Drift in Danish Professional Engineering Education. Myth or Reality? Opportunity or Threat?

    ERIC Educational Resources Information Center

    Christensen, S. H.; Erno-Kjolhede, E.

    2011-01-01

    This article examines whether and, if so, to what extent academic drift can be said to have taken place in Danish professional engineering education. If the answer is affirmative, what were the driving forces behind it and what are the consequences--if any? First, a theoretical and conceptual framework for the discussion is introduced. This is…

  3. Learning from experience: the realities of developing mathematics courses for an online engineering programme

    NASA Astrophysics Data System (ADS)

    Quinn, Diana; Albrecht, Amie; Webby, Brian; White, Kevin

    2015-10-01

    Rarely do university departments of mathematics redesign their basic mathematics courses. Through developing an online version of our associate degree in engineering in collaboration with Open Universities Australia, we redesigned the first in a sequence of five engineering mathematics courses. The online cohort proved different to our face-to-face experience. We embarked on a process of refining the unit using experiential learning and action research. The 13 week unit is delivered up to four times a year and this paper reviews the first 10 cycles of enhancements over 3 years and unpacks the layers of hypotheses underlying development decisions. Several category themes were identified with a focus on students, teachers and learning activities. Investment in online developments for mathematics can have multiple flow-on impacts for other teaching modes. Good curriculum design, regardless of environment, will always be a cornerstone of effective course development processes.

  4. Cortical Modulation of Motor Control Biofeedback among the Elderly with High Fall Risk during a Posture Perturbation Task with Augmented Reality

    PubMed Central

    Chang, Chun-Ju; Yang, Tsui-Fen; Yang, Sai-Wei; Chern, Jen-Suh

    2016-01-01

    The cerebral cortex provides sensorimotor integration and coordination during motor control of daily functional activities. Power spectrum density based on electroencephalography (EEG) has been employed as an approach that allows an investigation of the spatial–temporal characteristics of neuromuscular modulation; however, the biofeedback mechanism associated with cortical activation during motor control remains unclear among elderly individuals. Thirty-one community-dwelling elderly participants were divided into low fall-risk potential (LF) and high fall-risk potential (HF) groups based upon the results obtained from a receiver operating characteristic analysis of the ellipse area of the center of pressure. Electroencephalography (EEG) was performed while the participants stood on a 6-degree-of-freedom Stewart platform, which generated continuous perturbations, either with or without the virtual reality scene. The present study showed that when there was visual stimulation and poor somatosensory coordination, a higher level of cortical response was activated in order to keep postural balance. The elderly participants in the LF group demonstrated a significant and strong correlation between postural-related cortical regions; however, the elderly individuals in the HF group did not show such a relationship. Moreover, we were able to clarify the roles of various brainwave bands functioning in motor control. Specifically, the gamma and beta bands in the parietal–occipital region facilitate the high-level cortical modulation and sensorimotor integration, whereas the theta band in the frontal–central region is responsible for mediating error detection during perceptual motor tasks. Finally, the alpha band is associated with processing visual challenges in the occipital lobe. With a variety of motor control demands, increased coordination among brainwave bands is required to maintain postural stability. These investigations shed light on the cortical modulation

  5. Cortical Modulation of Motor Control Biofeedback among the Elderly with High Fall Risk during a Posture Perturbation Task with Augmented Reality.

    PubMed

    Chang, Chun-Ju; Yang, Tsui-Fen; Yang, Sai-Wei; Chern, Jen-Suh

    2016-01-01

    The cerebral cortex provides sensorimotor integration and coordination during motor control of daily functional activities. Power spectrum density based on electroencephalography (EEG) has been employed as an approach that allows an investigation of the spatial-temporal characteristics of neuromuscular modulation; however, the biofeedback mechanism associated with cortical activation during motor control remains unclear among elderly individuals. Thirty-one community-dwelling elderly participants were divided into low fall-risk potential (LF) and high fall-risk potential (HF) groups based upon the results obtained from a receiver operating characteristic analysis of the ellipse area of the center of pressure. Electroencephalography (EEG) was performed while the participants stood on a 6-degree-of-freedom Stewart platform, which generated continuous perturbations, either with or without the virtual reality scene. The present study showed that when there was visual stimulation and poor somatosensory coordination, a higher level of cortical response was activated in order to keep postural balance. The elderly participants in the LF group demonstrated a significant and strong correlation between postural-related cortical regions; however, the elderly individuals in the HF group did not show such a relationship. Moreover, we were able to clarify the roles of various brainwave bands functioning in motor control. Specifically, the gamma and beta bands in the parietal-occipital region facilitate the high-level cortical modulation and sensorimotor integration, whereas the theta band in the frontal-central region is responsible for mediating error detection during perceptual motor tasks. Finally, the alpha band is associated with processing visual challenges in the occipital lobe. With a variety of motor control demands, increased coordination among brainwave bands is required to maintain postural stability. These investigations shed light on the cortical modulation of
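The band-by-band analysis in the record above rests on EEG power spectral density. A minimal sketch of band-power extraction via an FFT periodogram is shown here; the band edges are the conventional EEG ranges and the whole pipeline is a simplified stand-in, not the authors' actual analysis.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Average power of `signal` in the band [f_lo, f_hi) Hz, using a
    simple FFT periodogram as a power-spectral-density estimate."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].mean()

# Synthetic 10 Hz "alpha" oscillation sampled at 256 Hz for 4 s:
# the alpha band should dominate the other bands.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t)
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
powers = {name: band_power(eeg, fs, lo, hi) for name, (lo, hi) in bands.items()}
```

In practice an averaged estimator such as Welch's method is preferred over a raw periodogram for noisy EEG, but the band-masking step is the same.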

  6. Design and application of a virtual reality 3D engine based on rapid indices

    NASA Astrophysics Data System (ADS)

    Jiang, Nan; Mai, Jin

    2007-06-01

    This article proposes a data structure for a 3D engine based on rapid indices. Taking a model as the unit of construction, this data structure can rapidly build an array of 3D vertex coordinates and arrange those vertices into sequences of triangle strips or triangle fans, which OpenGL can render quickly. The data structure is easy to extend: it can hold texture coordinates, vertex normals, and a model matrix, and models can be added to it, deleted from it, or transformed by the model matrix, making it flexible. The data structure also improves OpenGL rendering speed when it holds a large amount of data.
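
    The indexed-geometry idea described above can be sketched in Python. The class and field names below are illustrative, not taken from the paper: a model holds one shared vertex array plus index runs tagged as strips or fans, along with optional per-vertex attributes and a 4x4 model matrix that transforms the geometry without rewriting the indices.

```python
# Sketch of an indexed 3D model record: one shared vertex array, primitive
# index runs (triangle strips/fans), optional attributes, and a model matrix
# applied when the vertices are fetched for rendering. Names are hypothetical.

class IndexedModel:
    def __init__(self, vertices):
        self.vertices = [list(v) for v in vertices]   # [[x, y, z], ...]
        self.texcoords = []                            # optional per-vertex [u, v]
        self.normals = []                              # optional per-vertex [nx, ny, nz]
        self.primitives = []                           # [("strip"|"fan", [i0, i1, ...])]
        # 4x4 identity model matrix.
        self.model_matrix = [[1 if r == c else 0 for c in range(4)]
                             for r in range(4)]

    def add_strip(self, indices):
        self.primitives.append(("strip", list(indices)))

    def add_fan(self, indices):
        self.primitives.append(("fan", list(indices)))

    def transformed_vertices(self):
        """Apply the 4x4 model matrix to every vertex (w assumed 1)."""
        out = []
        for x, y, z in self.vertices:
            v = (x, y, z, 1.0)
            out.append([sum(self.model_matrix[r][c] * v[c] for c in range(4))
                        for r in range(3)])
        return out

# A unit quad drawn as one triangle strip, then translated by +1 along x.
quad = IndexedModel([(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)])
quad.add_strip([0, 1, 2, 3])
quad.model_matrix[0][3] = 1.0          # translation component t_x
print(quad.transformed_vertices()[0])  # -> [1.0, 0.0, 0.0]
```

    In an actual engine the `primitives` list would map directly onto OpenGL draw calls (one strip or fan per call), which is where the rendering-speed benefit of pre-sequenced indices comes from.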

  7. Analysis of thrust augmentation of turbojet engines by water injection at compressor inlet including charts for calculating compression processes with water injection

    NASA Technical Reports Server (NTRS)

    Wilcox, E Clinton; Trout, Arthur M

    1951-01-01

    A psychrometric chart having total pressure (sum of partial pressures of air and water vapor) as a variable, a Mollier diagram for air saturated with water vapor, and charts showing the thermodynamic properties of various air-water vapor and exhaust gas-water vapor mixtures are presented as aids in calculating the thrust augmentation of a turbojet engine resulting from the injection of water at the compressor inlet. Curves are presented that show the theoretical performance of the augmentation method for various amounts of water injected and the effects of varying flight Mach number, altitude, ambient-air temperature, ambient relative humidity, compressor pressure ratio, and inlet-diffuser efficiency. Numerical examples, illustrating the use of the psychrometric chart and the Mollier diagram in calculating both compressor-inlet and compressor-outlet conditions when water is injected at the compressor inlet, are presented.
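
    The chart's basic variable, total pressure as the sum of the partial pressures of air and water vapor, ties directly to the humidity ratio through a textbook psychrometric identity (not taken from the report; the 0.622 factor is the ratio of the molecular weights of water vapor and dry air):

```python
def humidity_ratio(p_vapor, p_total):
    """Humidity ratio W (kg water vapor per kg dry air) from the vapor
    partial pressure and the total pressure: W = 0.622 * p_v / (p - p_v)."""
    return 0.622 * p_vapor / (p_total - p_vapor)

# Saturated air at sea level (101.325 kPa) and 20 C, where the saturation
# vapor pressure is roughly 2.339 kPa:
w = humidity_ratio(2.339e3, 101.325e3)
print(round(w, 4))  # -> 0.0147 kg water per kg dry air
```

    Relations of this kind are what the psychrometric chart and Mollier diagram tabulate graphically for the compressor-inlet and compressor-outlet state calculations described in the report.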

  8. Re-engineering land administration systems for sustainable development — from rhetoric to reality

    NASA Astrophysics Data System (ADS)

    Williamson, Ian P.

    Current land administration systems are the product of 19th-century economic and land market paradigms and have failed to properly support sustainable development. The need for urgent reform is accepted, but the way forward remains unclear in many jurisdictions. This paper discusses current international initiatives and research to develop a new land administration vision that promotes sustainable development. Within this context, the paper describes the changing relationship between humankind and land, identifies some of the growing environmental pressures facing modern society and the need for sustainable development, explores the evolving role of land administration in society, and highlights the need for land administration systems to play a more proactive role in supporting sustainable development objectives. The process of re-engineering land administration systems is briefly reviewed. The paper then highlights the development of a national land administration vision and strategy. In proposing strategies, the paper draws on international trends and experiences such as those highlighted in the recent United Nations — International Federation of Surveyors Declaration on Land Administration for Sustainable Development.

  9. Geohazard identification: the gap between the possible and reality in geophysical surveys for the engineering industry

    NASA Astrophysics Data System (ADS)

    Dyer, Julie M.

    2011-03-01

    Offshore contractors engaged in the business of installing pipelines and subsea structures routinely rely on third-party survey data to identify and assess the risk of surface and buried geohazards. In many cases, route or site survey data are supplied to a marine contractor with the latter having little or no input into the manner in which the geophysical and geotechnical data are acquired, processed and reported. Often, the completed survey reports indicate that neither data acquisition nor the subsequent data processing was carried out with the ultimate goal of the survey or the end user in mind. This lack of focus at the survey stage can contribute to incorrect or over-optimistic assessments of geohazard risk, with serious impact on offshore installation operations as a consequence. Two case studies from the North Sea are discussed in which inappropriate processing and reporting of geophysical data had a negative impact on offshore installation operations; these are used as a basis for a more general discussion of the underlying reasons for the production of engineering survey reports that are "not fit for purpose".

  10. Virtual reality for emergency training

    SciTech Connect

    Altinkemer, K.

    1995-12-31

    Virtual reality is a sequence of scenes generated by a computer in response to the five senses: sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance, pheromonal, and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and the management of natural disasters, including earthquakes, floods, tornados and other situations that are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. For virtual reality to be a successful training tool, the design needs to address certain aspects, such as how realistic the virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer and the appropriate training time length(s).

  11. Virtual Reality Lab Assistant

    NASA Technical Reports Server (NTRS)

    Saha, Hrishikesh; Palmer, Timothy A.

    1996-01-01

    The Virtual Reality Lab Assistant (VRLA) demonstration model is designed for engineering and material science experiments to be performed by undergraduate and graduate students as a pre-lab simulation experience. It helps students preview how to use the lab equipment and run experiments without using the actual lab hardware and software. The quality of the time available for laboratory experiments can be significantly improved through the use of virtual reality technology.

  12. When Worlds Collide: An Augmented Reality Check

    ERIC Educational Resources Information Center

    Villano, Matt

    2008-01-01

    The technology is simple: Mobile technologies such as handheld computers and global positioning systems work in sync to create an alternate, hybrid world that mixes virtual characters with the actual physical environment. The result is a digital simulation that offers powerful game-playing opportunities and allows students to become more engaged…

  13. Augmented Reality in Education and Training

    ERIC Educational Resources Information Center

    Lee, Kangdon

    2012-01-01

    There are many different ways for people to be educated and trained with regard to specific information and skills they need. These methods include classroom lectures with textbooks, computers, handheld devices, and other electronic appliances. The choice of learning innovation is dependent on an individual's access to various technologies and the…

  14. Structure Sensor for mobile markerless augmented reality

    NASA Astrophysics Data System (ADS)

    Kilgus, T.; Bux, R.; Franz, A. M.; Johnen, W.; Heim, E.; Fangerau, M.; Müller, M.; Yen, K.; Maier-Hein, L.

    2016-03-01

    3D visualization of anatomical data is an integral part of diagnostics and treatment in many medical disciplines, such as radiology, surgery and forensic medicine. To enable intuitive interaction with the data, we recently proposed a new concept for on-patient visualization of medical data which involves rendering of subsurface structures on a mobile display that can be moved along the human body. The data fusion is achieved with a range imaging device attached to the display. The range data is used to register static 3D medical imaging data with the patient body based on a surface matching algorithm. However, our previous prototype was based on the Microsoft Kinect camera and thus required a cable connection to acquire color and depth data. The contribution of this paper is two-fold. Firstly, we replace the Kinect with the Structure Sensor, a novel cable-free range imaging device, to improve handling and user experience, and show that the resulting accuracy (target registration error: 4.8 ± 1.5 mm) is comparable to that achieved with the Kinect. Secondly, a new approach to visualizing complex 3D anatomy based on this device, as well as 3D printed models of anatomical surfaces, is presented. We demonstrate that our concept can be applied to in vivo data and to a 3D printed skull from a forensic case. Our new device is the next step towards clinical integration and shows that the concept can be applied not only during autopsy but also for the presentation of forensic data to laypeople in court or in medical education.
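
    The accuracy figure quoted above is a target registration error: the mean ± standard deviation of the distances between registered and reference target points. A minimal sketch of that computation (the point sets here are invented for illustration, not the paper's data):

```python
import math

def target_registration_error(registered, reference):
    """Mean and (population) standard deviation of the Euclidean distances
    between corresponding registered and reference target points."""
    dists = [math.dist(p, q) for p, q in zip(registered, reference)]
    mean = sum(dists) / len(dists)
    std = math.sqrt(sum((d - mean) ** 2 for d in dists) / len(dists))
    return mean, std

# Hypothetical targets (mm): each registered point is offset from its reference.
reference = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
registered = [(3, 0, 0), (10, 4, 0), (0, 10, 5)]
mean, std = target_registration_error(registered, reference)
print(round(mean, 2), round(std, 2))  # -> 4.0 0.82
```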

  15. Toward the Implementation of Augmented Reality Training

    ERIC Educational Resources Information Center

    Mayberry, Charles R.

    2013-01-01

    The United States Air Force (USAF) trains C-130H Loadmaster students at Little Rock Air Force Base (AFB) through a civilian contract. The Aircrew Training System (ATS) contractor utilizes a Fuselage Trainer (FuT) to provide scenarios for the Loadmaster students to practice loading and unloading a simulated aircraft. The problem was the USAF does…

  16. Psychophysical Evaluation of Haptic Perception Under Augmentation by a Handheld Device

    PubMed Central

    Wu, Bing; Klatzky, Roberta; Lee, Randy; Shivaprabhu, Vikas; Galeotti, John; Siegel, Mel; Schuman, Joel S.; Hollis, Ralph; Stetten, George

    2015-01-01

    Objective: This study investigated the effectiveness of force augmentation in haptic perception tasks. Background: Considerable engineering effort has been devoted to developing force augmented reality (AR) systems to assist users in delicate procedures like microsurgery. In contrast, far less has been done to characterize the behavioral outcomes of these systems, and no research has systematically examined the impact of sensory and perceptual processes on force augmentation effectiveness. Method: Using a handheld force magnifier as an exemplar haptic AR, we conducted three experiments to characterize its utility in the perception of force and stiffness. Experiments 1 and 2 measured, respectively, the user's ability to detect and differentiate weak force (<0.5 N) with or without the assistance of the device and compared it to direct perception. Experiment 3 examined the perception of stiffness through the force augmentation. Results: The user's ability to detect and differentiate small forces was significantly improved by augmentation at both threshold and suprathreshold levels. The augmentation also enhanced stiffness perception. However, although perception of augmented forces matches that of the physical equivalent for weak forces, it falls off with increasing intensity. Conclusion: The loss in effectiveness reflects the nature of sensory and perceptual processing. Such perceptual limitations should be taken into consideration in the design and development of haptic AR systems to maximize utility. Application: The findings provide useful information for building effective haptic AR systems, particularly for use in microsurgery. PMID:25875439

  17. Augmented Reality Visualization with Use of Image Overlay Technology for MR Imaging–guided Interventions: Assessment of Performance in Cadaveric Shoulder and Hip Arthrography at 1.5 T

    PubMed Central

    Fritz, Jan; U-Thainual, Paweena; Ungi, Tamas; Flammang, Aaron J.; Fichtinger, Gabor; Iordachita, Iulian I.

    2012-01-01

    Purpose: To prospectively assess overlay technology in providing accurate and efficient targeting for magnetic resonance (MR) imaging–guided shoulder and hip joint arthrography. Materials and Methods: A prototype augmented reality image overlay system was used in conjunction with a clinical 1.5-T MR imager. A total of 24 shoulder joint and 24 hip joint injections were planned in 12 human cadavers. Two operators (A and B) participated, each performing procedures on different cadavers using image overlay guidance. MR imaging was used to confirm needle positions, monitor injections, and perform MR arthrography. Accuracy was assessed according to the rate of needle adjustment, target error, and whether the injection was intraarticular. Efficiency was assessed according to arthrography procedural time. Operator differences were assessed with comparison of accuracy and procedure times between the operators. Mann-Whitney U test and Fisher exact test were used to assess group differences. Results: Forty-five arthrography procedures (23 shoulders, 22 hips) were performed. Three joints had prostheses and were excluded. Operator A performed 12 shoulder and 12 hip injections. Operator B performed 11 shoulder and 10 hip injections. Needle adjustment rate was 13% (six of 45; one for operator A and five for operator B). Target error was 3.1 mm ± 1.2 (standard deviation) (operator A, 2.9 mm ± 1.4; operator B, 3.5 mm ± 0.9). Intraarticular injection rate was 100% (45 of 45). The average arthrography time was 14 minutes (range, 6–27 minutes; 12 minutes [range, 6–25 minutes] for operator A and 16 minutes [range, 6–27 min] for operator B). Operator differences were not significant with regard to needle adjustment rate (P = .08), target error (P = .07), intraarticular injection rate (P > .99), and arthrography time (P = .22). Conclusion: Image overlay technology provides accurate and efficient MR guidance for successful shoulder and hip arthrography in human cadavers.

  18. Lip augmentation.

    PubMed

    Byrne, Patrick J; Hilger, Peter A

    2004-02-01

    Lip augmentation has become increasingly popular in recent years as a reflection of cultural trends emphasizing youth and beauty. Techniques to enhance the appearance of the lips have evolved with advances in biotechnology. An understanding of lip anatomy and aesthetics forms the basis for successful results. We outline the pertinent anatomy and aesthetics of the preoperative evaluation. A summary of various filler materials available is provided. Augmentation options include both injectable and open surgical techniques. The procedures and materials currently favored by the authors are described in greater detail. PMID:15034811

  19. Maxillary sinus augmentation using an engineered porous hydroxyapatite: a clinical, histological, and transmission electron microscopy study in man.

    PubMed

    Mangano, Carlo; Scarano, Antonio; Iezzi, Giovanna; Orsini, Giovanna; Perrotti, Vittoria; Mangano, Francesco; Montini, Sergio; Piccirilli, Marcello; Piattelli, Adriano

    2006-01-01

    Porous hydroxyapatite (HA) is a calcium-phosphate-based material that is biocompatible, nonimmunological, and osteoconductive, and has a macroporosity of about 200 to 800 microm. The pores seem to be able to induce migration, adhesion, and proliferation of osteoblasts inside the pore network and to promote angiogenesis inside the pore system. The aim of this study was to evaluate the clinical behavior and the histological and ultrastructural aspects of porous HA in maxillary sinus augmentation procedures. Twenty-four patients (19 men, 5 women; average age 53.4 years) in good general physical and mental health and with partially or completely edentulous maxillae were selected for this study. Six months after sinus floor elevation, at the time of dental implant placement, biopsies were carried out under local anesthesia. These bone cores were cut in half and processed for light and transmission electron microscopy. At a mean of 3 years after implantation, all implants are clinically in function and no surgical or prosthetic complications have occurred. Under light microscopy, newly formed bone was 38.5% +/- 4.5%, whereas the residual biomaterial represented 12% +/- 2.3% and the marrow spaces represented 44.6% +/- 4.2%. In addition, in the majority of cases, the biomaterial particles were in close contact with the bone, which appeared compact with the characteristic features of well-organized lamellar bone. A cement-like line was slightly visible at the bone-biomaterial interface, but there were no gaps or interposed connective tissue in between. A high quantity (about 40%) of newly formed bone was present. Bone was closely apposed to the biomaterial particles, as shown by light microscopy and transmission electron microscopy. Moreover, no signs of inflammatory cell infiltrate or foreign body reaction were present. Also, most of the biomaterial was resorbed and only a small quantity (a little more than 10%) was still present. The results of our study show that

  20. Comparisons of rational engineering correlations of thermophoretically-augmented particle mass transfer with STAN5-predictions for developing boundary layers

    NASA Technical Reports Server (NTRS)

    Gokoglu, S. A.; Rosner, D. E.

    1984-01-01

    The code STAN5 was modified to properly include thermophoretic mass transport, and selected test cases of developing boundary layers were examined, including variable properties, viscous dissipation, transition to turbulence, and transpiration cooling. Under conditions representative of current and projected gas turbine (GT) operation, local application of St(M)/St(M),o correlations evidently provides accurate and economical engineering design predictions, especially for suspended particles characterized by Schmidt numbers outside of the heavy vapor range.

  1. An MDO augmented value-based systems engineering approach to holistic design decision-making: A satellite system case study

    NASA Astrophysics Data System (ADS)

    Kannan, Hanumanthrao

    The design of large scale complex engineered systems (LSCES) involves hundreds or thousands of designers making decisions at different levels of an organizational hierarchy. Traditionally, these LSCES are designed using systems engineering methods and processes, where the preferences of the stakeholder are flowed down the hierarchy using requirements that act as surrogates for preference. Current processes do not provide system-level guidance to subsystem designers. Value-Driven Design (VDD) offers a new perspective on complex system design, where the value preferences of the stakeholder are communicated directly through a decomposable value function, thereby providing a mechanism for improved system consistency. Requirements-based systems engineering approaches do not offer a mathematically rigorous way to capture the couplings present in the system. Multidisciplinary Design Optimization (MDO) was specifically developed to address couplings in both analysis and optimization, thereby enabling physics-based consistency. MDO uses an objective function with constraints but does not provide a way to formulate the objective function. Current systems engineering processes do not provide a mathematically sound way to make design decisions when designers are faced with uncertainties. Designers tend to choose designs based on their preferences towards risky/uncertain designs, and past research has shown that there needs to be consistency in risk preferences to enable design decisions that are consistent with the stakeholder's desires. This research exploits the complementary nature of VDD, MDO and Decision Analysis (DA) to enable consistency in communication of system preferences, consistency in physics and consistency in risk preferences. 
The role of VDD in this research is in formulating a value function for true preferences, whereas the role of MDO is to capture couplings and enable optimization using the value function, and the role of DA is to enable consistent design
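
    As a toy illustration of the contrast drawn above, with entirely hypothetical numbers not taken from the dissertation: a requirements-based process accepts any design meeting a threshold and then picks among the survivors, while a value-driven process ranks all designs by a stakeholder value function and can prefer a design the requirement would have screened out.

```python
# Toy satellite design trade: each candidate is (coverage_hours, cost_musd).
# All names and figures are invented for illustration.
candidates = {
    "A": (10.0, 80.0),
    "B": (14.0, 120.0),
    "C": (12.0, 90.0),
}

def value(coverage, cost, price_per_hour=12.0):
    """Decomposable stakeholder value (M$): revenue attributed to
    coverage minus cost, scoring system-level worth directly."""
    return price_per_hour * coverage - cost

# Requirements-based: accept designs with coverage >= 14 h, pick the cheapest.
feasible = {k: v for k, v in candidates.items() if v[0] >= 14.0}
req_choice = min(feasible, key=lambda k: feasible[k][1])

# Value-driven: pick the design maximizing the stakeholder value function.
vdd_choice = max(candidates, key=lambda k: value(*candidates[k]))

print(req_choice, vdd_choice)  # -> B C
```

    Here design C delivers the highest net value (54 vs. 48 for B) yet fails the coverage requirement, which is exactly the kind of stakeholder-preference information a surrogate requirement discards.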

  2. Chin augmentation.

    PubMed

    Choe, K S; Stucki-McCormick, S U

    2000-01-01

    The primary goal of facial aesthetic surgery is to restore, enhance, and rejuvenate the aging face to a more youthful appearance, achieving balance and harmony. The mental area must be addressed in order to achieve a complete synthesis of the face. The concept of augmenting the mental area with implants has evolved so significantly that it now stands by itself as an important procedure. Various autogenous implants for chin augmentation have been in use for over 100 years but are not without complications. The advent of synthetic materials has given rise to various types of alloplastic implants: Gore-Tex, Medpor, Supramid, Silastic, and Mersilene. No one implant is perfect for every face. This article reviews several alloplastic implants, their advantages, disadvantages, and complications, in addition to the different techniques of preparing and delivering the implants. PMID:11802346

  3. HB-EGF embedded in PGA/PLLA scaffolds via subcritical CO2 augments the production of tissue engineered intestine.

    PubMed

    Liu, Yanchun; Nelson, Tyler; Cromeens, Barrett; Rager, Terrence; Lannutti, John; Johnson, Jed; Besner, Gail E

    2016-10-01

    The ability to deliver sustained-release, biologically active growth factors through custom designed tissue engineering scaffolds at sites of tissue regeneration offers great therapeutic opportunity. Due to the short in vivo half-lives of most growth factors, it is challenging to deliver these proteins to sites of interest where they may be used before being degraded. The application of subcritical CO2 uses gas-phase CO2 at subcritical pressures ranging from 41 to 62 bar (595-913 PSI), which avoids foaming by reducing the amount of CO2 dissolved in the polymer and maintains completely reversible plasticization. In the current study, heparin-binding EGF-like growth factor (HB-EGF) was embedded into polyglycolic acid (PGA)/poly-L-lactic acid (PLLA) scaffolds via subcritical CO2 exposure for the production of tissue engineered intestine (TEI). PGA fiber morphology after subcritical CO2 exposure was examined by scanning electron microscopy (SEM), and the distribution of HB-EGF embedded in the scaffold fibers was detected by HB-EGF immunofluorescent staining. In vivo implantation of HB-EGF-embedded scaffolds confirmed significantly improved TEI structure as a result of local delivery of the trophic growth factor. These findings may be critical for the production of TEI in the treatment of patients with short bowel syndrome in the future. PMID:27380441

  4. Augmentation of Cognition and Perception Through Advanced Synthetic Vision Technology

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.; Arthur, Jarvis J.; Williams, Steve P.; McNabb, Jennifer

    2005-01-01

    Synthetic Vision System technology augments reality and creates a virtual visual meteorological condition that extends a pilot's cognitive and perceptual capabilities during flight operations when outside visibility is restricted. The paper describes the NASA Synthetic Vision System for commercial aviation with an emphasis on how the technology achieves Augmented Cognition objectives.

  5. Augmentation cystoplasty in neurogenic bladder

    PubMed Central

    Kocjancic, Ervin; Demirdağ, Çetin

    2016-01-01

    The aim of this review is to update the indications, contraindications, technique, complications, and the tissue engineering approaches of augmentation cystoplasty (AC) in patients with neurogenic bladder. PubMed/MEDLINE was searched for the keywords "augmentation cystoplasty," "neurogenic bladder," and "bladder augmentation." Additional relevant literature was identified by examining the reference lists of articles found through the search. An updated review of the indications, contraindications, technique, outcomes, complications, and tissue engineering approaches of AC in patients with neurogenic bladder is presented. Although some important progress has been made in tissue-engineered AC, conventional AC still has an important role in the surgical treatment of refractory neurogenic lower urinary tract dysfunction. PMID:27617312

  6. Augmentation cystoplasty in neurogenic bladder.

    PubMed

    Çetinel, Bülent; Kocjancic, Ervin; Demirdağ, Çetin

    2016-09-01

    The aim of this review is to update the indications, contraindications, technique, complications, and the tissue engineering approaches of augmentation cystoplasty (AC) in patients with neurogenic bladder. PubMed/MEDLINE was searched for the keywords "augmentation cystoplasty," "neurogenic bladder," and "bladder augmentation." Additional relevant literature was identified by examining the reference lists of articles found through the search. An updated review of the indications, contraindications, technique, outcomes, complications, and tissue engineering approaches of AC in patients with neurogenic bladder is presented. Although some important progress has been made in tissue-engineered AC, conventional AC still has an important role in the surgical treatment of refractory neurogenic lower urinary tract dysfunction. PMID:27617312

  7. Augmented RIGS

    NASA Technical Reports Server (NTRS)

    Kaminskas, R. A.; Mcguire, D.

    1974-01-01

    The results of the Phase 2 Resonant Infrasonic Gauging System (RIGS) development program are presented. The program consisted of design, fabrication, and testing of an "augmented" RIGS concept. The RIGS is a gauging system capable of measuring propellant quantities in zero-g as well as under accelerated conditions. Except for hydrogen, it can be used to gauge virtually any propellant in liquid form, including cryogenics. The gage consists of a sensor unit which is attached to the propellant tank and an electronic control unit which may be positioned separately from the sensor. The control unit receives signals from the sensor as well as the propellant temperature measurement and the ullage gas pressure, and computes the propellant quantity in the tank.

  8. Development of a numerical tool to study the mixing phenomenon occurring during mode one operation of a multi-mode ejector-augmented pulsed detonation rocket engine

    NASA Astrophysics Data System (ADS)

    Dawson, Joshua

    A novel multi-mode implementation of a pulsed detonation engine, put forth by Wilson et al., consists of four modes, each specifically designed to capitalize on flow features unique to the various flow regimes. This design enables the propulsion system to generate thrust through the entire flow regime. The Multi-Mode Ejector-Augmented Pulsed Detonation Rocket Engine operates in mode one from take-off conditions through the acceleration to supersonic speeds. Once the mixing chamber internal flow exceeds supersonic speed, the propulsion system transitions to mode two. While operating in mode two, supersonic air is compressed in the mixing chamber by an upstream-propagating detonation wave and then exhausted through the convergent-divergent nozzle. Once the velocity of the air flow within the mixing chamber exceeds the Chapman-Jouguet Mach number, the upstream-propagating detonation wave no longer has sufficient energy to propagate upstream, and consequently the propulsive system shifts to mode three. As a result of the inability of the detonation wave to propagate upstream, a steady oblique shock system is established just upstream of the convergent-divergent nozzle to initiate combustion. Finally, the propulsion system progresses to mode four operation, consisting purely of a pulsed detonation rocket for high Mach number flight and for use in the upper atmosphere as is needed for orbital insertion. Modes three and four appear to be a fairly significant challenge to implement, while the challenge of implementing modes one and two may prove to be a more practical goal in the near future. A vast number of potential applications exist for a propulsion system that would utilize modes one and two, namely a high Mach number hypersonic cruise vehicle. There is particular interest in the dynamics of mode one operation, which is the subject of this research paper. Several advantages can be obtained by use of this technology. 
Geometrically the propulsion system is fairly

  9. Potent in vitro and in vivo activity of an Fc-engineered humanized anti-HM1.24 antibody against multiple myeloma via augmented effector function

    PubMed Central

    Tai, Yu-Tzu; Horton, Holly M.; Kong, Sun-Young; Pong, Erik; Chen, Hsing; Cemerski, Saso; Bernett, Matthew J.; Nguyen, Duc-Hanh T.; Karki, Sher; Chu, Seung Y.; Lazar, Greg A.; Munshi, Nikhil C.; Desjarlais, John R.; Anderson, Kenneth C.

    2012-01-01

    HM1.24, an immunologic target for multiple myeloma (MM) cells, has not been effectively targeted with therapeutic monoclonal antibodies (mAbs). In this study, we investigated in vitro and in vivo anti-MM activities of XmAb5592, a humanized anti-HM1.24 mAb with Fc-domain engineered to significantly enhance FcγR binding and associated immune effector functions. XmAb5592 increased antibody-dependent cellular cytotoxicity (ADCC) several fold relative to the anti-HM1.24 IgG1 analog against both MM cell lines and primary patient myeloma cells. XmAb5592 also augmented antibody dependent cellular phagocytosis (ADCP) by macrophages. Natural killer (NK) cells became more activated by XmAb5592 than the IgG1 analog, evidenced by increased cell surface expression of granzyme B–dependent CD107a and MM cell lysis, even in the presence of bone marrow stromal cells. XmAb5592 potently inhibited tumor growth in mice bearing human MM xenografts via FcγR-dependent mechanisms, and was significantly more effective than the IgG1 analog. Lenalidomide synergistically enhanced in vitro ADCC against MM cells and in vivo tumor inhibition induced by XmAb5592. A single dose of 20 mg/kg XmAb5592 effectively depleted both blood and bone marrow plasma cells in cynomolgus monkeys. These results support clinical development of XmAb5592, both as a monotherapy and in combination with lenalidomide, to improve patient outcome of MM. PMID:22246035

  10. Alternate Realities

    NASA Astrophysics Data System (ADS)

    Jones, Robert

    2010-02-01

    Two identical learners, observing different example input, or the same examples but in different order, can form different categories and so judge newer/later input differently (Machine Learning, T. Mitchell, McGraw Hill, 1997; Asa H., R. Jones, Trans. Kansas Acad. Sci., vol. 109, no. 3/4, p. 159, 2006). It seems certain that each of us experiences a somewhat different reality; the question is just how widely these realities can vary from one another. Perhaps 4% of people exhibit synesthesia, perceiving letters or numbers as colored, or numbers and dates as having personalities or occupying locations in space (Synesthesia, R. Cytowic, MIT Press, 2002). The Sapir-Whorf hypothesis claims that a speaker's language influences his category structure and the way he thinks (Language, Thought, and Reality, B. Whorf, MIT Press, 1956). Those who are skillful at mathematics may know an additional language and be able to think thoughts that the layman cannot. The philosophers Plato and Descartes claimed to have had, at certain moments in their lives, a new view of the world, its basic constituents, and its rules which was totally different from our conventional view of reality (Reflections on Kurt Godel, H. Wang, MIT Press, 1987, p. 46). Fairly large-scale differences are experienced by those who believe in (make use of) concepts like spirit(s), soul(s), god(s), life after death, platonism or Everett's many-worlds interpretation of quantum mechanics (The Physics of Immortality, F. Tipler, Doubleday, 1994, p. 176).
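
    The order-dependence claim in the first sentence is easy to demonstrate with any incremental learner. Below is a minimal one-pass "leader" clusterer (a standard sequential algorithm, not the Asa H system cited): because each centroid is updated as examples arrive, earlier examples shape later assignments, so the same data presented in two orders yields two different category structures.

```python
def leader_cluster(points, threshold=1.0):
    """One-pass clustering of 1-D points: assign each point to the nearest
    existing centroid if within threshold, else start a new cluster.
    Centroids update incrementally, so presentation order matters."""
    centroids, counts, labels = [], [], []
    for p in points:
        best = min(range(len(centroids)),
                   key=lambda i: abs(p - centroids[i]), default=None)
        if best is not None and abs(p - centroids[best]) <= threshold:
            counts[best] += 1
            centroids[best] += (p - centroids[best]) / counts[best]
            labels.append(best)
        else:
            centroids.append(float(p))
            counts.append(1)
            labels.append(len(centroids) - 1)
    return labels

def partition(points, labels):
    """Group the points by their cluster label."""
    groups = {}
    for p, lab in zip(points, labels):
        groups.setdefault(lab, []).append(p)
    return sorted(map(tuple, groups.values()))

# Identical data, two presentation orders, different category structure:
order1 = [0.0, 0.9, 1.8]
order2 = [0.9, 1.8, 0.0]
labels1 = leader_cluster(order1)
labels2 = leader_cluster(order2)
print(partition(order1, labels1))  # -> [(0.0, 0.9), (1.8,)]
print(partition(order2, labels2))  # -> [(0.0,), (0.9, 1.8)]
```

    In the first order, 0.0 and 0.9 merge before 1.8 arrives; in the second, 0.9 and 1.8 merge first, leaving 0.0 isolated — the "same examples, different order, different categories" effect described above.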

  11. Virtual Reality.

    ERIC Educational Resources Information Center

    Newby, Gregory B.

    1993-01-01

    Discusses the current state of the art in virtual reality (VR), its historical background, and future possibilities. Highlights include applications in medicine, art and entertainment, science, business, and telerobotics; and VR for information science, including graphical display of bibliographic data, libraries and books, and cyberspace.…

  12. Status report of RMS active damping augmentation

    NASA Technical Reports Server (NTRS)

    Gilbert, Mike; Demeo, Martha E.

    1993-01-01

    A status report of Remote Manipulator System (RMS) active damping augmentation is presented. Topics covered include: active damping augmentation; benefits of RMS ADA; simulated payload definition; sensor and actuator definition; ADA control law design; Shuttle Engineering Simulator (SES) real-time simulation; and astronaut evaluation.

  13. Virtual reality and hallucination: a technoetic perspective

    NASA Astrophysics Data System (ADS)

    Slattery, Diana R.

    2008-02-01

    Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices, and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience: that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VRs": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds are central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulations by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel and the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the connections (philosophical, social-historical, and psychological-perceptual) between these two domains.

  14. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This groundbreaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  15. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    NASA-Ames is intensively developing virtual-reality (VR) capabilities that can extend and augment computer-generated and remote spatial environments. VR is envisioned not only as a basis for improving human/machine interactions involved in planetary exploration, but also as a medium for the more widespread sharing of the experience of exploration, thereby broadening the support-base for the lunar and planetary-exploration endeavors. Imagery representative of Mars is being gathered for VR presentation at such terrestrial sites as Antarctica and Death Valley.

  16. NAESA Augmentation Pilot Project

    NASA Technical Reports Server (NTRS)

    Hoover, John J.

    1998-01-01

    This project was one project within the Native American Earth and Space Academy (NAESA). NAESA is a national initiative comprised of several organizations that support programs which focus on 1) enhancing the technological, scientific and pedagogical skills of K-14 teachers who instruct Native Americans, 2) enhancing the understanding and applications of science, technology, and engineering of college-bound Native Americans and teaching them general college "survival skills" (e.g., test taking, time management, study habits), 3) enhancing the scientific and pedagogical skills of the faculty of tribally-controlled colleges and community colleges with large Native American enrollments, and 4) strengthening the critical relationships between students, their parents, tribal elders, and their communities. This Augmentation Pilot Project focused on the areas of community-school alliances and internet technology use in teaching, learning, and daily living, addressing five major objectives.

  17. Magnetohydrodynamic Augmented Propulsion Experiment

    NASA Technical Reports Server (NTRS)

    Litchford, Ron J.

    2008-01-01

    Over the past several years, efforts have been under way to design and develop an operationally flexible research facility for investigating the use of cross-field MHD accelerators as a potential thrust augmentation device for thermal propulsion systems. The baseline configuration for this high-power experimental facility utilizes a 1.5-MWe multi-gas arc-heater as a thermal driver for a 2-MWe MHD accelerator, which resides in a large-bore 2-tesla electromagnet. A preliminary design study using NaK-seeded nitrogen as the working fluid led to an externally diagonalized segmented MHD channel configuration based on an expendable heat-sink design concept. The current status report includes a review of engineering/design work and performance optimization analyses and summarizes component hardware fabrication and development efforts, preliminary testing results, and recent progress toward full-up assembly and testing.

  18. New perspectives and limitations in the use of virtual reality in the rehabilitation of motor disorders

    NASA Astrophysics Data System (ADS)

    De Mauro, Alessandro; Ardanza, Aitor; Monge, Esther; Molina Rueda, Francisco

    2013-03-01

    Several studies have shown that both virtual and augmented reality are technologies suitable for rehabilitation therapy due to the inherent ability of simulating real daily life activities while improving patient motivation. In this paper we will first present the state of the art in the use of virtual and augmented reality applications for rehabilitation of motor disorders and second we will focus on the analysis of the results of our project. In particular, the requirements of patients with cerebrovascular accidents, spinal cord injuries, and cerebral palsy for the use of virtual and augmented reality systems will be detailed.

  19. Breast augmentation surgery

    MedlinePlus

    Breast augmentation; Breast implants; Implants - breast; Mammaplasty ... Breast augmentation is done by placing implants behind breast tissue or under the chest muscle. An implant is a sac filled with either sterile salt water (saline) or a ...

  20. Creating Tangible Interfaces by Augmenting Physical Objects with Multimodal Language

    SciTech Connect

    McGee, David R. ); Cohen, Philip R.

    2001-01-01

    Rasa is a tangible augmented reality environment that digitally enhances the existing paper-based command and control capability in a military command post. By observing and understanding the users' speech, pen, and touch-based multimodal language, Rasa computationally augments the physical objects on a command post map, linking these items to digital representations of the same; for example, linking a paper map to the world and Post-it notes to military units. Herein, we give a thorough account of Rasa's underlying multiagent framework, and its recognition, understanding, and multimodal integration components. Moreover, we examine five properties of language (generativity, comprehensibility, compositionality, referentiality, and, at times, persistence) that render it suitable as an augmentation approach, contrasting these properties to those of other augmentation methods. It is these properties of language that allow users of Rasa to augment physical objects, transforming them into tangible interfaces.

  1. Magnetohydrodynamic Augmented Propulsion Experiment

    NASA Technical Reports Server (NTRS)

    Litchford, Ron J.; Cole, John; Lineberry, John; Chapman, Jim; Schmidt, Harold; Cook, Stephen (Technical Monitor)

    2002-01-01

    A fundamental obstacle to routine space access is the specific energy limitations associated with chemical fuels. In the case of vertical take-off, the high thrust needed for vertical liftoff and acceleration to orbit translates into power levels in the 10 GW range. Furthermore, useful payload mass fractions are possible only if the exhaust particle energy (i.e., exhaust velocity) is much greater than that available with traditional chemical propulsion. The electronic binding energy released by the best chemical reactions (e.g., LOX/LH2) is less than 2 eV per product molecule (approx. 1.8 eV per H2O molecule), which translates into particle velocities less than 5 km/s. Useful payload fractions, however, will require exhaust velocities exceeding 15 km/s (i.e., particle energies greater than 20 eV). As an added challenge, the envisioned RLV (reusable launch vehicle) should accomplish these amazing performance feats while providing relatively low acceleration levels to orbit (2-3g maximum). From such fundamental considerations, it is painfully obvious that planned and current RLV solutions based on chemical fuels alone represent only a temporary solution and can only result in minor gains, at best. What is truly needed is a revolutionary approach that will dramatically reduce the amount of fuel and size of the launch vehicle. This implies the need for new compact high-power energy sources as well as advanced accelerator technologies for increasing engine exhaust velocity. Electromagnetic acceleration techniques are of immense interest since they can be used to circumvent the thermal limits associated with conventional propulsion systems. This paper describes the Magnetohydrodynamic Augmented Propulsion Experiment (MAPX) being undertaken at NASA Marshall Space Flight Center (MSFC). In this experiment, a 1-MW arc heater is being used as a feeder for a 1-MW magnetohydrodynamic (MHD) accelerator. The purpose of the experiment is to demonstrate
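    The energy-to-velocity figures quoted in the abstract follow from the kinetic energy relation v = sqrt(2E/m). A quick numerical check (physical constants from standard data, not from the report):

```python
import math

EV = 1.602176634e-19           # joules per electronvolt
AMU = 1.66053906660e-27        # kilograms per atomic mass unit
M_H2O = 18.015 * AMU           # mass of one H2O molecule, kg

def exhaust_velocity(energy_ev, mass_kg):
    """Particle speed for a given kinetic energy: v = sqrt(2E/m)."""
    return math.sqrt(2.0 * energy_ev * EV / mass_kg)

def particle_energy_ev(velocity, mass_kg):
    """Inverse relation: E = m v^2 / 2, expressed in eV."""
    return 0.5 * mass_kg * velocity**2 / EV

# ~1.8 eV per H2O molecule (LOX/LH2) gives under 5 km/s:
print(exhaust_velocity(1.8, M_H2O) / 1000)   # approx 4.4 km/s
# a 15 km/s exhaust requires more than 20 eV per molecule:
print(particle_energy_ev(15e3, M_H2O))       # approx 21 eV
```

    Both results are consistent with the limits stated in the abstract: chemical binding energies cap exhaust velocity near 5 km/s, while useful payload fractions demand roughly an order of magnitude more energy per particle.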

  2. Investigation of Thrust Augmentation of a 1600-pound Thrust Centrifugal-flow-type Turbojet Engine by Injection of Refrigerants at Compressor Inlets

    NASA Technical Reports Server (NTRS)

    Jones, William L.; Dowman, Harry W.

    1947-01-01

    Investigations were conducted to determine the effectiveness of refrigerants in increasing the thrust of turbojet engines. Mixtures of water and alcohol were injected for a range of total flows up to 2.2 lb/sec. Kerosene was injected into the inlets covering a range of injected flows up to approximately 30% of normal engine fuel flow. Injection of 2.0 lb/sec of water alone produced an increase in thrust of 35.8% at rated engine conditions, and kerosene produced a negligible increase in thrust. Carbon dioxide increased thrust 23.5 percent.

  3. The efficacy of a tissue-engineered xenograft in conjunction with sodium hyaluronate carrier in maxillary sinus augmentation: a clinical study.

    PubMed

    Emam, H A; Behiri, G; El-Alaily, M; Sharawy, M

    2015-10-01

    PepGen P-15 Putty comprises anorganic bovine bone matrix (ABM) coupled with a synthetic cell-binding peptide, suspended in a sodium hyaluronate carrier. The P-15 portion exhibits a similar structure and properties to the cell-binding region of type I collagen. This study was performed to evaluate ABM/P-15 putty as the sole graft in sinus augmentation. Ten patients for whom both a sinus augmentation and two implants were indicated in the posterior maxilla were enrolled. Bone cores were harvested at 8 and 16 weeks, followed by placement of one implant at 8 weeks and the second at 16 weeks. Twenty collected bone cores were evaluated histologically and by micro-computed tomography. Results showed a significant increase (P<0.05) in bone mineral density at 8 weeks (0.70±0.13 g/cm³) and 16 weeks (0.97±0.08 g/cm³) in the graft compared to native (control) bone (0.04±0.02 g/cm³). There was no significant difference (P>0.05) in the percentage bone volume at the two time intervals (PBV 21.14±4.56 at 8 weeks and 26.33±5.60 at 16 weeks). The average increase in bone height at 16 weeks was 10.55±0.53 mm. It is concluded that PepGen P-15 Putty is capable of conducting and accelerating new bone formation and can successfully support dental implants. PMID:25998934

  4. Virtual Reality Enhanced Instructional Learning

    ERIC Educational Resources Information Center

    Nachimuthu, K.; Vijayakumari, G.

    2009-01-01

    Virtual Reality (VR) is a creation of virtual 3D world in which one can feel and sense the world as if it is real. It is allowing engineers to design machines and Educationists to design AV [audiovisual] equipment in real time but in 3-dimensional hologram as if the actual material is being made and worked upon. VR allows a least-cost (energy…

  5. Virtual and augmented medical imaging environments: enabling technology for minimally invasive cardiac interventional guidance.

    PubMed

    Linte, Cristian A; White, James; Eagleson, Roy; Guiraudon, Gérard M; Peters, Terry M

    2010-01-01

    Virtual and augmented reality environments have been adopted in medicine as a means to enhance the clinician's view of the anatomy and facilitate the performance of minimally invasive procedures. Their value is truly appreciated during interventions where the surgeon cannot directly visualize the targets to be treated, such as during cardiac procedures performed on the beating heart. These environments must accurately represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical tracking, and visualization technology in a common framework centered around the patient. This review begins with an overview of minimally invasive cardiac interventions, describes the architecture of a typical surgical guidance platform including imaging, tracking, registration and visualization, highlights both clinical and engineering accuracy limitations in cardiac image guidance, and discusses the translation of the work from the laboratory into the operating room together with typically encountered challenges. PMID:22275200
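    One building block named in the platform description above, registration between pre-operative image space and intra-operative tracker space, is often solved with point-based rigid registration. The sketch below uses the standard Kabsch/SVD least-squares solution on synthetic fiducials; it is a generic textbook method, not necessarily the specific algorithm used by these authors.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst
    points, via the Kabsch/SVD method on centered point clouds."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic fiducial markers: rotate + translate, then recover the transform.
rng = np.random.default_rng(0)
pts = rng.normal(size=(6, 3))
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -2.0, 5.0])
moved = pts @ R_true.T + t_true
R, t = rigid_register(pts, moved)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

    In a real guidance system the `src` points would be fiducials localized in the pre-operative image and `dst` the same fiducials touched with a tracked pointer; residual fiducial registration error then feeds the accuracy analysis the review discusses.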

  6. Morphine: Myths and Reality

    MedlinePlus

    ... and Families Take the Quiz Morphine: Myths and Reality February, 2013 The mere mention of “Morphine” can ... due to misinformation and lack of training. The reality is that Morphine (and other opiates that work ...

  7. Virtual Reality as Metaphor.

    ERIC Educational Resources Information Center

    Gozzi, Raymond, Jr.

    1996-01-01

    Suggests that virtual reality technology has become popular because it is a miniaturization, a model, of something that already exists. Compares virtual reality to the news media, which centers on the gory, the sensational, and the distorted. (PA)

  8. Equating of Augmented Subscores

    ERIC Educational Resources Information Center

    Sinharay, Sandip; Haberman, Shelby J.

    2011-01-01

    Recently, there has been an increasing level of interest in subscores for their potential diagnostic value. Haberman (2008b) suggested reporting an augmented subscore that is a linear combination of a subscore and the total score. Sinharay and Haberman (2008) and Sinharay (2010) showed that augmented subscores often lead to more accurate…
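    Haberman's augmented subscore is, at heart, a weighted combination of the observed subscore and the total score, with weights chosen to better estimate the true subscore. The simulation below illustrates the idea with an ordinary least-squares fit on synthetic data where the true subscore is known; Haberman's actual estimator derives the weights from reliability estimates rather than from the (unobservable) true scores, so this is a sketch of the principle, not his formula.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
true_sub = rng.normal(50, 10, n)            # latent true subscore
obs_sub = true_sub + rng.normal(0, 8, n)    # noisy observed subscore
# The total score correlates with the true subscore and is measured
# more reliably, so it carries usable information about the subscore.
total = 2.0 * true_sub + rng.normal(0, 6, n)

# Least-squares weights for predicting the true subscore from
# (1, observed subscore, total) -- a stand-in for Haberman's weighting.
X = np.column_stack([np.ones(n), obs_sub, total])
beta, *_ = np.linalg.lstsq(X, true_sub, rcond=None)
augmented = X @ beta

mse_sub = np.mean((obs_sub - true_sub) ** 2)
mse_aug = np.mean((augmented - true_sub) ** 2)
print(mse_aug < mse_sub)   # borrowing strength from the total lowers error
```

    The augmented estimate has markedly lower mean squared error than the raw subscore alone, which is the "more accurate" behavior the abstract reports.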

  9. Subfascial gluteal augmentation.

    PubMed

    de la Peña, J Abel; Rubio, Omar V; Cano, Jacobo P; Cedillo, Mariana C; Garcés, Miriam T

    2006-07-01

    Developing the concept of gluteal augmentation over the past 17 years has been an academic adventure. During these years my coworkers and I have progressively improved the surgical technique and devised an anatomical system for gluteal augmentation that includes an ideal implant design and templates to assist in evaluating patients in the preoperative period and to identify the most appropriate implant size. PMID:16818097

  10. Rhetoric as Reality Construction.

    ERIC Educational Resources Information Center

    Kneupper, Charles W.

    This essay provides an analytic development of a philosophy of rhetoric which focuses its concern on social reality. According to this philosophy, the activity of the human mind invents symbolic constructions of reality. Primary socialization is interpreted as a rhetorical process which tends to maintain prevailing reality constructions.…

  11. Augmenting reality in Direct View Optical (DVO) overlay applications

    NASA Astrophysics Data System (ADS)

    Hogan, Tim; Edwards, Tim

    2014-06-01

    The integration of overlay displays into rifle scopes can transform precision Direct View Optical (DVO) sights into intelligent interactive fire-control systems. Overlay displays can provide ballistic solutions within the sight for dramatically improved targeting, can fuse sensor video to extend targeting into nighttime or dirty battlefield conditions, and can overlay complex situational awareness information over the real-world scene. High brightness overlay solutions for dismounted soldier applications have previously been hindered by excessive power consumption, weight, and bulk, making them unsuitable for man-portable, battery-powered applications. This paper describes the advancements and capabilities of a high brightness, ultra-low power text and graphics overlay display module developed specifically for integration into DVO weapon sight applications. Central to the overlay display module was the development of a new general purpose low power graphics controller and dual-path display driver electronics. The graphics controller interface is a simple 2-wire RS-232 serial interface compatible with existing weapon systems such as the IBEAM ballistic computer and the RULR and STORM laser rangefinders (LRF). The module features include multiple graphics layers, user configurable fonts and icons, and parameterized vector rendering, making it suitable for general purpose DVO overlay applications. The module is configured for graphics-only operation for daytime use and overlays graphics with video for nighttime applications. The miniature footprint and ultra-low power consumption of the module enables a new generation of intelligent DVO systems and has been implemented for resolutions from VGA to SXGA, in monochrome and color, and in graphics applications with and without sensor video.
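    A 2-wire RS-232 command interface of the kind described is typically driven by a line-oriented ASCII protocol. The sketch below is purely hypothetical: the command name, framing, and checksum scheme are invented for illustration and are not the module's actual protocol.

```python
def frame_command(cmd, *args):
    """Frame an ASCII overlay command for a serial link as
    '$CMD,arg1,arg2*CK\\r\\n' with an NMEA-style XOR checksum.
    (Hypothetical framing -- not the real DVO module protocol.)"""
    body = ",".join([cmd] + [str(a) for a in args])
    checksum = 0
    for byte in body.encode("ascii"):
        checksum ^= byte
    return f"${body}*{checksum:02X}\r\n".encode("ascii")

# Example: place a range read-out at pixel (320, 240) on graphics layer 1.
msg = frame_command("TEXT", 1, 320, 240, "RNG 750m")
print(msg)
# A real deployment would write this to the serial port, e.g. with
# pyserial: serial.Serial("/dev/ttyS0", 115200).write(msg)
```

    Checksummed ASCII framing of this sort keeps the link robust on a slow 2-wire connection while remaining trivially parseable by a low-power graphics controller.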

  12. Stereoscopic augmented reality for da Vinci™ robotic biliary surgery

    PubMed Central

    Volonté, Francesco; Buchs, Nicolas C.; Pugin, François; Spaltenstein, Joël; Jung, Minoa; Ratib, Osman; Morel, Philippe

    2013-01-01

    INTRODUCTION New laparoscopic techniques put distance between the surgeon and his patient. PRESENTATION OF CASE 3D volume rendered images directly displayed in the da Vinci surgeon's console fill this gap by allowing the surgeon to immerse himself fully in the intervention. DISCUSSION During the robotic operation the surgeon has greater control over the procedure because he can stay more focused, not being obliged to turn his sight away from the operative field. Moreover, thanks to the depth perception of the rendered images, he has a precise view of important anatomical structures. CONCLUSION We describe our preliminary experience in the quest for computer-assisted robotic surgery. PMID:23466685

  13. Piloting Augmented Reality Technology to Enhance Realism in Clinical Simulation.

    PubMed

    Vaughn, Jacqueline; Lister, Michael; Shaw, Ryan J

    2016-09-01

    We describe a pilot study that incorporated an innovative hybrid simulation designed to increase the perception of realism in a high-fidelity simulation. Prelicensure students (N = 12) cared for a manikin in a simulation lab scenario wearing Google Glass, a wearable head device that projected video into the students' field of vision. Students reported that the simulation gave them confidence that they were developing skills and knowledge to perform necessary tasks in a clinical setting and that they met the learning objectives of the simulation. The video combined visual images and cues seen in a real patient and created a sense of realism the manikin alone could not provide. PMID:27258807

  14. Library Orientation Transformation: From Paper Map to Augmented Reality

    ERIC Educational Resources Information Center

    Mulch, Beth Ebenstein

    2014-01-01

    In this article, high school librarian Beth Ebenstein Mulch describes how she used iPads to introduce students at T. C. Williams High School (Alexandria, Virginia) to the school library. Her goal was for the library to come to life in front of new students and for them to learn from peers about all the great resources and services their library…

  15. The Future of Learning and Training in Augmented Reality

    ERIC Educational Resources Information Center

    Lee, Kangdon

    2012-01-01

    Students acquire knowledge and skills through different modes of instruction that include classroom lectures with textbooks, computers, and the like. The availability and choice of learning innovation depends on the individual's access to technologies and on the infrastructure environment of the surrounding community. In this rapidly changing…

  16. Latency and User Performance in Virtual Environments and Augmented Reality

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.

    2009-01-01

    System rendering latency has been recognized by senior researchers, such as Professor Fredrick Brooks of UNC (Turing Award 1999), as a major factor limiting the realism and utility of head-referenced display systems. Latency has been shown to reduce the user's sense of immersion within a virtual environment, disturb user interaction with virtual objects, and to contribute to motion sickness during some simulation tasks. Latency, however, is not just an issue for external display systems since finite nerve conduction rates and variation in transduction times in the human body's sensors also pose problems for latency management within the nervous system. Some of the phenomena arising from the brain's handling of sensory asynchrony due to latency will be discussed as a prelude to consideration of the effects of latency in interactive displays. The causes and consequences of the erroneous movement that appears in displays due to latency will be illustrated with examples of the user performance impact provided by several experiments. These experiments will review the generality of user sensitivity to latency when users judge either object or environment stability. Hardware and signal processing countermeasures will also be discussed. In particular, the tuning of a simple extrapolative predictive filter not using a dynamic movement model will be presented. Results show that it is possible to adjust this filter so that the appearance of some latencies may be hidden without the introduction of perceptual artifacts such as overshoot. Several examples of the effects on user performance will be illustrated by three-dimensional tracking and tracing tasks executed in virtual environments. These experiments demonstrate classic phenomena known from work on manual control and show the need for very responsive systems if they are intended to support precise manipulation. The practical benefits of removing interfering latencies from interactive systems will be emphasized with some classic final examples from surgical telerobotics and human-computer interaction.
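    An extrapolative predictor of the kind mentioned, one that uses no dynamic movement model, can be sketched as a simple velocity-based extrapolation over the two most recent tracker samples. The gain parameter and the 60 Hz example below are illustrative assumptions, not the filter actually tuned in the experiments.

```python
def extrapolate(samples, latency, gain=1.0):
    """Predict a tracked value `latency` seconds ahead by linearly
    extrapolating the two most recent (time, value) samples.
    gain < 1.0 under-compensates but reduces overshoot artifacts."""
    (t0, x0), (t1, x1) = samples[-2], samples[-1]
    velocity = (x1 - x0) / (t1 - t0)
    return x1 + gain * velocity * latency

# Head yaw sampled at 60 Hz while turning at 30 deg/s; hide 50 ms of
# display-pipeline latency by predicting ahead.
samples = [(0.000, 10.0), (1 / 60, 10.5)]
predicted = extrapolate(samples, latency=0.050)
print(predicted)   # approx 12.0 degrees
```

    Lowering `gain` below 1.0 is one way to trade residual latency for reduced overshoot, echoing the tuning result described in the abstract; real trackers also low-pass the velocity estimate because raw differencing amplifies sensor noise.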

  17. Participatory Scaling through Augmented Reality Learning through Local Games

    ERIC Educational Resources Information Center

    Martin, John; Dikkers, Seann; Squire, Kurt; Gagnon, David

    2014-01-01

    The proliferation of broadband mobile devices, which many students bring to school with them as mobile phones, makes the widespread adoption of AR pedagogies a possibility, but pedagogical, distribution, and training models are needed to make this innovation an integrated part of education. This paper employs Social Construction of Technology…

  18. Stirling engines

    SciTech Connect

    Reader, G.T.; Hooper

    1983-01-01

    The Stirling engine was invented by a Scottish clergyman in 1816, but fell into disuse with the coming of the diesel engine. Advances in materials science and the energy crisis have made a hot air engine economically attractive. Explanations are full and understandable. Includes coverage of the underlying thermodynamics and an interesting historical section. Topics include: Introduction to Stirling engine technology, Theoretical concepts--practical realities, Analysis, simulation and design, Practical aspects, Some alternative energy sources, Present research and development, Stirling engine literature.

  19. Correction of murine hemophilia A following nonmyeloablative transplantation of hematopoietic stem cells engineered to encode an enhanced human factor VIII variant using a safety-augmented retroviral vector

    PubMed Central

    Ramezani, Ali

    2009-01-01

    Insertional mutagenesis by retroviral vectors is a major impediment to the clinical application of hematopoietic stem cell gene transfer for the treatment of hematologic disorders. We recently developed an insulated self-inactivating gammaretroviral vector, RMSinOFB, which uses a novel enhancer-blocking element that significantly decreases genotoxicity of retroviral integration. In this study, we used the RMSinOFB vector to evaluate the efficacy of a newly bioengineered factor VIII (fVIII) variant (efVIII)—containing a combination of A1 domain point mutations (L303E/F309S) and an extended partial B domain for improved secretion plus A2 domain mutations (R484A/R489A/P492A) for reduced immunogenicity—toward successful treatment of murine hemophilia A. In cell lines, efVIII was secreted at up to 6-fold higher levels than an L303E/F309S A1 domain–only fVIII variant (sfVIIIΔB). Most important, when compared with a conventional gammaretroviral vector expressing sfVIIIΔB, lower doses of RMSin-efVIII-OFB–transduced hematopoietic stem cells were needed to generate comparable curative fVIII levels in hemophilia A BALB/c mice after reduced-intensity total body irradiation or nonmyeloablative chemotherapy conditioning regimens. These data suggest that the safety-augmented RMSin-efVIII-OFB platform represents an encouraging step in the development of a clinically appropriate gene addition therapy for hemophilia A. PMID:19470695

  20. Lessons from "A Really Useful Engine"[TM]: Using Thomas the Tank Engine[TM] to Examine the Relationship between Play as a Leading Activity, Imagination and Reality in Children's Contemporary Play Worlds

    ERIC Educational Resources Information Center

    Edwards, Susan

    2011-01-01

    This paper examines Vygotsky's conception of play as a leading activity in the contexts of children's contemporary play worlds. Commencing with an examination of the relationship between leading activities and the development of psychological functions, the paper moves into a consideration of the relationship between imagination and reality as a…