ARSC: Augmented Reality Student Card--An Augmented Reality Solution for the Education Field
ERIC Educational Resources Information Center
El Sayed, Neven A. M.; Zayed, Hala H.; Sharawy, Mohamed I.
2011-01-01
Augmented Reality (AR) is the technology of adding virtual objects to real scenes, enabling the addition of information that is missing in real life. As the lack of resources is a problem that can be solved through AR, this paper presents and explains the usage of AR technology; we introduce the Augmented Reality Student Card (ARSC) as an application of…
What is going on in augmented reality simulation in laparoscopic surgery?
Botden, Sanne M B I; Jakimowicz, Jack J
2009-08-01
To prevent unnecessary errors and adverse results of laparoscopic surgery, proper training is of paramount importance. A safe way to train surgeons in laparoscopic skills is simulation. Traditional box trainers are often used for this purpose; however, they lack objective assessment of performance. Virtual reality laparoscopic simulators assess performance, but lack realistic haptic feedback. Augmented reality (AR) combines a virtual reality (VR) setting with real physical materials, instruments, and feedback. This article presents the current developments in augmented reality laparoscopic simulation. PubMed searches were performed to identify articles regarding surgical simulation and augmented reality. Identified companies manufacturing an AR laparoscopic simulator received the same questionnaire referring to the features of the simulator. Seven simulators that fitted the definition of augmented reality were identified during the literature search. Five of the approached manufacturers returned a completed questionnaire, of which one simulator appeared to be VR and was therefore not applicable for this review. Several augmented reality simulators have been developed over the past few years and they are improving rapidly. We recommend the development of AR laparoscopic simulators for component tasks of procedural training. AR simulators should be implemented in current laparoscopic training curricula, in particular for laparoscopic suturing training.
Social Augmented Reality: Enhancing Context-Dependent Communication and Informal Learning at Work
ERIC Educational Resources Information Center
Pejoska, Jana; Bauters, Merja; Purma, Jukka; Leinonen, Teemu
2016-01-01
Our design proposal of social augmented reality (SoAR) grows from the observed difficulties of practical applications of augmented reality (AR) in workplace learning. In our research we investigated construction workers doing physical work in the field and analyzed the data using qualitative methods in various workshops. The challenges related to…
The Effect of Augmented Reality Applications in the Learning Process: A Meta-Analysis Study
ERIC Educational Resources Information Center
Ozdemir, Muzaffer; Sahin, Cavus; Arcagok, Serdar; Demir, M. Kaan
2018-01-01
Purpose: The aim of this research is to investigate the effect of Augmented Reality (AR) applications in the learning process. Problem: Research that determines the effectiveness of Augmented Reality (AR) applications in the learning process with different variables has not been encountered in national or international literature. Research…
Adaptive multimodal interaction in mobile augmented reality: A conceptual framework
NASA Astrophysics Data System (ADS)
Abidin, Rimaniza Zainal; Arshad, Haslina; Shukri, Saidatul A'isyah Ahmad
2017-10-01
Recently, Augmented Reality (AR) has become an emerging technology in many mobile applications. Mobile AR is defined as a medium for displaying information merged with the real-world environment, mapped to the augmented reality surroundings, in a single view. There are four main types of mobile augmented reality interfaces, and one of them is the multimodal interface. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework to illustrate the adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces and augmented reality. We analyzed the components of the previous frameworks and assessed which can be applied in mobile devices. Our framework can be used as a guide for designers and developers to develop a mobile AR application with an adaptive multimodal interface.
Augmented Reality, the Future of Contextual Mobile Learning
ERIC Educational Resources Information Center
Sungkur, Roopesh Kevin; Panchoo, Akshay; Bhoyroo, Nitisha Kirtee
2016-01-01
Purpose: This study aims to show the relevance of augmented reality (AR) in mobile learning for the 21st century. With AR, any real-world environment can be augmented by providing users with accurate digital overlays. AR is a promising technology that has the potential to encourage learners to explore learning materials from a totally new…
Learning Anatomy via Mobile Augmented Reality: Effects on Achievement and Cognitive Load
ERIC Educational Resources Information Center
Küçük, Sevda; Kapakin, Samet; Göktas, Yüksel
2016-01-01
Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the…
Virtual Reality and Augmented Reality in Plastic Surgery: A Review.
Kim, Youngjun; Kim, Hannah; Kim, Yong Oock
2017-05-01
Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.
Affordances of Augmented Reality in Science Learning: Suggestions for Future Research
ERIC Educational Resources Information Center
Cheng, Kun-Hung; Tsai, Chin-Chung
2013-01-01
Augmented reality (AR) is currently considered as having potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper firstly has identified two major approaches of utilizing AR technology in science education,…
ERIC Educational Resources Information Center
Sirakaya, Mustafa; Cakmak, Ebru Kilic
2018-01-01
This study aimed to test the impact of augmented reality (AR) use on student achievement and self-efficacy in vocational education and training. For this purpose, a marker-based AR application, called HardwareAR, was developed. HardwareAR provides information about characteristics of hardware components, ports and assembly. The research design was…
Current Status, Opportunities and Challenges of Augmented Reality in Education
ERIC Educational Resources Information Center
Wu, Hsin-Kai; Lee, Silvia Wen-Yu; Chang, Hsin-Yi; Liang, Jyh-Chong
2013-01-01
Although augmented reality (AR) has gained much research attention in recent years, the term AR was given different meanings by varying researchers. In this article, we first provide an overview of definitions, taxonomies, and technologies of AR. We argue that viewing AR as a concept rather than a type of technology would be more productive for…
ERIC Educational Resources Information Center
Akgün, Özcan Erkan; Istanbullu, Aslihan; Avci, Sirin Küçük
2017-01-01
Augmented reality (AR) is a technology that supplements existing reality with additional information, descriptions and helpful images, thereby ensuring that reality is perceived in a more qualified and well-rounded way. In this study, views and comments about problems, solutions and suggestions on using AR were gathered from…
The Influences of the 2D Image-Based Augmented Reality and Virtual Reality on Student Learning
ERIC Educational Resources Information Center
Liou, Hsin-Hun; Yang, Stephen J. H.; Chen, Sherry Y.; Tarng, Wernhuar
2017-01-01
Virtual reality (VR) learning environments can provide students with concepts of the simulated phenomena, but users are not allowed to interact with real elements. Conversely, augmented reality (AR) learning environments blend virtual content with real-world environments, so AR could enhance the effects of computer simulation and promote students' realistic experience.…
ERIC Educational Resources Information Center
Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco
2015-01-01
The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…
ERIC Educational Resources Information Center
Alizadeh, Mehrasa; Mehran, Parisa; Koguchi, Ichiro; Takemura, Haruo
2017-01-01
In recent years, there has been a burgeoning interest in Augmented Reality (AR) technologies, especially in educational settings to edutain (i.e. educate and entertain) students and engage them in their learning. This study reports the results of the use of an AR application called BlippAR to augment poster carousel tasks in a blended English…
NASA Astrophysics Data System (ADS)
Schmalstieg, Dieter; Langlotz, Tobias; Billinghurst, Mark
Augmented Reality (AR) was first demonstrated in the 1960s, but only recently have technologies emerged that can be used to easily deploy AR applications to many users. Camera-equipped cell phones with significant processing power and graphics abilities provide an inexpensive and versatile platform for AR applications, while the social networking technology of Web 2.0 provides a large-scale infrastructure for collaboratively producing and distributing geo-referenced AR content. This combination of widely used mobile hardware and Web 2.0 software allows the development of a new type of AR platform that can be used on a global scale. In this paper we describe the Augmented Reality 2.0 concept and present existing work on mobile AR and web technologies that could be used to create AR 2.0 applications.
Application of augmented reality for inferior alveolar nerve block anesthesia: A technical note.
Won, Yu-Jin; Kang, Sang-Hoon
2017-06-01
Efforts to apply augmented reality (AR) technology in the medical field include the introduction of AR techniques into dental practice. The present report introduces a simple method of applying AR during an inferior alveolar nerve block, a procedure commonly performed in dental clinics.
Intelligent Augmented Reality Training for Motherboard Assembly
ERIC Educational Resources Information Center
Westerfield, Giles; Mitrovic, Antonija; Billinghurst, Mark
2015-01-01
We investigate the combination of Augmented Reality (AR) with Intelligent Tutoring Systems (ITS) to assist with training for manual assembly tasks. Our approach combines AR graphics with adaptive guidance from the ITS to provide a more effective learning experience. We have developed a modular software framework for intelligent AR training…
ERIC Educational Resources Information Center
Chen, Yu-Hsuan; Wang, Chang-Hwa
2018-01-01
Although research has indicated that augmented reality (AR)-facilitated instruction improves learning performance, further investigation of the usefulness of AR from a psychological perspective has been recommended. Researchers consider presence a major psychological effect when users are immersed in virtual reality environments. However, most…
Augmenting the access grid using augmented reality
NASA Astrophysics Data System (ADS)
Li, Ying
2012-01-01
The Access Grid (AG) targets an advanced collaboration environment, with which a multi-party group of people from remote sites can collaborate over high-performance networks. However, the current AG still employs VIC (Video Conferencing Tool) to offer only pure video for remote communication, while most AG users expect to collaboratively refer to and manipulate the 3D geometric models of grid services' results in the live videos of an AG session. The Augmented Reality (AR) technique can overcome these deficiencies with its characteristics of combining the virtual and the real, real-time interaction and 3D registration, so it is necessary for the AG to utilize AR to better support the advanced collaboration environment. This paper introduces an effort to augment the AG by adding support for AR capability, which is encapsulated in the node service infrastructure and named the Augmented Reality Service (ARS). The ARS can merge the 3D geometric models of grid services' results and the real video scene of the AG into one AR environment, and provide the opportunity for distributed AG users to interactively and collaboratively participate in the AR environment with a better experience.
Using Augmented Reality Tools to Enhance Children's Library Services
ERIC Educational Resources Information Center
Meredith, Tamara R.
2015-01-01
Augmented reality (AR) has been used and documented for a variety of commercial and educational purposes, and the proliferation of mobile devices has increased the average person's access to AR systems and tools. However, little research has been done in the area of using AR to supplement traditional library services, specifically for patrons aged…
Get Real: Augmented Reality for the Classroom
ERIC Educational Resources Information Center
Mitchell, Rebecca; DeBay, Dennis
2012-01-01
Kids love augmented reality (AR) simulations because they are like real-life video games. AR simulations allow students to learn content while collaborating face to face and interacting with a multimedia-enhanced version of the world around them. Although the technology may seem advanced, AR software makes it easy to develop content-based…
ERIC Educational Resources Information Center
Cheng, Kun-Hung
2017-01-01
Since augmented reality (AR) has been increasingly applied in education recently, the investigation of students' learning experiences with AR could be helpful for educators to implement AR learning. With a quantitative survey using three questionnaires, this study explored the relationships among 153 students' perceived cognitive load, motivation,…
Constructing Liminal Blends in a Collaborative Augmented-Reality Learning Environment
ERIC Educational Resources Information Center
Enyedy, Noel; Danish, Joshua A.; DeLiema, David
2015-01-01
In vision-based augmented-reality (AR) environments, users view the physical world through a video feed or device that "augments" the display with a graphical or informational overlay. Our goal in this manuscript is to ask "how" and "why" these new technologies create opportunities for learning. We suggest that AR is…
Baus, Oliver; Bouchard, Stéphane
2014-01-01
This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and then merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed "safely" to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user's experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat non-small-animal phobias, such as social phobia. PMID:24624073
ERIC Educational Resources Information Center
Squires, David R.
2017-01-01
The structure of the literature review features the current trajectory of Augmented Reality in the field including the current literature detailing how Augmented Reality has been applied in educational environments; how Augmented Reality has been applied in training environments; how Augmented Reality has been used to measure cognition and the…
ERIC Educational Resources Information Center
Taçgin, Zeynep; Arslan, Ahmet
2017-01-01
The purpose of this study is to determine perception of postgraduate Computer Education and Instructional Technologies (CEIT) students regarding the concepts of Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), Augmented Virtuality (AV) and Mirror Reality; and to offer a table that includes differences and similarities between…
Augmented reality (AR) and virtual reality (VR) applied in dentistry.
Huang, Ta-Ko; Yang, Chi-Hsun; Hsieh, Yu-Hsin; Wang, Jen-Chyan; Hung, Chun-Cheng
2018-04-01
The OSCE is a reliable method for evaluating the preclinical examination of dental students, and the augmented reality simulator is an ideal tool for such assessment. This literature review investigated recent developments in virtual reality (VR) and augmented reality (AR), from the beginnings of dental history to the progress of dental skills. Because of earlier technological limitations, dentistry has had to depend on other devices to increase the success rate and decrease the risk of surgery. The development of tracking units has changed surgical and educational practice. Clinical surgery is based on mature education, and VR and AR simultaneously affect the skills taught in training lessons and in navigation systems. VR and AR are applied not only in dental training lessons and surgery, but are also improving many fields of everyday life. Copyright © 2018. Published by Elsevier Taiwan.
Augmented Reality-Based Simulators as Discovery Learning Tools: An Empirical Study
ERIC Educational Resources Information Center
Ibáñez, María-Blanca; Di-Serio, Ángela; Villarán-Molina, Diego; Delgado-Kloos, Carlos
2015-01-01
This paper reports empirical evidence on having students use AR-SaBEr, a simulation tool based on augmented reality (AR), to discover the basic principles of electricity through a series of experiments. AR-SaBEr was enhanced with knowledge-based support and inquiry-based scaffolding mechanisms, which proved useful for discovery learning in…
Visual Environment for Designing Interactive Learning Scenarios with Augmented Reality
ERIC Educational Resources Information Center
Mota, José Miguel; Ruiz-Rube, Iván; Dodero, Juan Manuel; Figueiredo, Mauro
2016-01-01
Augmented Reality (AR) technology allows the inclusion of virtual elements on a vision of the actual physical environment for the creation of a mixed reality in real time. This kind of technology can be used in educational settings. However, the current AR authoring tools present several drawbacks, such as the lack of a mechanism for tracking the…
ERIC Educational Resources Information Center
Wu, Po-Han; Hwang, Gwo-Jen; Yang, Mei-Ling; Chen, Chih-Hung
2018-01-01
Augmented reality (AR) offers potential advantages for intensifying environmental context awareness and augmenting students' experiences in real-world environments by dynamically overlapping digital materials with a real-world environment. However, some challenges to AR learning environments have been described, such as participants' cognitive…
Augmented Reality Comes to Physics
ERIC Educational Resources Information Center
Buesing, Mark; Cook, Michael
2013-01-01
Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as…
An acceptance model for smart glasses based tourism augmented reality
NASA Astrophysics Data System (ADS)
Obeidy, Waqas Khalid; Arshad, Haslina; Huang, Jiung Yao
2017-10-01
Recent mobile technologies have revolutionized the way people experience their environment. Although there is only limited research on users' acceptance of AR in the cultural tourism context, previous researchers have explored the opportunities of using augmented reality (AR) to enhance the user experience. Recent AR research lacks work that integrates dimensions specific to cultural tourism and the smart-glass context. Hence, this work proposes an AR acceptance model in the context of cultural heritage tourism and smart glasses capable of performing augmented reality. In this paper we aim to present an AR acceptance model to understand the AR usage behavior and visiting intention of tourists who use smart-glass-based AR at UNESCO cultural heritage destinations in Malaysia. Furthermore, this paper identifies information quality, technology readiness, visual appeal, and facilitating conditions as external variables and key factors influencing visitors' beliefs, attitudes and usage intention.
The Perception and Estimation of Egocentric Distance in Real and Augmented Reality Environments
2008-05-01
…perceptual issues with augmented reality technology that are essential for determining the usefulness of current augmented reality (AR) for training and performance…determine perceived distance. Subject terms: Augmented Environments, Augmented Reality, Dismounted Infantry, Training, Presence, distance perception.
Gunner Goggles: Implementing Augmented Reality into Medical Education.
Wang, Leo L; Wu, Hao-Hua; Bilici, Nadir; Tenney-Soeiro, Rebecca
2016-01-01
There is evidence that both smartphone and tablet integration into medical education has been lacking. At the same time, there is a niche for augmented reality (AR) to improve this process through the enhancement of textbook learning. Gunner Goggles is an attempt to enhance textbook learning in shelf exam preparatory review with augmented reality. Here we describe our initial prototype and detail the process by which augmented reality was implemented into our textbook through Layar. We describe the unique functionalities of our textbook pages upon augmented reality implementation, which includes links, videos and 3D figures, and surveyed 24 third year medical students for their impression of the technology. Upon demonstrating an initial prototype textbook chapter, 100% (24/24) of students felt that augmented reality improved the quality of our textbook chapter as a learning tool. Of these students, 92% (22/24) agreed that their shelf exam review was inadequate and 19/24 (79%) felt that a completed Gunner Goggles product would have been a viable alternative to their shelf exam review. Thus, while students report interest in the integration of AR into medical education test prep, future investigation into how the use of AR can improve performance on exams is warranted.
ERIC Educational Resources Information Center
Cheng, Kun-Hung; Tsai, Chin-Chung
2016-01-01
Following a previous study (Cheng & Tsai, 2014. "Computers & Education"), this study aimed to probe the interaction of child-parent shared reading with the augmented reality (AR) picture book in more depth. A series of sequential analyses were thus conducted to infer the behavioral transition diagrams and visualize the continuity…
Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders
Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe
2015-01-01
Augmented Reality is a new technological system that allows virtual contents to be introduced into the real world so that they run in the same representation and, in real time, enhance the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to the physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has provided preliminary evidence of being a useful tool due to its adaptability to the patient's needs and therapeutic purposes, and to its interactivity. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews the recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. PMID:26339283
Augmented Reality Comes to Physics
NASA Astrophysics Data System (ADS)
Buesing, Mark; Cook, Michael
2013-04-01
Augmented reality (AR) is a technology used on computing devices where processor-generated graphics are rendered over real objects to enhance the sensory experience in real time. In other words, what you are really seeing is augmented by the computer. Many AR games already exist for systems such as Kinect and Nintendo 3DS and mobile apps, such as Tagwhat and Star Chart (a must for astronomy class). The yellow line marking first downs in a televised football game and the enhanced puck that makes televised hockey easier to follow both use augmented reality to do the job.
Examining Young Children's Perception toward Augmented Reality-Infused Dramatic Play
ERIC Educational Resources Information Center
Han, Jeonghye; Jo, Miheon; Hyun, Eunja; So, Hyo-jeong
2015-01-01
Amid the increasing interest in applying augmented reality (AR) in educational settings, this study explores the design and enactment of an AR-infused robot system to enhance children's satisfaction and sensory engagement with dramatic play activities. In particular, we conducted an exploratory study to empirically examine children's perceptions…
An Augmented-Reality-Based Concept Map to Support Mobile Learning for Science
ERIC Educational Resources Information Center
Chen, Chien-Hsu; Chou, Yin-Yu; Huang, Chun-Yen
2016-01-01
Computer hardware and mobile devices have developed rapidly in recent years, and augmented reality (AR) technology has been increasingly applied in mobile learning. Although instructional AR applications have yielded satisfactory results and prompted students' curiosity and interest, a number of problems remain. The crucial topic for AR…
AR Feels "Softer" than VR: Haptic Perception of Stiffness in Augmented versus Virtual Reality.
Gaffary, Yoren; Le Gouis, Benoit; Marchal, Maud; Argelaguet, Ferran; Arnaldi, Bruno; Lecuyer, Anatole
2017-11-01
Does it feel the same when you touch an object in Augmented Reality (AR) or in Virtual Reality (VR)? In this paper we study and compare the haptic perception of stiffness of a virtual object in two situations: (1) a purely virtual environment versus (2) a real and augmented environment. We have designed an experimental setup based on a Microsoft HoloLens and a haptic force-feedback device, enabling participants to press a virtual piston and compare its stiffness successively in either Augmented Reality (the virtual piston is surrounded by several real objects all located inside a cardboard box) or in Virtual Reality (the same virtual piston is displayed in a fully virtual scene composed of the same other objects). We have conducted a psychophysical experiment with 12 participants. Our results show a surprising bias in perception between the two conditions. The virtual piston is on average perceived as stiffer in the VR condition compared to the AR condition. For instance, when the piston had the same stiffness in AR and VR, participants would select the VR piston as the stiffer one in 60% of cases. This suggests a psychological effect, as if objects in AR felt "softer" than in pure VR. Taken together, our results open new perspectives on perception in AR versus VR, and pave the way to future studies aiming at characterizing potential perceptual biases.
ERIC Educational Resources Information Center
Tsai, Chia-Wen; Shen, Pei-Di; Fan, Ya-Ting
2014-01-01
In this paper, the authors reviewed the empirical augmented reality (AR) and online education studies, and those focused on the design or development of AR to help students learn, published in SSCI, SCI-EXPANDED, and A&HCI journals from 2003 to 2012. The authors in this study found that the number of AR and online education studies has…
Trends in Educational Augmented Reality Studies: A Systematic Review
ERIC Educational Resources Information Center
Sirakaya, Mustafa; Alsancak Sirakaya, Didem
2018-01-01
This study aimed to identify the trends in the studies conducted on Educational Augmented Reality (AR). 105 articles found in ERIC, EBSCOhost and ScienceDirect databases were reviewed with this purpose in mind. Analyses displayed that the number of educational AR studies has increased over the years. Quantitative methods were mostly preferred in…
ARTutor--An Augmented Reality Platform for Interactive Distance Learning
ERIC Educational Resources Information Center
Lytridis, Chris; Tsinakos, Avgoustos; Kazanidis, Ioannis
2018-01-01
Augmented Reality (AR) has been used in various contexts in recent years in order to enhance user experiences in mobile and wearable devices. Various studies have shown the utility of AR, especially in the field of education, where it has been observed that learning results are improved. However, such applications require specialized teams of…
The Design of Immersive English Learning Environment Using Augmented Reality
ERIC Educational Resources Information Center
Li, Kuo-Chen; Chen, Cheng-Ting; Cheng, Shein-Yung; Tsai, Chung-Wei
2016-01-01
The study uses augmented reality (AR) technology to integrate virtual objects into the real learning environment for language learning. The English AR classroom is constructed using the system prototyping method and evaluated by semi-structured in-depth interviews. According to the flow theory by Csikszentmihalyi in 1975, along with the immersive…
Using Augmented Reality to Support a Software Editing Course for College Students
ERIC Educational Resources Information Center
Wang, Y.-H.
2017-01-01
This study aimed to explore whether integrating augmented reality (AR) techniques could support a software editing course and to examine the different learning effects for students using online-based and AR-based blended learning strategies. The researcher adopted a comparative research approach with a total of 103 college students participating…
3D interactive augmented reality-enhanced digital learning systems for mobile devices
NASA Astrophysics Data System (ADS)
Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie
2013-03-01
With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology. Enhancement of UX can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems. However, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, the 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components, including the markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. Realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX of digital learning can be greatly improved with the adoption of the proposed IARL systems.
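To illustrate the velocity-based object tracking (VOT) idea mentioned in this abstract, here is a minimal sketch assuming 2D pattern centroids extracted per video frame; the class and variable names are hypothetical and this is not the authors' implementation.

```python
# Minimal sketch (not the paper's code): constant-velocity prediction for
# tracking a recognized pattern's 2D centroid across frames, the kind of
# step a velocity-based object tracker might perform.
import numpy as np

class VelocityTracker:
    def __init__(self):
        self.position = None          # last known centroid (x, y)
        self.velocity = np.zeros(2)   # pixels per frame

    def update(self, observed):
        """Update the state with a newly observed centroid, or None if lost."""
        if observed is not None:
            observed = np.asarray(observed, dtype=float)
            if self.position is not None:
                self.velocity = observed - self.position
            self.position = observed
        elif self.position is not None:
            # Pattern not found this frame: coast along the last velocity.
            self.position = self.position + self.velocity
        return self.position

tracker = VelocityTracker()
for centroid in [(100, 80), (104, 82), None, (112, 86)]:
    print(tracker.update(centroid))
```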
Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality
NASA Astrophysics Data System (ADS)
Wolfe, Christopher; Smith, J. David; Phillips, W. Greg; Graham, T. C. Nicholas
Augmented reality systems often involve collaboration among groups of people. While there are numerous toolkits that aid the development of such augmented reality groupware systems (e.g., ARToolkit and Groupkit), there remains an enormous gap between the specification of an AR groupware application and its implementation. In this chapter, we present Fiia, a toolkit which simplifies the development of collaborative AR applications. Developers specify the structure of their applications using the Fiia modeling language, which abstracts details of networking and provides high-level support for specifying adapters between the physical and virtual world. The Fiia.Net runtime system then maps this conceptual model to a runtime implementation. We illustrate Fiia via Raptor, an augmented reality application used to help small groups collaboratively prototype video games.
Usability Evaluation of an Augmented Reality System for Teaching Euclidean Vectors
ERIC Educational Resources Information Center
Martin-Gonzalez, Anabel; Chi-Poot, Angel; Uc-Cetina, Victor
2016-01-01
Augmented reality (AR) is one of the emerging technologies that has demonstrated to be an efficient technological tool to enhance learning techniques. In this paper, we describe the development and evaluation of an AR system for teaching Euclidean vectors in physics and mathematics. The goal of this pedagogical tool is to facilitate user's…
Making the Invisible Visible in Science Museums through Augmented Reality Devices
ERIC Educational Resources Information Center
Yoon, Susan A.; Wang, Joyce
2014-01-01
Despite the potential of augmented reality (AR) in enabling students to construct new understanding, little is known about how the processes and interactions with the multimedia lead to increased learning. This study seeks to explore the affordances of an AR tool on learning that is focused on the science concept of magnets and magnetic fields.…
ERIC Educational Resources Information Center
Bressler, D. M.; Bodzin, A. M.
2013-01-01
Current studies have reported that secondary students are highly engaged while playing mobile augmented reality (AR) learning games. Some researchers have posited that players' engagement may indicate a flow experience, but no research results have confirmed this hypothesis with vision-based AR learning games. This study investigated factors…
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
Augmented Reality Learning Experiences: Survey of Prototype Design and Evaluation
ERIC Educational Resources Information Center
Santos, Marc Ericson C.; Chen, Angie; Taketomi, Takafumi; Yamamoto, Goshiro; Miyazaki, Jun; Kato, Hirokazu
2014-01-01
Augmented reality (AR) technology is mature for creating learning experiences for K-12 (pre-school, grade school, and high school) educational settings. We reviewed the applications intended to complement traditional curriculum materials for K-12. We found 87 research articles on augmented reality learning experiences (ARLEs) in the IEEE Xplore…
Augmented Reality Trends in Education: A Systematic Review of Research and Applications
ERIC Educational Resources Information Center
Bacca, Jorge; Baldiris, Silvia; Fabregat, Ramon; Graf, Sabine; Kinshuk
2014-01-01
In recent years, there has been an increasing interest in applying Augmented Reality (AR) to create unique educational settings. So far, however, there is a lack of review studies with focus on investigating factors such as: the uses, advantages, limitations, effectiveness, challenges and features of augmented reality in educational settings.…
Augmented Reality and Mobile Learning: The State of the Art
ERIC Educational Resources Information Center
FitzGerald, Elizabeth; Ferguson, Rebecca; Adams, Anne; Gaved, Mark; Mor, Yishay; Thomas, Rhodri
2013-01-01
In this paper, the authors examine the state of the art in augmented reality (AR) for mobile learning. Previous work in the field of mobile learning has included AR as a component of a wider toolkit but little has been done to discuss the phenomenon in detail or to examine in a balanced fashion its potential for learning, identifying both positive…
The Use of Augmented Reality in Formal Education: A Scoping Review
ERIC Educational Resources Information Center
Saltan, Fatih; Arslan, Ömer
2017-01-01
Augmented Reality (AR) is recognized as one of the most important developments in educational technology for both higher and K-12 education as emphasized in Horizon report (Johnson et al., 2016, 2015). Furthermore, AR is expected to achieve widespread adoption that will take two to three years in higher education and four to five years in K-12…
Investigating the Role of Augmented Reality Technology in the Language Classroom
ERIC Educational Resources Information Center
Solak, Ekrem; Cakir, Recep
2016-01-01
The purpose of this study was to inform about some of the current applications and literature on Augmented Reality (AR) technology in education and to present experimental data about the effectiveness of AR application in a language classroom at the elementary level in Turkey. The research design of the study was quasi-experimental. Sixty-one 5th…
ERIC Educational Resources Information Center
Hsiao, Hsien-Sheng; Chang, Cheng-Sian; Lin, Chien-Yu; Wang, Yau-Zng
2016-01-01
This study focused on how to enhance the interactivity and usefulness of augmented reality (AR) by integrating manipulative interactive tools with a real-world environment. A manipulative AR (MAR) system, which included 3D interactive models and manipulative aids, was designed and developed to teach the unit "Understanding Weather" in a…
The Viability and Value of Student- and Teacher-Created Augmented Reality Experiences
ERIC Educational Resources Information Center
O'Shea, Patrick; Curry-Corcoran, Daniel
2013-01-01
This paper describes the process and results of a project to incorporate Augmented Reality (AR) technologies and pedagogical approaches into a Virginian elementary school. The process involved training 5th grade teachers on the design and production of narrative-based AR games in order to give them the skills that they could then pass on to their…
Augmented Reality for Close Quarters Combat
None
2018-01-16
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). Umbra modeling and simulation environment is used to integrate and control the AR system.
Augmenting Your Own Reality: Student Authoring of Science-Based Augmented Reality Games
ERIC Educational Resources Information Center
Klopfer, Eric; Sheldon, Josh
2010-01-01
Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent…
Potential Use of Augmented Reality in LIS Education
ERIC Educational Resources Information Center
Wójcik, Magdalena
2016-01-01
The subject of this article is the use of augmented reality technology in library and information science education. The aim is to determine the scope and potential uses of augmented reality in the education of information professionals. In order to determine the scope and forms of potential use of AR technology in LIS education a two-step…
Lee, Jin; Yoo, Ha-Na; Lee, Byoung-Hee
2017-09-01
[Purpose] To determine the effect of augmented reality (AR)-based Otago exercise on muscle strength, balance, and physical factors related to falls in elderly women. [Subjects and Methods] Thirty subjects were randomly assigned to an AR group (AR, n=10), a yoga group (yoga, n=10), and a self-exercise group (self, n=10). For 12 weeks, these groups were given lessons three times a week (60 minutes each time) in AR-based Otago exercise including strengthening and balance training, in yoga, or in self-exercise using an elastic band exercise program. [Results] Knee flexion and ankle dorsiflexion strength were significantly improved in all three groups (AR, yoga, and self-exercise groups). Regarding balance, eyes-open center of pressure-x (EO CoP-x) was significantly decreased in the AR group and the yoga group. However, eyes-closed CoP-x, eyes-open standard deviation-x (EO SD-x), and eyes-open height of ellipse (EO HoE) were only significantly decreased in the AR group. The AR group also showed meaningfully improved results on the Morse Fall Scale. [Conclusion] Augmented reality-based Otago exercise can improve muscle strength, balance, and physical factors in elderly women to prevent falls.
Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery.
Pelargos, Panayiotis E; Nagasawa, Daniel T; Lagman, Carlito; Tenn, Stephen; Demos, Joanna V; Lee, Seung J; Bui, Timothy T; Barnette, Natalie E; Bhatt, Nikhilesh S; Ung, Nolan; Bari, Ausaf; Martin, Neil A; Yang, Isaac
2017-01-01
Neurosurgery has undergone a technological revolution over the past several decades, from trephination to image-guided navigation. Advancements in virtual reality (VR) and augmented reality (AR) represent some of the newest modalities being integrated into neurosurgical practice and resident education. In this review, we present a historical perspective of the development of VR and AR technologies, analyze its current uses, and discuss its emerging applications in the field of neurosurgery. Copyright © 2016 Elsevier Ltd. All rights reserved.
Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality.
Grubert, Jens; Langlotz, Tobias; Zollmann, Stefanie; Regenbrecht, Holger
2017-06-01
Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. While it has been researched for decades, Augmented Reality has more recently moved out of the research labs and into the field. While most applications are used sporadically and for one particular task only, current and future scenarios will provide a continuous and multi-purpose user experience. Therefore, in this paper, we present the concept of Pervasive Augmented Reality, aiming to provide such an experience by sensing the user's current context and adapting the AR system based on the changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality, which classifies context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges for future research directions in Pervasive Augmented Reality.
Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory
ERIC Educational Resources Information Center
Andujar, J. M.; Mejias, A.; Marquez, M. A.
2011-01-01
Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and are unsuited to completely non-classroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…
ERIC Educational Resources Information Center
Safar, Ammar H.; Al-Jafar, Ali A.; Al-Yousefi, Zainab H.
2017-01-01
This experimental research study scrutinized the effectiveness of using augmented reality (AR) applications (apps) as a teaching and learning tool when instructing kindergarten children in the English alphabet in the State of Kuwait. The study compared two groups: (a) experimental, taught using AR apps, and (b) control, taught using traditional…
ERIC Educational Resources Information Center
Perez-Lopez, David; Contero, Manuel
2013-01-01
This paper presents a study to analyze the use of augmented reality (AR) for delivering multimedia content to support the teaching and learning process of the digestive and circulatory systems at the primary school level, and its impact on knowledge retention. Our AR application combines oral explanations and 3D models and animations of anatomical…
ERIC Educational Resources Information Center
Cheng, Kun-Hung
2017-01-01
With the increasing attention to the role of parents in children's learning, what issues parents consider and how they behave when learning with their children when confronted with the emerging augmented reality (AR) technology may be worth exploring. This study was therefore conducted to qualitatively understand parents' conceptions of AR…
Augmented reality in neurovascular surgery: feasibility and first uses in the operating room.
Kersten-Oertel, Marta; Gerard, Ian; Drouin, Simon; Mok, Kelvin; Sirhan, Denis; Sinclair, David S; Collins, D Louis
2015-11-01
The aim of this report is to present a prototype augmented reality (AR) intra-operative brain imaging system. We present our experience of using this new neuronavigation system in neurovascular surgery and discuss the feasibility of this technology for aneurysms, arteriovenous malformations (AVMs), and arteriovenous fistulae (AVFs). We developed an augmented reality system that uses an external camera to capture the live view of the patient on the operating room table and to merge this view with pre-operative volume-rendered vessels. We have extensively tested the system in the laboratory and have used the system in four surgical cases: one aneurysm, two AVMs and one AVF case. The developed AR neuronavigation system allows for precise patient-to-image registration and calibration of the camera, resulting in a well-aligned augmented reality view. Initial results suggest that augmented reality is useful for tailoring craniotomies, localizing vessels of interest, and planning resection corridors. Augmented reality is a promising technology for neurovascular surgery. However, for more complex anomalies such as AVMs and AVFs, better visualization techniques that allow one to distinguish between arteries and veins and determine the absolute depth of a vessel of interest are needed.
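Patient-to-image registration of the kind mentioned above is commonly computed from paired anatomical landmarks; the following is a minimal sketch of a standard SVD-based rigid (Kabsch) alignment, shown purely as an illustration of the general technique rather than the authors' system.

```python
# Sketch of paired-point rigid registration (Kabsch/Procrustes), a standard
# way to align pre-operative image landmarks to patient-space landmarks.
# Illustrative only; landmark coordinates below are made up.
import numpy as np

def rigid_register(image_pts, patient_pts):
    """Return rotation R and translation t mapping image_pts onto patient_pts."""
    P = np.asarray(image_pts, dtype=float)
    Q = np.asarray(patient_pts, dtype=float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                                   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Toy check: patient landmarks are a rotated and translated copy of image landmarks.
img = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
theta = np.deg2rad(30)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
pat = img @ Rz.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_register(img, pat)
print(np.allclose(img @ R.T + t, pat))  # True
```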
Webizing mobile augmented reality content
NASA Astrophysics Data System (ADS)
Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun
2014-01-01
This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.
Location-Based Learning through Augmented Reality
ERIC Educational Resources Information Center
Chou, Te-Lien; Chanlin, Lih-Juan
2014-01-01
A context-aware and mixed-reality exploring tool can not only effectively provide an information-rich environment to users, but also allow them to quickly utilize useful resources and enhance environmental awareness. This study integrates Augmented Reality (AR) technology into smartphones to create a stimulating learning experience at a university…
Choi, Hyunseok; Cho, Byunghyun; Masamune, Ken; Hashizume, Makoto; Hong, Jaesung
2016-03-01
Depth perception is a major issue in augmented reality (AR)-based surgical navigation. We propose an AR and virtual reality (VR) switchable visualization system with distance information, and evaluate its performance in a surgical navigation set-up. To improve depth perception, seamless switching from AR to VR was implemented. In addition, the minimum distance between the tip of the surgical tool and the nearest organ was provided in real time. To evaluate the proposed techniques, five physicians and 20 non-medical volunteers participated in experiments. Targeting error, time taken, and numbers of collisions were measured in simulation experiments. There was a statistically significant difference between a simple AR technique and the proposed technique. We confirmed that depth perception in AR could be improved by the proposed seamless switching between AR and VR, and providing an indication of the minimum distance also facilitated the surgical tasks. Copyright © 2015 John Wiley & Sons, Ltd.
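The real-time minimum-distance cue described above can be approximated by querying the tracked tool-tip position against a point sampling of the organ surface; below is a minimal sketch using a k-d tree, with made-up geometry and no claim to match the authors' navigation software.

```python
# Sketch: nearest distance from a tracked tool tip to an organ surface,
# approximated by a k-d tree over points sampled from the organ model.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Stand-in organ surface: points on a sphere of radius 30 mm centered at (0, 0, 80).
dirs = rng.normal(size=(5000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
organ_surface = dirs * 30.0 + np.array([0.0, 0.0, 80.0])

tree = cKDTree(organ_surface)

def min_distance(tool_tip_mm):
    """Minimum distance (mm) from the tool tip to the sampled organ surface."""
    dist, _ = tree.query(np.asarray(tool_tip_mm, dtype=float))
    return dist

print(round(min_distance([0.0, 0.0, 40.0]), 1))  # tip is roughly 10 mm away
```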
Markman, Adam; Shen, Xin; Hua, Hong; Javidi, Bahram
2016-01-15
An augmented reality (AR) smartglass display combines real-world scenes with digital information enabling the rapid growth of AR-based applications. We present an augmented reality-based approach for three-dimensional (3D) optical visualization and object recognition using axially distributed sensing (ADS). For object recognition, the 3D scene is reconstructed, and feature extraction is performed by calculating the histogram of oriented gradients (HOG) of a sliding window. A support vector machine (SVM) is then used for classification. Once an object has been identified, the 3D reconstructed scene with the detected object is optically displayed in the smartglasses allowing the user to see the object, remove partial occlusions of the object, and provide critical information about the object such as 3D coordinates, which are not possible with conventional AR devices. To the best of our knowledge, this is the first report on combining axially distributed sensing with 3D object visualization and recognition for applications to augmented reality. The proposed approach can have benefits for many applications, including medical, military, transportation, and manufacturing.
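The recognition pipeline described here (HOG features over a sliding window, classified with an SVM) follows a standard pattern; the sketch below shows that flow with scikit-image and scikit-learn, using random placeholder training windows rather than the authors' data.

```python
# Sketch of the HOG + sliding-window + SVM recognition step described above.
# Training data are random placeholders; in practice labeled object/background
# windows from the reconstructed scene would be used.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

WIN = 64  # square window size in pixels

def hog_vector(window):
    return hog(window, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Placeholder training set: 20 "object" and 20 "background" windows.
rng = np.random.default_rng(1)
X = np.array([hog_vector(rng.random((WIN, WIN))) for _ in range(40)])
y = np.array([1] * 20 + [0] * 20)
clf = LinearSVC(C=1.0).fit(X, y)

def detect(image, step=16):
    """Slide a WIN x WIN window over a grayscale image and return hit positions."""
    hits = []
    for r in range(0, image.shape[0] - WIN + 1, step):
        for c in range(0, image.shape[1] - WIN + 1, step):
            feat = hog_vector(image[r:r + WIN, c:c + WIN])
            if clf.predict([feat])[0] == 1:
                hits.append((r, c))
    return hits

print(len(detect(rng.random((128, 128)))))
```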
Game-Based Evacuation Drill Using Augmented Reality and Head-Mounted Display
ERIC Educational Resources Information Center
Kawai, Junya; Mitsuhara, Hiroyuki; Shishibori, Masami
2016-01-01
Purpose: Evacuation drills should be more realistic and interactive. Focusing on situational and audio-visual realities and scenario-based interactivity, the authors have developed a game-based evacuation drill (GBED) system that presents augmented reality (AR) materials on tablet computers. The paper's current research purpose is to improve…
ERIC Educational Resources Information Center
Hwang, Gwo-Jen; Wu, Po-Han; Chen, Chi-Chang; Tu, Nien-Ting
2016-01-01
Augmented reality (AR) has been recognized as a potential technology to help students link what they are observing in the real world to their prior knowledge. One of the most challenging issues of AR-based learning is the provision of effective strategy to help students focus on what they need to observe in the field. In this study, a competitive…
Augmented REality Sandtables (ARESs) Impact on Learning
2016-07-01
Disclaimers: The findings in this report are not to be construed as an official Department of the Army position unless so designated by other authorized... The use of augmented reality (AR) to supplement training tools, specifically sand tables, can produce highly effective systems at relatively low... engagement and enhanced-scenario customization. The Augmented REality Sandtable (ARES) is projected to enhance training and retention of spatial
NASA Astrophysics Data System (ADS)
Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng
2010-08-01
In recent years, Augmented Reality (AR)[1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in fields previously dominated by Virtual Reality (VR), such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment and the arts. AR can enhance the display output of a real environment with specific interactive functions or object recognition. It can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering and remote robot control, and it has several advantages over VR. The system developed here combines sensors, software and imaging algorithms to give users a sense of a real, existing environment. The imaging algorithms include a gray-level method, an image binarization method and a white balance method, in order to achieve accurate image recognition and overcome the effects of lighting.
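The three imaging steps named in the abstract can be sketched roughly as below. The abstract does not give the exact formulas, so the luminance weights, fixed threshold and gray-world model here are assumptions standing in for the authors' gray-level, binarization and white-balance methods.

```python
# Illustrative gray-level conversion, binarization and gray-world white balance (assumptions).
import numpy as np

def to_gray(rgb):
    """Luminance-weighted gray-level image from an RGB array scaled to [0, 1]."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(gray, threshold=0.5):
    """Fixed-threshold binarization; a real system might pick the threshold adaptively (e.g. Otsu)."""
    return (gray >= threshold).astype(np.uint8)

def gray_world_white_balance(rgb):
    """Scale each channel so its mean matches the overall mean (gray-world assumption)."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    return np.clip(rgb * (means.mean() / means), 0.0, 1.0)

frame = np.random.default_rng(2).random((120, 160, 3))      # stand-in for a camera frame
mask = binarize(to_gray(gray_world_white_balance(frame)))    # binary mask used for recognition
print(mask.shape, mask.dtype, round(float(mask.mean()), 2))
```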
Augmenting your own reality: student authoring of science-based augmented reality games.
Klopfer, Eric; Sheldon, Josh
2010-01-01
Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent games, TimeLab 2100, players role-play citizens of the early 22nd century when global climate change is out of control. Through AR, they see their community as it might be nearly one hundred years in the future. TimeLab and other similar AR games balance location specificity and portability--they are games that are tied to a location and games that are movable from place to place. Focusing students on developing their own AR games provides the best of both virtual and physical worlds: a more portable solution that deeply connects young people to their own surroundings. A series of initiatives has focused on technical and pedagogical solutions to supporting students authoring their own games.
Augmented reality in medical education?
Kamphuis, Carolien; Barsom, Esther; Schijven, Marlies; Christoph, Noor
2014-09-01
Learning in the medical domain is to a large extent workplace learning and involves mastery of complex skills that require performance up to professional standards in the work environment. Since training in this real-life context is not always possible for reasons of safety, costs, or didactics, alternative ways are needed to achieve clinical excellence. Educational technology and more specifically augmented reality (AR) has the potential to offer a highly realistic situated learning experience supportive of complex medical learning and transfer. AR is a technology that adds virtual content to the physical real world, thereby augmenting the perception of reality. Three examples of dedicated AR learning environments for the medical domain are described. Five types of research questions are identified that may guide empirical research into the effects of these learning environments. Up to now, empirical research mainly appears to focus on the development, usability and initial implementation of AR for learning. Limited review results reflect the motivational value of AR, its potential for training psychomotor skills and the capacity to visualize the invisible, possibly leading to enhanced conceptual understanding of complex causality.
ERIC Educational Resources Information Center
Richardson, R. Thomas; Sammons, Dotty; Del-Parte, Donna
2018-01-01
This study compared learning performance during and following AR and non-AR topographic map instruction and practice. Two-way ANOVA testing indicated no significant differences on a posttest assessment between map type and spatial ability. Prior learning activity results revealed a significant performance difference between AR and non-AR treatment…
Virtual reality, augmented reality…I call it i-Reality.
Grossmann, Rafael J
2015-01-01
The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients.
Borrel, Alexandre; Fourches, Denis
2017-12-01
There is a growing interest for the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education purposes. However, preparing 3D models ready to use for AR and VR is time-consuming and requires a technical expertise that severely limits the development of new contents of potential interest for structural biologists, medicinal chemists, molecular modellers and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high quality 3D models directly compatible for AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful to universally call and anchor that particular 3D model when used in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR contents for bioinformatics and cheminformatics applications. http://www.realityconvert.com. dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Augmented reality for breast imaging.
Rancati, Alberto; Angrigiani, Claudio; Nava, Maurizio B; Catanuto, Giuseppe; Rocco, Nicola; Ventrice, Fernando; Dorr, Julio
2018-06-01
Augmented reality (AR) enables the superimposition of virtual reality reconstructions onto clinical images of a real patient, in real time. This allows visualization of internal structures through overlying tissues, thereby providing a virtual transparency view of surgical anatomy. AR has been applied to neurosurgery, which utilizes a relatively fixed space, frames, and bony references; the application of AR facilitates the relationship between virtual and real data. Augmented breast imaging (ABI) is described. Breast MRI studies for breast implant patients with seroma were performed using a Siemens 3T system with a body coil and a four-channel bilateral phased-array breast coil as the transmitter and receiver, respectively. Gadolinium was injected as a contrast agent (0.1 mmol/kg at 2 mL/s) using a programmable power injector. DICOM-formatted image data from 10 MRI cases of breast implant seroma and 10 MRI cases with T1-2 N0 M0 breast cancer were imported and transformed into augmented reality images. ABI demonstrated stereoscopic depth perception, focal point convergence, 3D cursor use, and joystick fly-through. ABI can improve clinical outcomes, providing an enhanced view of the structures to work on. It should be further studied to determine its utility in clinical practice.
Khademi, Maryam; Hondori, Hossein Mousavi; Dodakian, Lucy; Cramer, Steve; Lopes, Cristina V
2013-01-01
Introducing computer games to the rehabilitation market led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefit to patients and caregivers, it has inherent limitations, some of which might be addressed by replacing it with Augmented Reality (AR). The task of pick-and-place, which is part of many activities of daily living (ADLs), is one of the major affected functions that stroke patients expect to recover. We developed an exercise consisting of moving an object between various points, following a flashing light that indicates the next target. The results show superior performance of subjects in a spatial AR setting versus a non-immersive VR setting. This could be due to the extraneous hand-eye coordination that exists in VR and is eliminated in spatial AR.
A see through future: augmented reality and health information systems.
Monkman, Helen; Kushniruk, Andre W
2015-01-01
Augmented Reality (AR) is a method whereby virtual objects are superimposed on the real world. AR technology is becoming increasingly accessible and affordable, and it has many potential health applications. This paper discusses current research on AR health applications such as medical education and medical practice. Some of the potential future uses for this technology (e.g., health information systems, consumer health applications) will also be presented. Additionally, there will be a discussion outlining some of the usability and human factors challenges associated with AR in healthcare. It is expected that AR will become increasingly prevalent in healthcare; however, further investigation is required to demonstrate that AR applications provide benefits over traditional methods. Moreover, AR applications must be thoroughly tested to ensure they do not introduce new errors into practice or create patient safety risks.
Creating a Vision Channel for Observing Deep-Seated Anatomy in Medical Augmented Reality
NASA Astrophysics Data System (ADS)
Wimmer, Felix; Bichlmeier, Christoph; Heining, Sandro M.; Navab, Nassir
The intent of medical Augmented Reality (AR) is to augment the surgeon's real view of the patient with the patient's interior anatomy, resulting from a suitable visualization of medical imaging data. This paper presents a fast, user-defined clipping technique for medical AR that allows cutting away any parts of the virtual anatomy, and of the images of the real part of the AR scene, that hinder the surgeon's view of the deep-seated region of interest. Modeled on cut-away techniques from scientific illustration and computer graphics, the method creates a fixed vision channel to the inside of the patient. It enables a clear view of the focused virtual anatomy and, moreover, improves the perception of spatial depth.
ERIC Educational Resources Information Center
Menorath, Darren; Antonczak, Laurent
2017-01-01
This paper examines the state of the art of mobile Augmented Reality (AR) and mobile Virtual Reality (VR) in relation to collaboration and professional practices in a creative digital environment and higher education. To support their discussion, the authors use a recent design-based research project named "Juxtapose," which explores…
Integrating Augmented Reality in Higher Education: A Multidisciplinary Study of Student Perceptions
ERIC Educational Resources Information Center
Delello, Julie A.; McWhorter, Rochell R.; Camp, Kerri M.
2015-01-01
Augmented reality (AR) is an emerging technology that blends physical objects with virtual reality. Through the integration of digital and print media, the gap between the "online" and "offline" worlds is bridged, radically shifting student-computer interaction in the classroom. This research examined the results of a multiple case study on the…
Holographic and light-field imaging for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Hong, Jong-Young; Jang, Changwon; Jeong, Jinsoo; Lee, Chang-Kun
2017-02-01
We discuss the recent state of augmented reality (AR) display technology. In order to realize AR, various see-through three-dimensional (3D) display techniques have been reported. We describe AR displays with 3D functionality, such as light-field displays and holography. See-through light-field displays can be categorized by the optical elements that provide the see-through property: elements controlling the path of the light fields and elements generating a see-through light field. Holographic displays are also good candidates for AR displays because they can reconstruct wavefront information and provide realistic virtual information. We introduce see-through holographic displays using various optical techniques.
Design Principles for Augmented Reality Learning
ERIC Educational Resources Information Center
Dunleavy, Matt
2014-01-01
Augmented reality is an emerging technology that utilizes mobile, context-aware devices (e.g., smartphones, tablets) that enable participants to interact with digital information embedded within the physical environment. This overview of design principles focuses on specific strategies that instructional designers can use to develop AR learning…
Flexible augmented reality architecture applied to environmental management
NASA Astrophysics Data System (ADS)
Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo
2003-05-01
Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real-time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent, modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacities of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and the multimedia geo-referenced database.
An augmented reality system validation for the treatment of cockroach phobia.
Bretón-López, Juani; Quero, Soledad; Botella, Cristina; García-Palacios, Azucena; Baños, Rosa Maria; Alcañiz, Mariano
2010-12-01
Augmented reality (AR) is a new technology in which various virtual elements are incorporated into the user's perception of the real world. The most significant aspect of AR is that the virtual elements add relevant and helpful information to the real scene. AR shares some important characteristics with virtual reality as applied in clinical psychology. However, AR offers additional features that might be crucial for treating certain problems. An AR system designed to treat insect phobia has been used for treating phobia of small animals, and positive preliminary data about the global efficacy of the system have been obtained. However, it is necessary to determine the capacity of similar AR systems and their elements that are designed to evoke anxiety in participants; this is achieved by testing the correspondence between the inclusion of feared stimuli and the induction of anxiety. The objective of the present work is to validate whether the stimuli included in the AR-Insect Phobia system are capable of inducing anxiety in six participants diagnosed with cockroach phobia. Results support the adequacy of each element of the system in inducing anxiety in all participants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
This software is an iOS (Apple) Augmented Reality (AR) application that runs on the iPhone and iPad. It is designed to scan in a photograph or graphic and "play" an associated video. This release, SNLSimMagic, was built using the Wikitude Augmented Reality (AR) software development kit (SDK) integrated into an Apple iOS SDK application and the Cordova libraries. These codes enable the generation of runtime targets using cloud recognition and developer-defined target features, which are then accessed by means of a custom application.
Enhancing a Multi-body Mechanism with Learning-Aided Cues in an Augmented Reality Environment
NASA Astrophysics Data System (ADS)
Singh Sidhu, Manjit
2013-06-01
Augmented Reality (AR) is a potential area of research for education, covering issues such as tracking and calibration and the realistic rendering of virtual objects. The ability to augment the real world with virtual information has opened the possibility of using AR technology in areas such as education and training as well. In the domain of Computer Aided Learning (CAL), researchers have long been looking into enhancing the effectiveness of the teaching and learning process by providing cues that help learners better comprehend the materials presented. Although a number of studies have examined the effectiveness of learning-aided cues, none has addressed this issue for AR-based learning solutions. This paper discusses the design and model of AR-based software that uses visual cues to enhance the learning process, together with the perception results obtained for those cues.
ERIC Educational Resources Information Center
Ong, Alex
2010-01-01
The use of augmented reality (AR) tools, where virtual objects such as tables and graphs can be displayed and be interacted with in real scenes created from imaging devices, in mainstream school curriculum is uncommon, as they are potentially costly and sometimes bulky. Thus, such learning tools are mainly applied in tertiary institutions, such as…
Augmenting the Thermal Flux Experiment: A Mixed Reality Approach with the HoloLens
ERIC Educational Resources Information Center
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-01-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress during the last years and also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted…
Using Augmented Reality to Teach and Learn Biochemistry
ERIC Educational Resources Information Center
Vega Garzón, Juan Carlos; Magrini, Marcio Luiz; Galembeck, Eduardo
2017-01-01
Understanding metabolism and metabolic pathways constitutes one of the central aims for students of biological sciences. Learning metabolic pathways should be focused on the understanding of general concepts and core principles. New technologies such Augmented Reality (AR) have shown potential to improve assimilation of biochemistry abstract…
Augmented Reality in Neurosurgery: A Review of Current Concepts and Emerging Applications.
Guha, Daipayan; Alotaibi, Naif M; Nguyen, Nhu; Gupta, Shaurya; McFaul, Christopher; Yang, Victor X D
2017-05-01
Augmented reality (AR) superimposes computer-generated virtual objects onto the user's view of the real world. Among medical disciplines, neurosurgery has long been at the forefront of image-guided surgery, and it continues to push the frontiers of AR technology in the operating room. In this systematic review, we explore the history of AR in neurosurgery and examine the literature on current neurosurgical applications of AR. Significant challenges to surgical AR exist, including compounded sources of registration error, impaired depth perception, visual and tactile temporal asynchrony, and operator inattentional blindness. Nevertheless, the ability to accurately display multiple three-dimensional datasets congruently over the area where they are most useful, coupled with future advances in imaging, registration, display technology, and robotic actuation, portend a promising role for AR in the neurosurgical operating room.
NASA Astrophysics Data System (ADS)
Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang
Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large-scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? What role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.
The virtual mirror: a new interaction paradigm for augmented reality environments.
Bichlmeier, Christoph; Heining, Sandro Michael; Feuerstein, Marco; Navab, Nassir
2009-09-01
Medical augmented reality (AR) has been widely discussed within the medical imaging as well as computer-aided surgery communities. Different systems for exemplary medical applications have been proposed; some of them produced promising results. One major issue still preventing AR technology from being regularly used in medical applications is the interaction between the physician and the superimposed 3-D virtual data. Classical interaction paradigms, for instance with keyboard and mouse, are not adequate for interacting with visualized medical 3-D imaging data in an AR environment. This paper introduces the concept of a tangible/controllable Virtual Mirror for medical AR applications. This concept intuitively augments the direct view of the surgeon with all desired views of volumetric medical imaging data registered with the operation site, without moving around the operating table or displacing the patient. We selected two medical procedures to demonstrate and evaluate the potential of the Virtual Mirror for the surgical workflow. Results confirm the intuitiveness of this new paradigm and its perceptive advantages for AR-based computer-aided interventions.
Use of display technologies for augmented reality enhancement
NASA Astrophysics Data System (ADS)
Harding, Kevin
2016-06-01
Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book or taped to a machine to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the work space to 3D displays and head-mounted gear. This paper reviews where some of these tools are today and some of the pros and cons being considered for the future worker environment.
Augmented Reality versus Virtual Reality for 3D Object Manipulation.
Krichenbauer, Max; Yamamoto, Goshiro; Taketom, Takafumi; Sandor, Christian; Kato, Hirokazu
2018-02-01
Virtual Reality (VR) Head-Mounted Displays (HMDs) are on the verge of becoming commodity hardware available to the average user and feasible to use as a tool for 3D work. Some HMDs include front-facing cameras, enabling Augmented Reality (AR) functionality. Apart from avoiding collisions with the environment, interaction with virtual objects may also be affected by seeing the real environment. However, whether these effects are positive or negative has not yet been studied extensively. For most tasks it is unknown whether AR has any advantage over VR. In this work we present the results of a user study in which we compared user performance measured in task completion time on a 9 degrees of freedom object selection and transformation task performed either in AR or VR, both with a 3D input device and a mouse. Our results show faster task completion time in AR over VR. When using a 3D input device, a purely VR environment increased task completion time by 22.5 percent on average compared to AR. Surprisingly, a similar effect occurred when using a mouse: users were about 17.3 percent slower in VR than in AR. Mouse and 3D input device produced similar task completion times in each condition (AR or VR) respectively. We further found no differences in reported comfort.
Augmented reality cube game for cognitive training: an interaction study.
Boletsis, Costas; Mccallum, Simon
2014-01-01
There is the potential that cognitive activity may delay cognitive decline in people with mild cognitive impairment. Games provide both cognitive challenge and motivation for repeated use, a prerequisite for long lasting effect. Recent advances in technology introduce several new interaction methods, potentially leading to more efficient, personalized cognitive gaming experiences. In this paper, we present an Augmented Reality (AR) cognitive training game, utilizing cubes as input tools, and we test the cube interaction with a pilot study. The results of the study revealed the marker occlusion problem, and that novice AR users can adjust to the developed AR environment after a small number of sessions.
Chang, Yao-Jen; Kang, Ya-Shu; Huang, Po-Chiao
2013-10-01
This study assessed the possibility of training three people with cognitive impairments using an augmented reality (AR)-based task prompting system. Using AR technology, the system provided picture cues, identified incorrect task steps on the fly, and helped users make corrections. Based on a multiple baseline design, the data showed that the three participants considerably increased their target response, which improved their vocational job skills during the intervention phases and enabled them to maintain the acquired job skills after intervention. The practical and developmental implications of the results are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
A telescope with augmented reality functions
NASA Astrophysics Data System (ADS)
Hou, Qichao; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian
2016-10-01
This study introduces a telescope with virtual reality (VR) and augmented reality (AR) functions. In this telescope, information on the micro-display screen is integrated into the reticule of the telescope through a beam splitter and is then received by the observer. The design and analysis of the telescope optical system with AR and VR capability are accomplished and the opto-mechanical structure is designed. Finally, a proof-of-concept prototype is fabricated and demonstrated. The telescope has an exit pupil diameter of 6 mm at an eye relief of 19 mm, a 6° field of view, 5 to 8 times visual magnification, and a 30° field of view for the virtual image.
Applications of Augmented Reality-Based Natural Interactive Learning in Magnetic Field Instruction
ERIC Educational Resources Information Center
Cai, Su; Chiang, Feng-Kuang; Sun, Yuchen; Lin, Chenglong; Lee, Joey J.
2017-01-01
Educators must address several challenges inherent to the instruction of scientific disciplines such as physics -- expensive or insufficient laboratory equipment, equipment error, difficulty in simulating certain experimental conditions. Augmented reality (AR) can be a promising approach to address these challenges. In this paper, we discuss the…
Integrating Augmented Reality Technology to Enhance Children's Learning in Marine Education
ERIC Educational Resources Information Center
Lu, Su-Ju; Liu, Ying-Chieh
2015-01-01
Marine education comprises rich and multifaceted issues. Raising general awareness of marine environments and issues demands the development of new learning materials. This study adapts concepts from digital game-based learning to design an innovative marine learning program integrating augmented reality (AR) technology for lower grade primary…
Learning Molecular Structures in a Tangible Augmented Reality Environment
ERIC Educational Resources Information Center
Asai, Kikuo; Takase, Norio
2011-01-01
This article presents the characteristics of using a tangible table top environment produced by augmented reality (AR), aimed at improving the environment in which learners observe three-dimensional molecular structures. The authors perform two evaluation experiments. A performance test for a user interface demonstrates that learners with a…
Augmented Reality and Mobile Art
NASA Astrophysics Data System (ADS)
Gwilt, Ian
The combined notions of augmented-reality (AR) and mobile art are based on the amalgamation of a number of enabling technologies including computer imaging, emergent display and tracking systems and the increased computing-power in hand-held devices such as Tablet PCs, smart phones, or personal digital assistants (PDAs) which have been utilized in the making of works of art. There is much published research on the technical aspects of AR and the ongoing work being undertaken in the development of faster more efficient AR systems [1] [2]. In this text I intend to concentrate on how AR and its associated typologies can be applied in the context of new media art practices, with particular reference to its application on hand-held or mobile devices.
Computer vision and augmented reality in gastrointestinal endoscopy
Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M.
2015-01-01
Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy—which relies on the integration of high-definition video data with pathologic correlates—requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175
Virtual and augmented reality in the treatment of phantom limb pain: A literature review.
Dunn, Justin; Yeo, Elizabeth; Moghaddampour, Parisah; Chau, Brian; Humbert, Sarah
2017-01-01
Phantom limb pain (PLP), the perception of discomfort in a limb no longer present, commonly occurs following amputation. A variety of interventions have been employed for PLP, including mirror therapy. Virtual Reality (VR) and augmented reality (AR) mirror therapy treatments have also been utilized and have the potential to provide an even greater immersive experience for the amputee. However, there is not currently a consensus on the efficacy of VR and AR therapy. The aim of this review is to evaluate and summarize the current research on the effect of immersive VR and AR in the treatment of PLP. A comprehensive literature search was conducted utilizing PubMed and Google Scholar in order to collect all available studies concerning the use of VR and/or AR in the treatment of PLP using the search terms "virtual reality," "augmented reality," and "phantom limb pain." Eight studies in total were evaluated, with six of those reporting quantitative data and the other two reporting qualitative findings. All studies located were of low-level evidence. Each noted improved pain with VR and AR treatment for phantom limb pain, through quantitative or qualitative reporting. Additionally, adverse effects were limited only to simulator sickness occurring in one trial for one patient. Despite the positive findings, all of the studies were confined purely to case studies and case report series. No studies of higher evidence have been conducted, thus considerably limiting the strength of the findings. As such, the current use of VR and AR for PLP management, while attractive due to the increasing levels of immersion, customizable environments, and decreasing cost, is yet to be fully proven and continues to need further research with higher quality studies to fully explore its benefits.
ERIC Educational Resources Information Center
Huisinga, Laura Anne
2017-01-01
Technology has shown promise to aid struggling readers in higher education, particularly through new and emerging technologies. Augmented reality (AR) has been used successfully in the classroom to motivate and engage struggling learners, yet little research exists on how augmented print might help struggling readers. This study explores this gap,…
Magic cards: a new augmented-reality approach.
Demuynck, Olivier; Menendez, José Manuel
2013-01-01
Augmented reality (AR) commonly uses markers for detection and tracking. Such multimedia applications associate each marker with a virtual 3D model stored in the memory of the camera-equipped device running the application. Application users are limited in their interactions, which require knowing how to design and program 3D objects. This generally prevents them from developing their own entertainment AR applications. The Magic Cards application solves this problem by offering an easy way to create and manage an unlimited number of virtual objects that are encoded on special markers.
NASA Astrophysics Data System (ADS)
Wüest, Robert; Nebiker, Stephan
2018-05-01
In this paper we present an app framework for augmenting large-scale walkable maps and orthoimages in museums or public spaces using standard smartphones and tablets. We first introduce a novel approach for using huge orthoimage mosaic floor prints covering several hundred square meters as natural Augmented Reality (AR) markers. We then present a new app architecture and subsequent tests in the Swissarena of the Swiss National Transport Museum in Lucerne demonstrating the capabilities of accurately tracking and augmenting different map topics, including dynamic 3D data such as live air traffic. The resulting prototype was tested with everyday visitors of the museum to get feedback on the usability of the AR app and to identify pitfalls when using AR in the context of a potentially crowded museum. The prototype is to be rolled out to the public after successful testing and optimization of the app. We were able to show that AR apps on standard smartphone devices can dramatically enhance the interactive use of large-scale maps for different purposes such as education or serious gaming in a museum context.
Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges
NASA Astrophysics Data System (ADS)
Cherukuru, N. W.; Calhoun, R.
2016-06-01
Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find possible applications in aviation safety, atmospheric data visualization, as well as weather education and public outreach. As a proof of concept, we used the lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.
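One step implied by the abstract above is converting a lidar scan's polar geometry (range gate, azimuth, elevation) into 3D offsets that an AR view can anchor at the instrument's position. A minimal sketch follows, with assumed field names, units and scan dimensions; it is not the authors' application code.

```python
# Assumed sketch: map a Doppler lidar PPI scan to east/north/up points for an AR overlay.
import numpy as np

def scan_to_points(ranges_m, azimuths_deg, elevation_deg, radial_velocity):
    """Return (N, 3) east/north/up offsets from the lidar and the matching radial velocities."""
    r, az = np.meshgrid(ranges_m, np.radians(azimuths_deg), indexing="ij")
    el = np.radians(elevation_deg)
    east, north, up = r * np.cos(el) * np.sin(az), r * np.cos(el) * np.cos(az), r * np.sin(el)
    return np.stack([east, north, up], axis=-1).reshape(-1, 3), radial_velocity.ravel()

ranges = np.arange(100.0, 2000.0, 100.0)    # 19 range gates (assumed)
azimuths = np.arange(0.0, 360.0, 10.0)      # 36 beam directions (assumed)
vel = np.random.default_rng(3).uniform(-15.0, 15.0, (ranges.size, azimuths.size))
points, velocities = scan_to_points(ranges, azimuths, 3.0, vel)
print(points.shape, velocities.shape)       # (684, 3) (684,)
```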
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). Umbra modeling and simulation environment is used to integrate and control the AR system.
Augmented Reality Games: Using Technology on a Budget
ERIC Educational Resources Information Center
Annetta, Leonard; Burton, Erin Peters; Frazier, Wendy; Cheng, Rebecca; Chmiel, Margaret
2012-01-01
As smartphones become more ubiquitous among adolescents, there is increasing potential for these as a tool to engage students in science instruction through innovative learning environments such as augmented reality (AR). Aligned with the National Science Education Standards (NRC 1996) and integrating the three dimensions of "A Framework for K-12…
ERIC Educational Resources Information Center
Lin, Hao-Chiang Koong; Chen, Mei-Chi; Chang, Chih-Kai
2015-01-01
This study integrates augmented reality (AR) technology into teaching activities to design a learning system that assists junior high-school students in learning solid geometry. The following issues are addressed: (1) the relationship between achievements in mathematics and performance in spatial perception; (2) whether system-assisted learning…
What Teachers Need to Know about Augmented Reality Enhanced Learning Environments
ERIC Educational Resources Information Center
Wasko, Christopher
2013-01-01
Augmented reality (AR) enhanced learning environments have been designed to teach a variety of subjects by having learners act like professionals in the field as opposed to students in a classroom. The environments, grounded in constructivist and situated learning theories, place students in a meaningful, non-classroom environment and force them…
Teaching binocular indirect ophthalmoscopy to novice residents using an augmented reality simulator.
Rai, Amandeep S; Rai, Amrit S; Mavrikakis, Emmanouil; Lam, Wai Ching
2017-10-01
To compare the traditional teaching approach for binocular indirect ophthalmoscopy (BIO) with the EyeSI augmented reality (AR) BIO simulator. Prospective randomized controlled trial. 28 post-graduate year one (PGY1) ophthalmology residents. Residents were recruited at the 2012 Toronto Ophthalmology Residents Introductory Course (TORIC). 15 were randomized to conventional teaching (Group 1), and 13 to augmented reality simulator training (Group 2). 3 vitreoretinal fellows were enrolled to serve as experts. Evaluations were completed on the simulator, with 3 tasks, and outcome measures were total raw score, total time elapsed, and performance. Following conventional training, Group 1 residents were outperformed by the vitreoretinal fellows with respect to all 3 outcome measures. Following AR training, Group 2 residents demonstrated superior total scores and performance compared to Group 1 residents. Once the Group 1 residents also completed the AR BIO training, they showed a significant improvement compared to their baseline scores and were then on par with Group 2 residents. This study provides construct validity for the EyeSI AR BIO simulator and demonstrates that it may be superior to conventional BIO teaching for novice ophthalmology residents. Copyright © 2017 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
Mobile devices, Virtual Reality, Augmented Reality, and Digital Geoscience Education.
NASA Astrophysics Data System (ADS)
Crompton, H.; De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.
2016-12-01
Mobile devices are playing an increasing role in geoscience education. Affordances include instructor-student communication and class management in large classrooms, virtual and augmented reality applications, digital mapping, and crowd-sourcing. Mobile technologies have spawned the subfield of mobile learning, or m-learning, which is defined as learning across multiple contexts through social and content interactions. Geoscientists have traditionally engaged in non-digital mobile learning via fieldwork, but digital devices are greatly extending the possibilities, especially for non-traditional students. Smartphones and tablets are the most common devices, but smart glasses such as Pivothead enable live streaming of a first-person view (see for example, https://youtu.be/gWrDaYP5w58). Virtual reality headsets such as Google Cardboard create an immersive virtual field experience, and digital imagery such as GigaPan and Structure from Motion enables instructors and/or students to create virtual specimens and outcrops that are sharable across the globe. Whereas virtual reality (VR) replaces the real world with a virtual representation, augmented reality (AR) overlays digital data on the live scene visible to the user in real time. We have previously reported on our use of the AR application called FreshAiR for geoscientific "egg hunts." The popularity of Pokémon Go demonstrates the potential of AR for mobile learning in the geosciences.
The NASA Augmented/Virtual Reality Lab: The State of the Art at KSC
NASA Technical Reports Server (NTRS)
Little, William
2017-01-01
The NASA Augmented Virtual Reality (AVR) Lab at Kennedy Space Center is dedicated to the investigation of Augmented Reality (AR) and Virtual Reality (VR) technologies, with the goal of determining potential uses of these technologies as human-computer interaction (HCI) devices in an aerospace engineering context. Begun in 2012, the AVR Lab has concentrated on commercially available AR and VR devices that are gaining in popularity and use in a number of fields such as gaming, training, and telepresence. We are working with such devices as the Microsoft Kinect, the Oculus Rift, the Leap Motion, the HTC Vive, motion capture systems, and the Microsoft Hololens. The focus of our work has been on human interaction with the virtual environment, which in turn acts as a communications bridge to remote physical devices and environments which the operator cannot or should not control or experience directly. Particularly in reference to dealing with spacecraft and the oftentimes hazardous environments they inhabit, it is our hope that AR and VR technologies can be utilized to increase human safety and mission success by physically removing humans from those hazardous environments while virtually putting them right in the middle of those environments.
Perform light and optic experiments in Augmented Reality
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan; Javahiraly, Nicolas; Israel, Kai
2015-10-01
In many scientific studies lens experiments are part of the curriculum. The conducted experiments are meant to give the students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures and different lens types. Therefore it is not possible for the students to conduct any of the experiments outside of the university's laboratory. Simple optical software simulators enabling the students to virtually perform lens experiments already exist, but are mostly desktop or web browser based. Augmented Reality (AR) is a special case of mediated and mixed reality concepts, where computers are used to add, subtract or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices, such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can be easily used to visualize a simulated optical bench. The students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index and the positions of the instruments in space. Light rays can be visualized and promote an additional understanding of the laws of optics. An AR application like this is ideally suited to prepare the actual laboratory sessions and/or recap the teaching content. The authors will present their experience with handheld augmented reality applications and their possibilities for light and optic experiments without the need for specialized optical hardware.
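The core physics such a virtual optical bench has to evaluate is the thin-lens equation, 1/f = 1/d_o + 1/d_i, with lateral magnification m = -d_i/d_o. The short sketch below shows that calculation; it is a generic illustration of the optics, not the authors' application code.

```python
# Thin-lens image calculation (generic optics illustration, not the authors' software).
def thin_lens_image(focal_length_mm: float, object_distance_mm: float):
    """Return (image distance, magnification); a negative image distance means a virtual image."""
    if object_distance_mm == focal_length_mm:
        raise ValueError("Object at the focal plane: the image forms at infinity.")
    d_i = 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)
    return d_i, -d_i / object_distance_mm

# A converging lens with f = 50 mm and the object 75 mm away gives a real,
# inverted image 150 mm behind the lens, magnified 2x.
print(thin_lens_image(50.0, 75.0))  # approximately (150.0, -2.0)
```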
Using augmented reality to inform consumer choice and lower carbon footprints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isley, Steven C.; Ketcham, Robert; Arent, Douglas J.
Consumers who wish to consider product attributes like carbon footprints in their purchasing decisions are often blocked from meaningful action by a lack of information. We conducted a single randomized controlled trial at a grocery store to evaluate the effects of providing such product attribute and carbon footprint information via augmented reality (AR) displays on bottled water and breakfast cereal, two frequently purchased goods. Using an AR smartphone app that incorporates comparative and detailed product information into personalized data and recommendations, a 23% statistically significant reduction in carbon footprint was found for bottled water, and non-significant reductions for breakfast cereal. Furthermore, AR-informed choice led to healthier cereal choices.
Augmented reality application for industrial non-destructive inspection training
NASA Astrophysics Data System (ADS)
Amza, Catalin Gheorghe; Zapciu, Aurelian; Teodorescu, Octav
2018-02-01
Augmented Reality (AR) has great potential for use, especially for training new operators on expensive equipment. In this context, the paper presents an augmented reality training system developed for phased-array ultrasonic non-destructive testing (NDT) equipment. The application has been developed using the Unity 5.6.0 game-engine platform integrated with the Vuforia SDK for devices running the Android operating system. Tests performed by several NDT operators showed good results, demonstrating the potential of using the application in the industrial field.
Augmenting Reality and Formality of Informal and Non-Formal Settings to Enhance Blended Learning
ERIC Educational Resources Information Center
Pérez-Sanagustin, Mar; Hernández-Leo, Davinia; Santos, Patricia; Kloos, Carlos Delgado; Blat, Josep
2014-01-01
Visits to museums and city tours have been part of higher and secondary education curriculum activities for many years. However these activities are typically considered "less formal" when compared to those carried out in the classroom, mainly because they take place in informal or non-formal settings. Augmented Reality (AR) technologies…
ERIC Educational Resources Information Center
Efstathiou, Irene; Kyza, Eleni A.; Georgiou, Yiannis
2018-01-01
This study investigated the contribution of a location-based augmented reality (AR) inquiry-learning environment in developing 3rd grade students' historical empathy and conceptual understanding. Historical empathy is an important element of historical thinking, which is considered to improve conceptual understanding and support the development of…
ERIC Educational Resources Information Center
Harley, Jason M.; Poitras, Eric G.; Jarrell, Amanda; Duffy, Melissa C.; Lajoie, Susanne P.
2016-01-01
Research on the effectiveness of augmented reality (AR) on learning exists, but there is a paucity of empirical work that explores the role that positive emotions play in supporting learning in such settings. To address this gap, this study compared undergraduate students' emotions and learning outcomes during a guided historical tour using mobile…
A Mobile Augmented Reality System for the Learning of Dental Morphology
ERIC Educational Resources Information Center
Juan, M.-Carmen; Alexandrescu, Lucian; Folguera, Fernando; García-García, Inmaculada
2016-01-01
Three-dimensional models are important when the learning content is difficult to acquire from 2D images or other traditional methods. This is the case for learning dental morphology. In this paper, we present a mobile augmented reality (AR) system for learning dental morphology. A study with students was carried out to determine whether learning…
ERIC Educational Resources Information Center
Oh, Seungjae; So, Hyo-Jeong; Gaydos, Matthew
2018-01-01
The goal for this research is to articulate and test a new hybrid Augmented Reality (AR) environment for conceptual understanding. From the theoretical lens of embodied interaction, we have designed a multi-user participatory simulation called ARfract where visitors in a science museum can learn about complex scientific concepts on the refraction…
Applying Augmented Reality to Enhance Learning: A Study of Different Teaching Materials
ERIC Educational Resources Information Center
Hung, Y.-H.; Chen, C.-H.; Huang, S.-W.
2017-01-01
The objective of this study was to determine the usefulness of augmented reality (AR) in teaching. An experiment was conducted to examine children's learning performances, which included the number of errors they made, their ability to remember the content of what they had read and their satisfaction with the three types of teaching materials,…
Making the Invisible Observable by Augmented Reality in Informal Science Education Context
ERIC Educational Resources Information Center
Salmi, Hannu; Thuneberg, Helena; Vainikainen, Mari-Pauliina
2017-01-01
The aim of the study was to analyse learning using Augmented Reality (AR) technology and the motivational and cognitive aspects related to it in an informal learning context. The 146 participants were 11- to 13-year-old Finnish pupils visiting a science centre exhibition. The data, which consisted of both cognitive tasks and self-report…
ERIC Educational Resources Information Center
Yang, Mau-Tsuen; Liao, Wan-Che
2014-01-01
The physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersions. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…
Real-time 3D image reconstruction guidance in liver resection surgery.
Soler, Luc; Nicolau, Stephane; Pessaux, Patrick; Mutter, Didier; Marescaux, Jacques
2014-04-01
Minimally invasive surgery represents one of the main evolutions of surgical techniques. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. From a patient's medical image [US, computed tomography (CT) or MRI], we have developed an Augmented Reality (AR) system that increases the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We have then developed several 3D visualization and surgical planning software tools to combine direct volume rendering and surface rendering. Finally, we have developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were carried out, demonstrating the accuracy of 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures have been performed, illustrating the potential clinical benefit of such assistance for safety, but also the current limits that automatic augmented reality will have to overcome. Virtual patient modeling should be mandatory for certain interventions that now have to be defined, such as liver surgery. Augmented reality is clearly the next step in new surgical instrumentation but currently remains limited due to the complexity of organ deformations during surgery. Intraoperative medical imaging used in the new generation of automated augmented reality should solve this issue thanks to the development of the hybrid OR.
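The registration step mentioned above, aligning the 3D patient model with the intraoperative scene, can be illustrated in its simplest rigid, landmark-based form (the Kabsch / orthogonal Procrustes solution). This is a generic sketch under that simplifying assumption; the authors' interactive and automatic methods are not detailed in the abstract, and real organs deform non-rigidly.

```python
# Generic rigid landmark registration (Kabsch / orthogonal Procrustes) as an illustration
# of aligning model landmarks to patient-space measurements; not the authors' method.
import numpy as np

def rigid_register(model_pts, patient_pts):
    """Return rotation R and translation t such that model @ R.T + t best matches patient."""
    cm, cp = model_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (model_pts - cm).T @ (patient_pts - cp)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    return R, cp - R @ cm

# Synthetic check: recover a known rotation and translation from six landmarks.
rng = np.random.default_rng(4)
model = rng.random((6, 3))
true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(true_R) < 0:          # make it a proper rotation
    true_R[:, 0] *= -1
patient = model @ true_R.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_register(model, patient)
print(np.allclose(model @ R.T + t, patient))  # True
```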
Integration of Mobile AR Technology in Performance Assessment
ERIC Educational Resources Information Center
Kuo-Hung, Chao; Kuo-En, Chang; Chung-Hsien, Lan; Kinshuk; Yao-Ting, Sung
2016-01-01
This study was aimed at exploring how to use augmented reality (AR) technology to enhance the effect of performance assessment (PA). A mobile AR performance assessment system (MARPAS) was developed by integrating AR technology to reduce the limitations in observation and assessment during PA. This system includes three modules: Authentication, AR…
Sensor fusion and augmented reality with the SAFIRE system
NASA Astrophysics Data System (ADS)
Saponaro, Philip; Treible, Wayne; Phelan, Brian; Sherbondy, Kelly; Kambhamettu, Chandra
2018-04-01
The Spectrally Agile Frequency-Incrementing Reconfigurable (SAFIRE) mobile radar system was developed and exercised at an arid U.S. test site. The system can detect hidden targets using radar, a global positioning system (GPS), dual stereo color cameras, and dual stereo thermal cameras. An Augmented Reality (AR) software interface allows the user to see a single fused video stream containing the SAR, color, and thermal imagery. The stereo sensors allow the AR system to display both fused 2D imagery and 3D metric reconstructions, where the user can "fly" around the 3D model and switch between the modalities.
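As a rough illustration of presenting co-registered modalities as a single fused video stream, the sketch below alpha-blends a color frame with a thermal frame assumed to be already warped into the color camera's view. The blending weight and array shapes are arbitrary assumptions, not SAFIRE's implementation.

```python
# Simple alpha-blend fusion of co-registered color and thermal frames (illustrative assumption).
import numpy as np

def fuse(color_rgb, thermal_gray, alpha=0.6):
    """Blend an RGB frame in [0, 1] with a single-channel thermal frame in [0, 1]."""
    thermal_rgb = np.stack([thermal_gray] * 3, axis=-1)   # replicate thermal into three channels
    return alpha * color_rgb + (1.0 - alpha) * thermal_rgb

rng = np.random.default_rng(6)
fused = fuse(rng.random((120, 160, 3)), rng.random((120, 160)))
print(fused.shape, float(fused.min()) >= 0.0, float(fused.max()) <= 1.0)  # (120, 160, 3) True True
```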
Using augmented reality to inform consumer choice and lower carbon footprints
NASA Astrophysics Data System (ADS)
Isley, Steven C.; Ketcham, Robert; Arent, Douglas J.
2017-05-01
Consumers who wish to consider product attributes like carbon footprints in their purchasing decisions are often blocked from meaningful action by a lack of information. We conducted a randomized controlled trial at a grocery store to evaluate the effects of providing such product attribute and carbon footprint information via augmented reality (AR) displays on bottled water and breakfast cereal, two frequently purchased goods. Using an AR smartphone app that combines comparative and detailed product information into personalized data and recommendations, a 23% reduction in carbon footprint was found for bottled water, and non-significant reductions for breakfast cereal. However, AR-informed choice led to healthier cereal purchases with an average of 32% less sugar, 15% less fat, and 9.8% less sodium. This research suggests that AR techniques can help facilitate complex decision-making and lead to better choices.
Qu, Miao; Hou, Yikang; Xu, Yourong; Shen, Congcong; Zhu, Ming; Xie, Le; Wang, Hao; Zhang, Yan; Chai, Gang
2015-01-01
Through three-dimensional real time imaging, augmented reality (AR) can provide an overlay of the anatomical structure, or visual cues for specific landmarks. In this study, an AR Toolkit was used for distraction osteogenesis with hemifacial microsomia to define the mandibular osteotomy line and assist with intraoral distractor placement. 20 patients with hemifacial microsomia were studied and were randomly assigned to experimental and control groups. Pre-operative computed tomography was used in both groups, whereas AR was used in the experimental group. Afterwards, pre- and post-operative computed tomographic scans of both groups were superimposed, and several measurements were made and analysed. Both the conventional method and AR technique achieved proper positioning of the osteotomy planes, although the AR was more accurate. The difference in average vertical distance from the coronoid and condyle process to the pre- and post-operative cutting planes was significant (p < 0.01) between the two groups, whereas no significant difference (p > 0.05) was observed in the average angle between the two planes. The difference in deviations between the intersection points of the overlaid mandible across two cutting planes was also significant (p < 0.01). This study reports on an efficient approach for guiding intraoperative distraction osteogenesis. Augmented reality tools such as the AR Toolkit may be helpful for precise positioning of intraoral distractors in patients with hemifacial microsomia in craniofacial surgery. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Luck, Joshua; Billingsley, Michael L.; Heyes, Richard; Smith, Oliver J.; Mosahebi, Afshin; Khoussa, Abu; Abu-Sittah, Ghassan; Hachach-Haram, Nadine
2018-01-01
Summary: Augmented reality (AR) is defined as “a technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view.”1 This case report describes how emerging AR telesurgery technologies may be used to facilitate international surgeon–surgeon collaboration and training. Here, we illustrate how a remote surgeon in Beirut, Lebanon, was able to offer assistance to a surgeon in Gaza, Palestine, during a complex hand reconstruction case following a bomb-blast injury in an 18-year-old male. We discuss the implications of AR technology on the future of global surgery and how it may be used to reduce structural inequities in access to safe surgical care. PMID:29707463
Augmented reality glass-free three-dimensional display with the stereo camera
NASA Astrophysics Data System (ADS)
Pang, Bo; Sang, Xinzhu; Chen, Duo; Xing, Shujun; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu
2017-10-01
An improved method for Augmented Reality (AR) glass-free three-dimensional (3D) display is proposed, based on a stereo camera that presents parallax content from different angles through a lenticular lens array. Compared with previous implementations of AR techniques based on a two-dimensional (2D) panel display with only one viewpoint, the proposed method can realize glass-free 3D display of virtual objects and the real scene with 32 virtual viewpoints. Accordingly, viewers obtain rich 3D stereo information from different viewing angles based on binocular parallax. Experimental results show that this improved stereo-camera-based method can realize AR glass-free 3D display, and both the virtual objects and the real scene exhibit realistic and distinct stereo performance.
Directing driver attention with augmented reality cues
Rusch, Michelle L.; Schall, Mark C.; Gavin, Patrick; Lee, John D.; Dawson, Jeffrey D.; Vecera, Shaun; Rizzo, Matthew
2013-01-01
This simulator study evaluated the effects of augmented reality (AR) cues designed to direct the attention of experienced drivers to roadside hazards. Twenty-seven healthy middle-aged licensed drivers with a range of attention capacity participated in a 54 mile (1.5 hour) drive in an interactive fixed-base driving simulator. Each participant received AR cues to potential roadside hazards in six simulated straight (9 mile long) rural roadway segments. Drivers were evaluated on response time for detecting a potentially hazardous event, detection accuracy for target (hazard) and non-target objects, and headway with respect to the hazards. Results showed no negative outcomes associated with interference. AR cues did not impair perception of non-target objects, including for drivers with lower attentional capacity. Results showed near significant response time benefits for AR cued hazards. AR cueing increased response rate for detecting pedestrians and warning signs but not vehicles. AR system false alarms and misses did not impair driver responses to potential hazards. PMID:24436635
Augmented Reality M-Learning to Enhance Nursing Skills Acquisition in the Clinical Skills Laboratory
ERIC Educational Resources Information Center
Garrett, Bernard M.; Jackson, Cathryn; Wilson, Brian
2015-01-01
Purpose: This paper aims to report on a pilot research project designed to explore if new mobile augmented reality (AR) technologies have the potential to enhance the learning of clinical skills in the lab. Design/methodology/approach: An exploratory action-research-based pilot study was undertaken to explore an initial proof-of-concept design in…
Using Augmented Reality in Early Art Education: A Case Study in Hong Kong Kindergarten
ERIC Educational Resources Information Center
Huang, Yujia; Li, Hui; Fong, Ricci
2016-01-01
Innovation in pedagogy by technology integration in kindergarten classroom has always been a challenge for most teachers. This design-based research aimed to explore the feasibility of using Augmented Reality (AR) technology in early art education with a focus on the gains and pains of this innovation. A case study was conducted in a typical…
ERIC Educational Resources Information Center
Conley, Quincy
2013-01-01
Statistics is taught at every level of education, yet teachers often have to assume their students have no knowledge of statistics and start from scratch each time they set out to teach statistics. The motivation for this experimental study comes from interest in exploring educational applications of augmented reality (AR) delivered via mobile…
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Hsu, Ying-Shao; Wu, Hsin-Kai
2016-01-01
We investigated the impact of an augmented reality (AR) versus interactive simulation (IS) activity incorporated in a computer learning environment to facilitate students' learning of a socio-scientific issue (SSI) on nuclear power plants and radiation pollution. We employed a quasi-experimental research design. Two classes (a total of 45…
ERIC Educational Resources Information Center
Woods, Terri L.; Reed, Sarah; Hsi, Sherry; Woods, John A.; Woods, Michael R.
2016-01-01
Spatial thinking is often challenging for introductory geology students. A pilot study using the Augmented Reality sandbox (AR sandbox) suggests it can be a powerful tool for bridging the gap between two-dimensional (2D) representations and real landscapes, as well as enhancing the spatial thinking and modeling abilities of students. The AR…
ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy
ERIC Educational Resources Information Center
Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.
2015-01-01
The evolution of technologies and the development of new tools for educational purposes continue to grow. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on TC and MRN images, dissections and drawings. For ARBOOK evaluation, a…
ERIC Educational Resources Information Center
Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung
2014-01-01
The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty…
ERIC Educational Resources Information Center
Chen, Cheng-ping; Wang, Chang-Hwa
2015-01-01
Studies have proven that merging hands-on and online learning can result in an enhanced experience in learning science. In contrast to traditional online learning, multiple in-classroom activities may be involved in an augmented-reality (AR)-embedded e-learning process and thus could reduce the effects of individual differences. Using a…
Augmented reality for anatomical education.
Thomas, Rhys Gethin; John, Nigel William; Delieu, John Michael
2010-03-01
The use of Virtual Environments has been widely reported as a method of teaching anatomy. Generally such environments only convey the shape of the anatomy to the student. We present the Bangor Augmented Reality Education Tool for Anatomy (BARETA), a system that combines Augmented Reality (AR) technology with models produced using Rapid Prototyping (RP) technology, to provide the student with stimulation for touch as well as sight. The principal aims of this work were to provide an interface more intuitive than a mouse and keyboard, and to evaluate such a system as a viable supplement to traditional cadaver based education.
Tang, Rui; Ma, Long-Fei; Rong, Zhi-Xia; Li, Mo-Dan; Zeng, Jian-Ping; Wang, Xue-Dong; Liao, Hong-En; Dong, Jia-Hong
2018-04-01
Augmented reality (AR) technology is used to reconstruct three-dimensional (3D) images of hepatic and biliary structures from computed tomography and magnetic resonance imaging data, and to superimpose the virtual images onto a view of the surgical field. In liver surgery, these superimposed virtual images help the surgeon to visualize intrahepatic structures and, therefore, to operate precisely and to improve clinical outcomes. The keywords "augmented reality", "liver", "laparoscopic" and "hepatectomy" were used for searching publications in the PubMed database. The primary source of literature was peer-reviewed journals up to December 2016. Additional articles were identified by manual search of references found in the key articles. In general, AR technology mainly includes 3D reconstruction, display, registration and tracking techniques, and has gradually been adopted for liver surgeries, including laparoscopy and laparotomy, with video-based AR-assisted laparoscopic resection as the main technical application. By applying AR technology, blood vessels and tumor structures in the liver can be displayed during surgery, which permits precise navigation during complex surgical procedures. Liver deformation and registration errors during surgery were the main factors limiting the application of AR technology. With recent advances, AR technologies have the potential to improve hepatobiliary surgical procedures. However, additional clinical studies will be required to evaluate AR as a tool for reducing postoperative morbidity and mortality and for the improvement of long-term clinical outcomes. Future research is needed in the fusion of multiple imaging modalities, improving biomechanical liver modeling, and enhancing image data processing and tracking technologies to increase the accuracy of current AR methods. Copyright © 2018 First Affiliated Hospital, Zhejiang University School of Medicine in China. Published by Elsevier B.V. All rights reserved.
Pose tracking for augmented reality applications in outdoor archaeological sites
NASA Astrophysics Data System (ADS)
Younes, Georges; Asmar, Daniel; Elhajj, Imad; Al-Harithy, Howayda
2017-01-01
In recent years, agencies around the world have invested huge amounts of effort toward digitizing many aspects of the world's cultural heritage. Of particular importance is the digitization of outdoor archaeological sites. In the spirit of valorization of this digital information, many groups have developed virtual or augmented reality (AR) computer applications themed around a particular archaeological object. The problem of pose tracking in outdoor AR applications is addressed. Different positional systems are analyzed, resulting in the selection of a monocular camera-based user tracker. The limitations that challenge this technique from map generation, scale, anchoring, to lighting conditions are analyzed and systematically addressed. Finally, as a case study, our pose tracking system is implemented within an AR experience in the Byblos Roman theater in Lebanon.
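As a hedged sketch of the core step in such a monocular, map-based tracker (not the authors' implementation; the camera intrinsics and helper names below are assumptions), the camera pose can be recovered from 2D-3D correspondences against a previously built map with OpenCV's RANSAC PnP solver:

```python
import numpy as np
import cv2

# Hypothetical intrinsics (fx, fy, cx, cy) for a calibrated phone camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for the sketch

def estimate_pose(map_points_3d, image_points_2d):
    """Estimate the camera pose from matched 3D map points and 2D detections.

    map_points_3d: (N, 3) points from the pre-built map of the site.
    image_points_2d: (N, 2) corresponding feature locations in the current frame.
    Returns a rotation vector and translation vector (map frame -> camera frame).
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        map_points_3d.astype(np.float32),
        image_points_2d.astype(np.float32),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("pose could not be estimated for this frame")
    return rvec, tvec

if __name__ == "__main__":
    # Toy example: project known 3D points with a known pose, then recover it.
    pts3d = np.random.uniform(-1, 1, (30, 3)) + np.array([0.0, 0.0, 5.0])
    rvec_true = np.array([0.05, -0.02, 0.01])
    tvec_true = np.array([0.1, -0.1, 0.3])
    pts2d, _ = cv2.projectPoints(pts3d, rvec_true, tvec_true, K, dist)
    rvec, tvec = estimate_pose(pts3d, pts2d.reshape(-1, 2))
    print("recovered translation:", tvec.ravel())
```

The challenges the paper addresses (map generation, scale, anchoring, lighting) sit around this step rather than inside it; the sketch only shows the per-frame pose estimate.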
archAR: an archaeological augmented reality experience
NASA Astrophysics Data System (ADS)
Wiley, Bridgette; Schulze, Jürgen P.
2015-03-01
We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.
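archAR itself is built on Qualcomm's Vuforia API, whose calls are not reproduced here. The general pattern of detecting a planar target and projecting plotted artifact points onto the live image can instead be illustrated with OpenCV's classic ArUco interface (opencv-contrib-python, pre-4.7 style); the intrinsics, marker dictionary, and marker size are illustrative assumptions:

```python
import numpy as np
import cv2

# Illustrative intrinsics for the device camera and a 5 cm printed marker.
K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)
HALF = 0.05 / 2.0  # half the marker side length in metres
# 3D corners of the square marker expressed in its own coordinate frame.
MARKER_CORNERS_3D = np.array([[-HALF,  HALF, 0], [ HALF,  HALF, 0],
                              [ HALF, -HALF, 0], [-HALF, -HALF, 0]], np.float32)

# Classic aruco interface from opencv-contrib-python (pre-4.7 style).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def overlay_artifact_points(frame, artifact_points_marker_frame):
    """Detect the printed marker and draw artifact positions anchored to it.

    artifact_points_marker_frame: (N, 3) artifact coordinates expressed in the
    marker's coordinate system (the role the image-target map plays in archAR).
    """
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    if ids is None:
        return frame  # marker not visible in this frame
    # Pose of the marker relative to the camera from its four detected corners.
    ok, rvec, tvec = cv2.solvePnP(MARKER_CORNERS_3D, corners[0].reshape(4, 2), K, dist)
    if not ok:
        return frame
    # Project the artifact coordinates (given in the marker frame) into the image.
    pts2d, _ = cv2.projectPoints(artifact_points_marker_frame.astype(np.float32),
                                 rvec, tvec, K, dist)
    for (x, y) in pts2d.reshape(-1, 2):
        cv2.circle(frame, (int(round(x)), int(round(y))), 6, (0, 0, 255), -1)
    return frame
```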
Measuring the Perceived Quality of an AR-Based Learning Application: A Multidimensional Model
ERIC Educational Resources Information Center
Pribeanu, Costin; Balog, Alexandru; Iordache, Dragos Daniel
2017-01-01
Augmented reality (AR) technologies could enhance learning in several ways. The quality of an AR-based educational platform is a combination of key features that manifests in usability, usefulness, and enjoyment for the learner. In this paper, we present a multidimensional model to measure the quality of an AR-based application as perceived by…
NASA Astrophysics Data System (ADS)
Zhao, Shouwei; Zhang, Yong; Zhou, Bin; Ma, Dongxi
2014-09-01
Interaction is one of the key techniques of an augmented reality (AR) maintenance guiding system. Because of the complexity of the maintenance guiding system's image background and the high dimensionality of gesture characteristics, the whole process of gesture recognition can be divided into three stages: gesture segmentation, gesture feature modeling, and gesture recognition. In the segmentation stage, to resolve the misrecognition of skin-like regions, a segmentation algorithm combining a background model and skin color is adopted to exclude such regions. In the feature modeling stage, a variety of characteristic features are analyzed and acquired, such as structural characteristics, Hu invariant moments, and Fourier descriptors. In the recognition stage, a classifier based on the Support Vector Machine (SVM) is introduced into the augmented reality maintenance guiding process. The SVM is a learning method based on statistical learning theory with a solid theoretical foundation and excellent learning ability; it is widely used in machine learning and offers particular advantages in dealing with small samples and non-linear, high-dimensional pattern recognition. Gesture recognition for the augmented reality maintenance guiding system is realized by the SVM after all the characteristic features have been assembled. Experimental results on simulated number-gesture recognition and its application in the augmented reality maintenance guiding system show that the real-time performance and robustness of gesture recognition in the AR maintenance guiding system can be greatly enhanced by the improved SVM.
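As a hedged sketch of the pipeline described above (not the authors' code; the skin-colour thresholds, feature set, and SVM parameters are simplified assumptions), segmentation, Hu-moment and Fourier-descriptor features, and SVM classification can be combined in Python with OpenCV and scikit-learn roughly as follows:

```python
import numpy as np
import cv2
from sklearn.svm import SVC

def segment_hand(frame_bgr):
    """Very rough skin-colour segmentation in YCrCb space (illustrative thresholds)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    # Morphological opening suppresses small skin-like noise regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask

def gesture_features(mask, n_fourier=10):
    """Hu invariant moments plus a few Fourier descriptors of the largest contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(contour)).flatten()
    pts = contour.reshape(-1, 2).astype(np.float64)
    z = pts[:, 0] + 1j * pts[:, 1]                # contour as a complex signal
    spectrum = np.abs(np.fft.fft(z))
    fourier = spectrum[1:n_fourier + 1] / (spectrum[1] + 1e-9)  # scale-normalised
    return np.concatenate([hu, fourier])

def train_gesture_svm(feature_matrix, labels):
    """Train an RBF-kernel SVM on labelled gesture feature vectors."""
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")
    clf.fit(feature_matrix, labels)
    return clf
```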
The development of augmented video system on postcards
NASA Astrophysics Data System (ADS)
Chen, Chien-Hsu; Chou, Yin-Ju
2013-03-01
This study focuses on the development of an augmented video system for traditional picture postcards. The system lets users print an augmented reality marker on a sticker to attach to the picture postcard, and it also allows them to record real-time images and video to be augmented on that marker. Through these dynamic images, users can share travel moods, greetings, and travel experiences with their friends. Without changing the traditional picture postcard, we develop an augmented video system on it using augmented reality (AR) technology. It not only keeps the functions of the traditional picture postcard, but also enhances the user's experience of preserving memories and emotional expression through the digital media information augmented onto it.
Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments
NASA Astrophysics Data System (ADS)
Portalés, Cristina; Lerma, José Luis; Navarro, Santiago
2010-01-01
Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In very recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interactions and real-life navigations. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated in real (physical) urban worlds. The augmented environment that is presented herein requires for visualization a see-through video head mounted display (HMD), whereas user's movement navigation is achieved in the real world with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper will deal with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There are, however, some software and complex issues, which are discussed in the paper.
Huang, Cynthia Y; Thomas, Jonathan B; Alismail, Abdullah; Cohen, Avi; Almutairi, Waleed; Daher, Noha S; Terry, Michael H; Tan, Laren D
2018-01-01
The aim of this study was to investigate the feasibility of using augmented reality (AR) glasses in central line simulation by novice operators and compare its efficacy to standard central line simulation/teaching. This was a prospective randomized controlled study enrolling 32 novice operators. Subjects were randomized on a 1:1 basis to either simulation using the augmented virtual reality glasses or simulation using conventional instruction. The study was conducted in tertiary-care urban teaching hospital. A total of 32 adult novice central line operators with no visual or auditory impairments were enrolled. Medical doctors, respiratory therapists, and sleep technicians were recruited from the medical field. The mean time for AR placement in the AR group was 71±43 s, and the time to internal jugular (IJ) cannulation was 316±112 s. There was no significant difference in median (minimum, maximum) time (seconds) to IJ cannulation for those who were in the AR group and those who were not (339 [130, 550] vs 287 [35, 475], p =0.09), respectively. There was also no significant difference between the two groups in median total procedure time (524 [329, 792] vs 469 [198, 781], p =0.29), respectively. There was a significant difference in the adherence level between the two groups favoring the AR group ( p =0.003). AR simulation of central venous catheters in manikins is feasible and efficacious in novice operators as an educational tool. Future studies are recommended in this area as it is a promising area of medical education.
Zhu, Ming; Liu, Fei; Zhou, Chaozheng; Lin, Li; Zhang, Yan; Chai, Gang; Xie, Le; Qi, Fazhi; Li, Qingfeng
2018-04-11
Augmented reality (AR)-based navigation surgery has evolved into an advanced assistive technology. The aim of this study is to demonstrate the accuracy of AR navigation for intraoperative mandibular angle osteotomy by comparing the navigation with other interventional techniques. A retrospective study was conducted with 93 post-surgical patients with mandibular angle hypertrophy admitted to our plastic and reconstructive surgery department between September 2011 and June 2016. Thirty-one patients received osteotomy conducted using a navigation system based on augmented reality (AR group), 28 patients received osteotomy conducted using individualised templates (IT group) and the remaining 34 patients received osteotomy performed by free hand (free-hand group). The post-operative computed tomography (CT) images were reviewed and analysed by comparison with pre-surgical planning generated by three-dimensional (3D) software. The preparation time, cutting time, whole operating time and discrepancy in osteotomy lines were measured. The preparation time was much shorter for the free-hand group than for the AR group and the IT group (P < 0.01). However, no significant difference in the whole operating time was observed among the three groups (P > 0.05). In addition, the discrepancy in osteotomy lines was lower for the AR group and the IT group than for the free-hand group (P < 0.01). The AR-based navigation system offers higher accuracy, greater reliability and better user-friendliness for certain clinical procedures than the other techniques, giving it a promising clinical prospect. Copyright © 2018. Published by Elsevier Ltd.
Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load.
Küçük, Sevda; Kapakin, Samet; Göktaş, Yüksel
2016-10-01
Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the environment using mobile devices. The purpose of this study was to determine the effects of learning anatomy via mAR on medical students' academic achievement and cognitive load. A mixed-method design was applied in the study. The random sample consisted of 70 second-year undergraduate medical students: 34 in an experimental group and 36 in a control group. An academic achievement test and a cognitive load scale were used as data collection tools. A one-way MANOVA test was used for analysis. The experimental group, which used mAR applications, reported higher achievement and lower cognitive load. The use of mAR applications in anatomy education contributed to the formation of an effective and productive learning environment. Student cognitive load decreased as abstract information became concrete in printed books via multimedia materials in mAR applications. Additionally, students were able to access the materials in the MagicBook anytime and anywhere they wanted. The mobile learning approach helped students learn better while exerting less cognitive effort. Moreover, the sensory experience and real-time interaction with the environment may provide learning satisfaction and enable students to structure their knowledge to complete the learning tasks. Anat Sci Educ 9: 411-421. © 2016 American Association of Anatomists.
Rochlen, Lauryn R; Levine, Robert; Tait, Alan R
2017-02-01
The value of simulation in medical education and procedural skills training is well recognized. Despite this, many mannequin-based trainers are limited by the inability of the trainee to view the internal anatomical structures. This study evaluates the usability and feasibility of a first-person point-of-view-augmented reality (AR) trainer on needle insertion as a component of central venous catheter placement. Forty subjects, including medical students and anesthesiology residents and faculty, participated. Augmented reality glasses were provided through which the relevant internal anatomical landmarks were projected. After a practice period, participants were asked to place the needle in the mannequin without the benefit of the AR-projected internal anatomy. The ability of the trainees to correctly place the needle was documented. Participants also completed a short survey describing their perceptions of the AR technology. Participants reported that the AR technology was realistic (77.5%) and that the ability to view the internal anatomy was helpful (92.5%). Furthermore, 85% and 82.1%, respectively, believed that the AR technology promoted learning and should be incorporated into medical training. The ability to successfully place the needle was similar between experienced and nonexperienced participants; however, less experienced participants were more likely to inadvertently puncture the carotid artery. Results of this pilot study demonstrated the usability and feasibility of AR technology as a potentially important adjunct to simulated medical skills training. Further development and evaluation of this innovative technology under a variety of simulated medical training settings would be an important next step.
Kim, Hyungil; Gabbard, Joseph L; Anon, Alexandre Miranda; Misu, Teruhisa
2018-04-01
This article investigates the effects of visual warning presentation methods on human performance in augmented reality (AR) driving. An experimental user study was conducted in a parking lot where participants drove a test vehicle while braking for any cross traffic with assistance from AR visual warnings presented on a monoscopic and volumetric head-up display (HUD). Results showed that monoscopic displays can be as effective as volumetric displays for human performance in AR braking tasks. The experiment also demonstrated the benefits of conformal graphics, which are tightly integrated into the real world, such as their ability to guide drivers' attention and their positive consequences on driver behavior and performance. These findings suggest that conformal graphics presented via monoscopic HUDs can enhance driver performance by leveraging the effectiveness of monocular depth cues. The proposed approaches and methods can be used and further developed by future researchers and practitioners to better understand driver performance in AR as well as inform usability evaluation of future automotive AR applications.
ERIC Educational Resources Information Center
Giorgis, Scott; Mahlen, Nancy; Anne, Kirk
2017-01-01
The augmented reality (AR) sandbox bridges the gap between two-dimensional (2D) and three-dimensional (3D) visualization by projecting a digital topographic map onto a sandbox landscape. As the landscape is altered, the map dynamically adjusts, providing an opportunity to discover how to read topographic maps. We tested the hypothesis that the AR…
ERIC Educational Resources Information Center
Hsu, Wen-Chun; Shih, Ju-Ling
2016-01-01
In this study, learning the routine of Tantui, a branch of martial arts, was taken as the object of research. Fitts' stages of motor learning and augmented reality (AR) were applied to a 3D mobile-assisted learning system for martial arts, which was characterized by free viewing angles. With the new system, learners could rotate the viewing angle of…
ERIC Educational Resources Information Center
Chang, Rong-Chi; Chung, Liang-Yi; Huang, Yong-Ming
2016-01-01
The learning of plants has garnered considerable attention in recent years, but students often lack the motivation to learn about the process of plant growth. Also, students are not able to apply what they have learned in class in the form of observation, since plant growth takes a long time. In this study, we use augmented reality (AR) technology…
Real-time 3D image reconstruction guidance in liver resection surgery
Nicolau, Stephane; Pessaux, Patrick; Mutter, Didier; Marescaux, Jacques
2014-01-01
Background Minimally invasive surgery represents one of the main evolutions of surgical techniques. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. Methods From a patient’s medical image [US, computed tomography (CT) or MRI], we have developed an Augmented Reality (AR) system that increases the surgeon’s intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We have then developed several 3D visualization and surgical planning software tools to combine direct volume rendering and surface rendering. Finally, we have developed two registration techniques, one interactive and one automatic providing intraoperative augmented reality view. Results From January 2009 to June 2013, 769 clinical cases have been modeled by the Visible Patient service. Moreover, three clinical validations have been realized demonstrating the accuracy of 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures have been realized illustrating the potential clinical benefit of such assistance to gain safety, but also current limits that automatic augmented reality will overcome. Conclusions Virtual patient modeling should be mandatory for certain interventions that have now to be defined, such as liver surgery. Augmented reality is clearly the next step of the new surgical instrumentation but remains currently limited due to the complexity of organ deformations during surgery. Intraoperative medical imaging used in new generation of automated augmented reality should solve this issue thanks to the development of Hybrid OR. PMID:24812598
A Framework for Designing Collaborative Learning Environments Using Mobile AR
ERIC Educational Resources Information Center
Cochrane, Thomas; Narayan, Vickel; Antonczak, Laurent
2016-01-01
Smartphones provide a powerful platform for augmented reality (AR). Using a smartphone's camera together with the built in GPS, compass, gyroscope, and touch screen enables the real world environment to be overlaid with contextual digital information. The creation of mobile AR environments is relatively simple, with the development of mobile AR…
Quantifying attention shifts in augmented reality image-guided neurosurgery
Drouin, Simon; Collins, D. Louis; Popa, Tiberiu; Kersten-Oertel, Marta
2017-01-01
Image-guided surgery (IGS) has allowed for more minimally invasive procedures, leading to better patient outcomes, reduced risk of infection, less pain, shorter hospital stays and faster recoveries. One drawback that has emerged with IGS is that the surgeon must shift their attention from the patient to the monitor for guidance. Yet both cognitive and motor tasks are negatively affected by attention shifts. Augmented reality (AR), which merges the real-world surgical scene with preoperative virtual patient images and plans, has been proposed as a solution to this drawback. In this work, we studied the impact of two different types of AR IGS set-ups (mobile AR and desktop AR) and traditional navigation on attention shifts for the specific task of craniotomy planning. We found a significant difference in the time taken to perform the task and in attention shifts between traditional navigation and the AR set-ups, but no significant difference between the different AR set-ups. With mobile AR, however, users felt that the system was easier to use and that their performance was better. These results suggest that regardless of where the AR visualisation is shown to the surgeon, AR may reduce attention shifts, leading to more streamlined and focused procedures. PMID:29184663
Quantifying attention shifts in augmented reality image-guided neurosurgery.
Léger, Étienne; Drouin, Simon; Collins, D Louis; Popa, Tiberiu; Kersten-Oertel, Marta
2017-10-01
Image-guided surgery (IGS) has allowed for more minimally invasive procedures, leading to better patient outcomes, reduced risk of infection, less pain, shorter hospital stays and faster recoveries. One drawback that has emerged with IGS is that the surgeon must shift their attention from the patient to the monitor for guidance. Yet both cognitive and motor tasks are negatively affected by attention shifts. Augmented reality (AR), which merges the real-world surgical scene with preoperative virtual patient images and plans, has been proposed as a solution to this drawback. In this work, we studied the impact of two different types of AR IGS set-ups (mobile AR and desktop AR) and traditional navigation on attention shifts for the specific task of craniotomy planning. We found a significant difference in the time taken to perform the task and in attention shifts between traditional navigation and the AR set-ups, but no significant difference between the different AR set-ups. With mobile AR, however, users felt that the system was easier to use and that their performance was better. These results suggest that regardless of where the AR visualisation is shown to the surgeon, AR may reduce attention shifts, leading to more streamlined and focused procedures.
Kobayashi, Leo; Zhang, Xiao Chi; Collins, Scott A; Karim, Naz; Merck, Derek L
2018-01-01
Augmented reality (AR), mixed reality (MR), and virtual reality devices are enabling technologies that may facilitate effective communication in healthcare between those with information and knowledge (clinician/specialist; expert; educator) and those seeking understanding and insight (patient/family; non-expert; learner). Investigators initiated an exploratory program to enable the study of AR/MR use-cases in acute care clinical and instructional settings. Academic clinician educators, computer scientists, and diagnostic imaging specialists conducted a proof-of-concept project to 1) implement a core holoimaging pipeline infrastructure and open-access repository at the study institution, and 2) use novel AR/MR techniques on off-the-shelf devices with holoimages generated by the infrastructure to demonstrate their potential role in the instructive communication of complex medical information. The study team successfully developed a medical holoimaging infrastructure methodology to identify, retrieve, and manipulate real patients' de-identified computed tomography and magnetic resonance imagesets for rendering, packaging, transfer, and display of modular holoimages onto AR/MR headset devices and connected displays. Holoimages containing key segmentations of cervical and thoracic anatomic structures and pathology were overlaid and registered onto physical task trainers for simulation-based "blind insertion" invasive procedural training. During the session, learners experienced and used task-relevant anatomic holoimages for central venous catheter and tube thoracostomy insertion training with enhanced visual cues and haptic feedback. Direct instructor access into the learner's AR/MR headset view of the task trainer was achieved for visual-axis interactive instructional guidance. Investigators implemented a core holoimaging pipeline infrastructure and modular open-access repository to generate and enable access to modular holoimages during exploratory pilot stage applications for invasive procedure training that featured innovative AR/MR techniques on off-the-shelf headset devices.
Exploratory Application of Augmented Reality/Mixed Reality Devices for Acute Care Procedure Training
Kobayashi, Leo; Zhang, Xiao Chi; Collins, Scott A.; Karim, Naz; Merck, Derek L.
2018-01-01
Introduction Augmented reality (AR), mixed reality (MR), and virtual reality devices are enabling technologies that may facilitate effective communication in healthcare between those with information and knowledge (clinician/specialist; expert; educator) and those seeking understanding and insight (patient/family; non-expert; learner). Investigators initiated an exploratory program to enable the study of AR/MR use-cases in acute care clinical and instructional settings. Methods Academic clinician educators, computer scientists, and diagnostic imaging specialists conducted a proof-of-concept project to 1) implement a core holoimaging pipeline infrastructure and open-access repository at the study institution, and 2) use novel AR/MR techniques on off-the-shelf devices with holoimages generated by the infrastructure to demonstrate their potential role in the instructive communication of complex medical information. Results The study team successfully developed a medical holoimaging infrastructure methodology to identify, retrieve, and manipulate real patients’ de-identified computed tomography and magnetic resonance imagesets for rendering, packaging, transfer, and display of modular holoimages onto AR/MR headset devices and connected displays. Holoimages containing key segmentations of cervical and thoracic anatomic structures and pathology were overlaid and registered onto physical task trainers for simulation-based “blind insertion” invasive procedural training. During the session, learners experienced and used task-relevant anatomic holoimages for central venous catheter and tube thoracostomy insertion training with enhanced visual cues and haptic feedback. Direct instructor access into the learner’s AR/MR headset view of the task trainer was achieved for visual-axis interactive instructional guidance. Conclusion Investigators implemented a core holoimaging pipeline infrastructure and modular open-access repository to generate and enable access to modular holoimages during exploratory pilot stage applications for invasive procedure training that featured innovative AR/MR techniques on off-the-shelf headset devices. PMID:29383074
Preliminary development of augmented reality systems for spinal surgery
NASA Astrophysics Data System (ADS)
Nguyen, Nhu Q.; Ramjist, Joel M.; Jivraj, Jamil; Jakubovic, Raphael; Deorajh, Ryan; Yang, Victor X. D.
2017-02-01
Surgical navigation has been more actively deployed in open spinal surgeries due to the need for improved precision during procedures. This is increasingly difficult in minimally invasive surgeries due to the lack of visual cues caused by smaller exposure sites, which increases a surgeon's dependence on their knowledge of anatomical landmarks as well as the CT or MRI images. The use of augmented reality (AR) systems and registration technologies in spinal surgeries could allow for improvements to techniques by overlaying a 3D reconstruction of patient anatomy in the surgeon's field of view, creating a mixed reality visualization. The AR system will be capable of projecting the 3D reconstruction onto a field of view and of performing preliminary object tracking on a phantom. Dimensional accuracy of the mixed media will also be quantified to account for distortions in tracking.
Distributed augmented reality with 3-D lung dynamics--a planning tool concept.
Hamza-Lup, Felix G; Santhanam, Anand P; Imielińska, Celina; Meeks, Sanford L; Rolland, Jannick P
2007-01-01
Augmented reality (AR) systems add visual information to the world by using advanced display techniques. The advances in miniaturization and reduced hardware costs make some of these systems feasible for applications in a wide set of fields. We present a potential component of the cyber infrastructure for the operating room of the future: a distributed AR-based software-hardware system that allows real-time visualization of three-dimensional (3-D) lung dynamics superimposed directly on the patient's body. Several emergency events (e.g., closed and tension pneumothorax) and surgical procedures related to lung (e.g., lung transplantation, lung volume reduction surgery, surgical treatment of lung infections, lung cancer surgery) could benefit from the proposed prototype.
Miragall, Marta; Baños, Rosa M.; Cebolla, Ausiàs; Botella, Cristina
2015-01-01
This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adaptation to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, Mage = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman’s Correlations and Multiple Regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency and adequate convergent validity. “Not changed” patients scored lower on the WAI-VAR than “improved” and “recovered” patients. Correlation between the WAI-VAR and CSC was moderate. The best fitting model for predicting CSC was a linear combination of the TA with therapist (WAI-S) and the TA with VR and AR (WAI-VAR), due to the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in the therapy. PMID:26500589
Miragall, Marta; Baños, Rosa M; Cebolla, Ausiàs; Botella, Cristina
2015-01-01
This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adaptation to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, M age = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman's Correlations and Multiple Regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency and adequate convergent validity. "Not changed" patients scored lower on the WAI-VAR than "improved" and "recovered" patients. Correlation between the WAI-VAR and CSC was moderate. The best fitting model for predicting CSC was a linear combination of the TA with therapist (WAI-S) and the TA with VR and AR (WAI-VAR), due to the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in the therapy.
Huang, Cynthia Y; Thomas, Jonathan B; Alismail, Abdullah; Cohen, Avi; Almutairi, Waleed; Daher, Noha S; Terry, Michael H; Tan, Laren D
2018-01-01
Objective The aim of this study was to investigate the feasibility of using augmented reality (AR) glasses in central line simulation by novice operators and compare its efficacy to standard central line simulation/teaching. Design This was a prospective randomized controlled study enrolling 32 novice operators. Subjects were randomized on a 1:1 basis to either simulation using the augmented virtual reality glasses or simulation using conventional instruction. Setting The study was conducted in tertiary-care urban teaching hospital. Subjects A total of 32 adult novice central line operators with no visual or auditory impairments were enrolled. Medical doctors, respiratory therapists, and sleep technicians were recruited from the medical field. Measurements and main results The mean time for AR placement in the AR group was 71±43 s, and the time to internal jugular (IJ) cannulation was 316±112 s. There was no significant difference in median (minimum, maximum) time (seconds) to IJ cannulation for those who were in the AR group and those who were not (339 [130, 550] vs 287 [35, 475], p=0.09), respectively. There was also no significant difference between the two groups in median total procedure time (524 [329, 792] vs 469 [198, 781], p=0.29), respectively. There was a significant difference in the adherence level between the two groups favoring the AR group (p=0.003). Conclusion AR simulation of central venous catheters in manikins is feasible and efficacious in novice operators as an educational tool. Future studies are recommended in this area as it is a promising area of medical education. PMID:29785148
Augmented reality in bone tumour resection: An experimental study.
Cho, H S; Park, Y K; Gupta, S; Yoon, C; Han, I; Kim, H-S; Choi, H; Hong, J
2017-03-01
We evaluated the accuracy of augmented reality (AR)-based navigation assistance through simulation of bone tumours in a pig femur model. We developed an AR-based navigation system for bone tumour resection, which could be used on a tablet PC. To simulate a bone tumour in the pig femur, a cortical window was made in the diaphysis and bone cement was inserted. A total of 133 pig femurs were used and tumour resection was simulated with AR-assisted resection (164 resection in 82 femurs, half by an orthropaedic oncology expert and half by an orthopaedic resident) and resection with the conventional method (82 resection in 41 femurs). In the conventional group, resection was performed after measuring the distance from the edge of the condyle to the expected resection margin with a ruler as per routine clinical practice. The mean error of 164 resections in 82 femurs in the AR group was 1.71 mm (0 to 6). The mean error of 82 resections in 41 femurs in the conventional resection group was 2.64 mm (0 to 11) (p < 0.05, one-way analysis of variance). The probabilities of a surgeon obtaining a 10 mm surgical margin with a 3 mm tolerance were 90.2% in AR-assisted resections, and 70.7% in conventional resections. We demonstrated that the accuracy of tumour resection was satisfactory with the help of the AR navigation system, with the tumour shown as a virtual template. In addition, this concept made the navigation system simple and available without additional cost or time. Cite this article: H. S. Cho, Y. K. Park, S. Gupta, C. Yoon, I. Han, H-S. Kim, H. Choi, J. Hong. Augmented reality in bone tumour resection: An experimental study. Bone Joint Res 2017;6:137-143. © 2017 Cho et al.
Augmented reality in bone tumour resection
Park, Y. K.; Gupta, S.; Yoon, C.; Han, I.; Kim, H-S.; Choi, H.; Hong, J.
2017-01-01
Objectives We evaluated the accuracy of augmented reality (AR)-based navigation assistance through simulation of bone tumours in a pig femur model. Methods We developed an AR-based navigation system for bone tumour resection, which could be used on a tablet PC. To simulate a bone tumour in the pig femur, a cortical window was made in the diaphysis and bone cement was inserted. A total of 133 pig femurs were used and tumour resection was simulated with AR-assisted resection (164 resection in 82 femurs, half by an orthropaedic oncology expert and half by an orthopaedic resident) and resection with the conventional method (82 resection in 41 femurs). In the conventional group, resection was performed after measuring the distance from the edge of the condyle to the expected resection margin with a ruler as per routine clinical practice. Results The mean error of 164 resections in 82 femurs in the AR group was 1.71 mm (0 to 6). The mean error of 82 resections in 41 femurs in the conventional resection group was 2.64 mm (0 to 11) (p < 0.05, one-way analysis of variance). The probabilities of a surgeon obtaining a 10 mm surgical margin with a 3 mm tolerance were 90.2% in AR-assisted resections, and 70.7% in conventional resections. Conclusion We demonstrated that the accuracy of tumour resection was satisfactory with the help of the AR navigation system, with the tumour shown as a virtual template. In addition, this concept made the navigation system simple and available without additional cost or time. Cite this article: H. S. Cho, Y. K. Park, S. Gupta, C. Yoon, I. Han, H-S. Kim, H. Choi, J. Hong. Augmented reality in bone tumour resection: An experimental study. Bone Joint Res 2017;6:137–143. PMID:28258117
Real-time geometry-aware augmented reality in minimally invasive surgery.
Chen, Long; Tang, Wen; John, Nigel W
2017-10-01
The potential of augmented reality (AR) technology to assist minimally invasive surgery (MIS) lies in its computational performance and accuracy in dealing with challenging MIS scenes. Even with the latest hardware and software technologies, achieving both real-time and accurate augmented information overlay in MIS is still a formidable task. In this Letter, the authors present a novel real-time AR framework for MIS that achieves interactive geometry-aware AR in endoscopic surgery with stereo views. The authors' framework tracks the movement of the endoscopic camera and simultaneously reconstructs a dense geometric mesh of the MIS scene. The movement of the camera is predicted by minimising the re-projection error to achieve fast tracking performance, while the three-dimensional mesh is incrementally built by a dense zero-mean normalised cross-correlation stereo-matching method to improve the accuracy of the surface reconstruction. The proposed system does not require any prior template or pre-operative scan and can infer the geometric information intra-operatively in real time. With the geometric information available, the proposed AR framework is able to interactively add annotations, localise tumours and vessels, and apply measurement labelling with greater precision and accuracy than state-of-the-art approaches.
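The zero-mean normalised cross-correlation (ZNCC) measure mentioned in the abstract can be illustrated with a minimal, unoptimised sketch of patch matching along an epipolar line of a rectified stereo pair. The patch size and disparity range are assumptions, and the authors' incremental mesh reconstruction is not shown; this only illustrates the matching score itself:

```python
import numpy as np

def zncc(patch_a, patch_b, eps=1e-6):
    """Zero-mean normalised cross-correlation between two equally sized patches."""
    a = patch_a.astype(np.float64) - patch_a.mean()
    b = patch_b.astype(np.float64) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + eps
    return float((a * b).sum() / denom)

def match_along_row(left, right, y, x, half=5, max_disp=64):
    """Find the disparity of pixel (x, y) in a rectified stereo pair by ZNCC.

    Scans candidate positions on the same row of the right image and returns
    the disparity with the highest correlation score. Assumes (x, y) is far
    enough from the image border for the patch window to fit.
    """
    ref = left[y - half:y + half + 1, x - half:x + half + 1]
    best_d, best_score = 0, -1.0
    for d in range(0, max_disp):
        xr = x - d
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1]
        score = zncc(ref, cand)
        if score > best_score:
            best_d, best_score = d, score
    return best_d, best_score
```

Because ZNCC subtracts the patch means and normalises by their energies, it is insensitive to the local brightness and contrast changes that are common between endoscopic stereo views, which is why it is a popular matching cost in this setting.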
Augmented reality in neurosurgery: a systematic review.
Meola, Antonio; Cutolo, Fabrizio; Carbone, Marina; Cagnazzo, Federico; Ferrari, Mauro; Ferrari, Vincenzo
2017-10-01
Neuronavigation has become an essential neurosurgical tool in pursuing minimal invasiveness and maximal safety, even though it has several technical limitations. Augmented reality (AR) neuronavigation is a significant advance, providing a real-time updated 3D virtual model of anatomical details, overlaid on the real surgical field. Currently, only a few AR systems have been tested in a clinical setting. The aim is to review such devices. We performed a PubMed search of reports restricted to human studies of in vivo applications of AR in any neurosurgical procedure using the search terms "Augmented reality" and "Neurosurgery." Eligibility assessment was performed independently by two reviewers in an unblinded standardized manner. The systems were qualitatively evaluated on the basis of the following: neurosurgical subspecialty of application, pathology of treated lesions and lesion locations, real data source, virtual data source, tracking modality, registration technique, visualization processing, display type, and perception location. Eighteen studies were included during the period 1996 to September 30, 2015. The AR systems were grouped by the real data source: microscope (8), hand- or head-held cameras (4), direct patient view (2), endoscope (1), and X-ray fluoroscopy (1) head-mounted display (1). A total of 195 lesions were treated: 75 (38.46 %) were neoplastic, 77 (39.48 %) neurovascular, and 1 (0.51 %) hydrocephalus, and 42 (21.53 %) were undetermined. Current literature confirms that AR is a reliable and versatile tool when performing minimally invasive approaches in a wide range of neurosurgical diseases, although prospective randomized studies are not yet available and technical improvements are needed.
NASA Astrophysics Data System (ADS)
Dastageeri, H.; Storz, M.; Koukofikis, A.; Knauth, S.; Coors, V.
2016-09-01
Providing mobile location-based information for pedestrians faces many challenges. On the one hand, the accuracy of localisation indoors and outdoors is restricted by the technical limitations of GPS and Beacons. On the other hand, only a small display is available to present information and to build a user interface. In addition, the software solution has to consider the hardware characteristics of mobile devices during implementation in order to achieve performance with minimum latency. This paper describes our approach, which combines image tracking with GPS or Beacons to ensure orientation and precise localisation. To communicate the information on Points of Interest (POIs), we chose Augmented Reality (AR). For this concept of operations, we used not only the display but also the acceleration and position sensors as a user interface. The paper goes into detail on the optimization of the image tracking algorithms, the development of the video-based AR player for the Android platform, and the evaluation of videos as an AR element with regard to providing a good user experience. To set up content for the POIs or even generate a tour, we used and extended the Open Geospatial Consortium (OGC) standard Augmented Reality Markup Language (ARML).
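As a hedged sketch of the kind of source-selection logic such a player might use when image tracking, GPS, and Beacon fixes are each only intermittently available (the data class, thresholds, and preference order below are assumptions, not the authors' design):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PoseEstimate:
    position: Tuple[float, float]   # e.g. metres in a local map frame
    heading_deg: float              # device heading
    accuracy_m: float               # estimated positional error of this source

def select_pose(image_pose: Optional[PoseEstimate],
                gps_pose: Optional[PoseEstimate],
                beacon_pose: Optional[PoseEstimate],
                max_acceptable_error_m: float = 5.0) -> Optional[PoseEstimate]:
    """Pick the most precise available localisation source for anchoring AR content.

    Image tracking is preferred whenever a target is recognised, since it gives
    orientation as well as position; otherwise fall back to whichever of GPS
    (outdoors) or Beacons (indoors) currently reports the best accuracy.
    """
    if image_pose is not None:
        return image_pose
    candidates = [p for p in (gps_pose, beacon_pose) if p is not None]
    if not candidates:
        return None
    best = min(candidates, key=lambda p: p.accuracy_m)
    return best if best.accuracy_m <= max_acceptable_error_m else None
```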
Applying an AR Technique to Enhance Situated Heritage Learning in a Ubiquitous Learning Environment
ERIC Educational Resources Information Center
Chang, Yi Hsing; Liu, Jen-ch'iang
2013-01-01
Since AR can display 3D materials and learner motivation is enhanced in a situated learning environment, this study explores the learning effectiveness of learners when combining AR technology and the situation learning theory. Based on the concept of embedding the characteristics of augmented reality and situated learning into a real situation to…
Learning Protein Structure with Peers in an AR-Enhanced Learning Environment
ERIC Educational Resources Information Center
Chen, Yu-Chien
2013-01-01
Augmented reality (AR) is an interactive system that allows users to interact with virtual objects and the real world at the same time. The purpose of this dissertation was to explore how AR, as a new visualization tool, that can demonstrate spatial relationships by representing three dimensional objects and animations, facilitates students to…
Applications of Augmented Reality in Informal Science Learning Sites: a Review
NASA Astrophysics Data System (ADS)
Goff, Eric E.; Mulvey, Kelly Lynn; Irvin, Matthew J.; Hartstone-Rose, Adam
2018-05-01
The importance of increasing interest in the STEM disciplines has been noted in a number of recent national reports. While many previous studies have focused on such efforts inside the formal classroom, comparatively few have looked closely at informal learning environments. We investigate the innovative use of technology in informal learning by reviewing research on the incorporation of augmented reality (AR) at exhibit-based informal science education (ISE) settings in the literature. We report on the common STEM-focused topics that are covered by current AR applications for ISE learning, as well as the different devices used to support these applications. Additionally, we report on the prevalence of positive learning outcomes and engagement factors commonly associated with the use of AR applications in informal environments. This review aims to foster continued development and implementation of AR technology in exhibit-based ISE settings by informing the community of recent findings and promoting additional rigorous research for the future.
Spacecraft 3D Augmented Reality Mobile App
NASA Technical Reports Server (NTRS)
Hussey, Kevin J.; Doronila, Paul R.; Kumanchik, Brian E.; Chan, Evan G.; Ellison, Douglas J.; Boeck, Andrea; Moore, Justin M.
2013-01-01
The Spacecraft 3D application allows users to learn about and interact with iconic NASA missions in a new and immersive way using common mobile devices. Using Augmented Reality (AR) techniques to project 3D renditions of the mission spacecraft into real-world surroundings, users can interact with and learn about Curiosity, GRAIL, Cassini, and Voyager. Additional updates on future missions, animations, and information will be ongoing. Using a printed AR Target and camera on a mobile device, users can get up close with these robotic explorers, see how some move, and learn about these engineering feats, which are used to expand knowledge and understanding about space. The software receives input from the mobile device's camera to recognize the presence of an AR marker in the camera's field of view. It then displays a 3D rendition of the selected spacecraft in the user's physical surroundings, on the mobile device's screen, while it tracks the device's movement in relation to the physical position of the spacecraft's 3D image on the AR marker.
Augmented Reality Tool for the Situational Awareness Improvement of UAV Operators
Ruano, Susana; Cuevas, Carlos; Gallego, Guillermo; García, Narciso
2017-01-01
Unmanned Aerial Vehicles (UAVs) are being extensively used nowadays. Therefore, pilots of traditional aerial platforms should adapt their skills to operate them from a Ground Control Station (GCS). Common GCSs provide information in separate screens: one presents the video stream while the other displays information about the mission plan and information coming from other sensors. To avoid the burden of fusing information displayed in the two screens, an Augmented Reality (AR) tool is proposed in this paper. The AR system has two functionalities for Medium-Altitude Long-Endurance (MALE) UAVs: route orientation and target identification. Route orientation allows the operator to identify the upcoming waypoints and the path that the UAV is going to follow. Target identification allows a fast target localization, even in the presence of occlusions. The AR tool is implemented following the North Atlantic Treaty Organization (NATO) standards so that it can be used in different GCSs. The experiments show how the AR tool improves significantly the situational awareness of the UAV operators. PMID:28178189
Affordances of Augmented Reality in Science Learning: Suggestions for Future Research
NASA Astrophysics Data System (ADS)
Cheng, Kun-Hung; Tsai, Chin-Chung
2013-08-01
Augmented reality (AR) is currently considered as having potential for pedagogical applications. However, in science education, research regarding AR-aided learning is in its infancy. To understand how AR could help science learning, this review paper first identifies two major approaches to utilizing AR technology in science education, namely image-based AR and location-based AR. These approaches may result in different affordances for science learning. It is then found that students' spatial ability, practical skills, and conceptual understanding are often afforded by image-based AR, while location-based AR usually supports inquiry-based scientific activities. After examining what has been done in science learning with AR support, several suggestions for future research are proposed. For example, more research is required to explore the learning experience (e.g., motivation or cognitive load) and learner characteristics (e.g., spatial ability or perceived presence) involved in AR. Mixed methods of investigating the learning process (e.g., a content analysis and a sequential analysis) and in-depth examination of user experience beyond usability (e.g., affective variables of esthetic pleasure or emotional fulfillment) should be considered. Combining image-based and location-based AR technology may bring new possibilities for supporting science learning. Theories including mental models, spatial cognition, situated cognition, and social constructivist learning are suggested for profitable uses of future AR research in science education.
Weidert, S; Wang, L; von der Heide, A; Navab, N; Euler, E
2012-03-01
The intraoperative application of augmented reality (AR) has so far mainly taken place in the field of endoscopy. Here, the camera image of the endoscope was augmented by computer graphics derived mostly from preoperative imaging. Due to the complex setup and operation of the devices, they have not yet become part of routine clinical practice. The Camera Augmented Mobile C-arm (CamC), which extends a classic C-arm with a video camera and mirror construction, is characterized by its uncomplicated handling. It combines its live video stream with the acquired X-ray image in a geometrically correct way. The clinical application of the device in 43 cases showed the strengths of the device in positioning for X-ray acquisition, incision placement, K-wire placement, and instrument guidance. With its new function and its easy integration into the OR workflow of any procedure that requires X-ray imaging, the CamC has the potential to become the first widely used AR technology for orthopedic and trauma surgery.
Development of a mobile borehole investigation software using augmented reality
NASA Astrophysics Data System (ADS)
Son, J.; Lee, S.; Oh, M.; Yun, D. E.; Kim, S.; Park, H. D.
2015-12-01
Augmented reality (AR) is one of the most rapidly developing technologies in the smartphone and IT areas. While various applications have been developed using AR, only a few geological applications take advantage of it. In this study, a smartphone application to manage boreholes using AR has been developed. The application consists of three major modules: an AR module, a map module and a data management module. The AR module calculates the orientation of the device and uses it to display nearby boreholes distributed in three dimensions. This module shows the boreholes in a transparent layer on a live camera screen so the user can find them and understand the overall characteristics of the underground geology. The map module displays the boreholes on a 2D map to show their distribution and the location of the user. The database module uses the SQLite library, which has characteristics well suited to mobile platforms, and Binary XML is adopted to allow additional customized data to be stored. The application is able to provide underground information in an intuitive and refined form and to decrease the time and equipment required for geological field investigations.
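The core job of the AR module described above is to turn the device's position and orientation, plus the borehole coordinates, into overlay positions on the camera view. A minimal sketch of that geometry is given below; it assumes simple great-circle math and a crude horizontal projection, and is illustrative rather than the authors' implementation.

```python
# Hedged sketch of AR overlay geometry: distance/bearing from the device to a
# borehole, and the horizontal screen position at which to draw it.
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                      math.cos(p1) * math.sin(p2) -
                                      math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, bearing % 360

def screen_x(bearing, heading, fov_deg=60, width_px=1080):
    """Horizontal pixel where a borehole at `bearing` appears, given the device heading."""
    rel = (bearing - heading + 180) % 360 - 180   # signed angle in [-180, 180)
    if abs(rel) > fov_deg / 2:
        return None                               # outside the camera's field of view
    return int(width_px / 2 + rel / fov_deg * width_px)
```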
Display technologies for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Lee, Seungjae; Jang, Changwon; Hong, Jong-Young; Li, Gang
2018-02-01
With the virtue of rapid progress in optics, sensors, and computer science, we are witnessing commercial products and prototypes for augmented reality (AR) penetrating the consumer market. AR is in the spotlight because it is expected to provide a much more immersive and realistic experience than ordinary displays. However, there are several barriers to be overcome for successful commercialization of AR. Here, we explore challenging and important topics for AR such as image combiners, enhancement of display performance, and focus cue reproduction. Image combiners are essential to integrate virtual images with the real world. Display performance (e.g. field of view and resolution) is important for a more immersive experience, and focus cue reproduction may mitigate visual fatigue caused by the vergence-accommodation conflict. We also demonstrate emerging technologies to overcome these issues: the index-matched anisotropic crystal lens (IMACL), retinal projection displays, and 3D displays with focus cues. For image combiners, a novel optical element called the IMACL provides a relatively wide field of view. Retinal projection displays may enhance the field of view and resolution of AR displays. Focus cues could be reconstructed via multi-layer displays and holographic displays. Experimental results of our prototypes are explained.
The segmentation of the HMD market: optics for smart glasses, smart eyewear, AR and VR headsets
NASA Astrophysics Data System (ADS)
Kress, Bernard; Saeedi, Ehsan; Brac-de-la-Perriere, Vincent
2014-09-01
This paper reviews the various optical technologies that have been developed to implement HMDs (Head Mounted Displays), both as AR (Augmented Reality) and VR (Virtual Reality) devices and, more recently, as smart glasses, smart eyewear or connected glasses. We review the typical requirements and optical performances of such devices, categorize them into distinct groups suited to different (and constantly evolving) market segments, and analyze such market segmentation.
Multithreaded hybrid feature tracking for markerless augmented reality.
Lee, Taehee; Höllerer, Tobias
2009-01-01
We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.
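To make the hybrid tracking idea concrete, here is a deliberately simplified, single-threaded sketch: detect distinctive features only occasionally (SIFT is used here as a stand-in for the paper's invariant features) and propagate them frame-to-frame with Lucas-Kanade optical flow. The paper runs these stages in synchronized threads; this sequential version only illustrates the data flow and assumes an OpenCV build with SIFT available.

```python
# Hedged sketch of hybrid feature tracking: occasional SIFT detection plus
# per-frame optical flow. Not the paper's multithreaded architecture.
import cv2
import numpy as np

sift = cv2.SIFT_create()
prev_gray, points = None, None

def process(frame, redetect=False):
    global prev_gray, points
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if points is None or redetect or len(points) < 50:
        # Slow path: run the distinctive-feature detector only when needed
        kps = sift.detect(gray, None)
        points = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)
    elif prev_gray is not None:
        # Fast path: frame-to-frame tracking of the existing features
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
        points = nxt[status.ravel() == 1].reshape(-1, 1, 2)
    prev_gray = gray
    return points
```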
Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.
Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson
2016-04-04
We proposed a switchable beam steering device with cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts the incident circularly polarized light to the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading 2-3 CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we proposed two approaches to obtain the same diffraction angle for red, green, and blue LEDs-based full color displays. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for the emerging VR/AR displays.
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Augmented reality on poster presentations, in the field and in the classroom
NASA Astrophysics Data System (ADS)
Hawemann, Friedrich; Kolawole, Folarin
2017-04-01
Augmented reality (AR) is the direct addition of virtual information to a real-world environment through an interface. In practice, through a mobile device such as a tablet or smartphone, information can be projected onto a target, for example an image on a poster. Mobile devices are so widely distributed today that augmented reality is easily accessible to almost everyone. Numerous studies have shown that multi-dimensional visualization is essential for efficient perception of the spatial, temporal and geometrical configuration of geological structures and processes. Print media, such as posters and handouts, lack the ability to display content in the third and fourth dimensions, which might be in the space domain, as seen in three-dimensional (3-D) objects, or in the time domain (four-dimensional, 4-D), expressible in the form of videos. Here, we show that augmented reality content can be complementary to geoscience poster presentations, hands-on material and fieldwork. In the latter example, location-based data is loaded so that, for example, a virtual geological profile can be draped over a real-world landscape. In object-based AR, the application is trained to recognize an image or object through the camera of the user's mobile device, such that specific content is automatically downloaded, displayed on the screen of the device, and positioned relative to the trained image or object. We used ZapWorks, a commercially available software application, to create and present examples of poster-based content in which important supplementary information is presented as interactive virtual images, videos and 3-D models. We suggest that the flexibility and real-time interactivity offered by AR make it an invaluable tool for effective geoscience poster presentation, classroom teaching and field geoscience learning.
NASA Astrophysics Data System (ADS)
Lam, Meng Chun; Nizam, Siti Soleha Muhammad; Arshad, Haslina; A'isyah Ahmad Shukri, Saidatul; Hashim, Nurhazarifah Che; Putra, Haekal Mozzia; Abidin, Rimaniza Zainal
2017-10-01
This article discusses the usability of an interactive application for halal products using Optical Character Recognition (OCR) and Augmented Reality (AR) technologies. Among the problems identified in this study is that consumers have little knowledge about E-Codes. Therefore, users often have doubts about the halal status of a product. Nowadays, the integrity of halal status can be doubtful due to the actions of some irresponsible people spreading false information about a product. Therefore, the application developed in this study, which uses OCR and AR technology, helps users identify the information content of a product by scanning the E-Code label, and to learn the halal status of the product by scanning the product's brand. In this application, the E-Code on the label of a product is scanned using OCR technology to display information about the E-Code. The product's brand is scanned using augmented reality technology to display the halal status of the product. The findings reveal that users are satisfied with this application and find it useful and easy to use.
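As a rough illustration of the OCR half of such a pipeline, the sketch below reads E-codes from a label image and looks them up in a small local table. pytesseract is used as a stand-in OCR engine and the table entries are generic additive descriptions for illustration; neither reflects the authors' actual implementation or database.

```python
# Hedged sketch: extract E-codes from a label photo and return stored descriptions.
import re
import pytesseract
from PIL import Image

# Illustrative additive descriptions only; a real system would query a vetted database.
E_CODES = {
    "E100": ("Curcumin", "typically plant-derived"),
    "E120": ("Cochineal / carminic acid", "insect-derived"),
    "E471": ("Mono- and diglycerides of fatty acids", "source must be verified"),
}

def scan_label(path):
    """OCR the label image and return information for every E-code found."""
    text = pytesseract.image_to_string(Image.open(path))
    found = set(re.findall(r"E\d{3,4}", text.upper()))
    return {code: E_CODES.get(code, ("unknown", "no information")) for code in found}
```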
Immersive realities: articulating the shift from VR to mobile AR through artistic practice
NASA Astrophysics Data System (ADS)
Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.
2012-03-01
Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks which strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality rather than sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial and situational properties, mobile AR offers a new form for remixing elements from traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.
Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J.; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng
2017-01-01
Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the “integrated image” on semi-transparent glass. Via the registration process, the integral image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications. PMID:28198442
Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng
2017-02-15
Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the "integrated image" on semi-transparent glass. Via the registration process, the integral image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications.
Establishment and Usability Evaluation of an Interactive AR Learning System on Conservation of Fish
ERIC Educational Resources Information Center
Lin, Hao-Chiang Koong; Hsieh, Min-Chai; Wang, Cheng-Hung; Sie, Zong-Yuan; Chang, Shei-Hsi
2011-01-01
In this study, we develop an interactive AR learning system based on Augmented Reality and an interactive touch-screen. The learning content concerns the conservation of fish in Taiwan. The system combines a game with the concept of an AR book, which allows children to learn about the importance of fish conservation. A mechanism is designed to…
NASA Astrophysics Data System (ADS)
Shang, Low Wei; Siang, Tan Gek; Zakaria, Mohd Hafiz bin; Emran, Muhammad Helmy
2017-04-01
Augmented reality (AR) technology has undergone enormous advancement, and AR applications can now be seamlessly executed using modern-day smartphones. This study aims to develop a mobile AR application which consists of 3D AR models of historical monuments located within the UNESCO World Heritage Site in Melaka. The application allows tourists to obtain information about the monuments from the AR models, providing an alternative to visiting the actual monuments that helps prevent overcrowding and promotes heritage preservation. Perceived Usefulness (PU), Perceived Ease of Use (PEU), Facilitating Conditions (FC), and Perceived Playfulness (PP) are proposed as the determinants of users' Behavioural Intention to Use (BI) the application. Using 50 tourists in Melaka as respondents, a pilot study was conducted to determine users' acceptance of the AR mobile application based on the Unified Theory of Acceptance and Use of Technology (UTAUT). Cronbach's Alpha tests validated the internal consistency of the measures. Multiple Linear Regression analysis suggested that the proposed determinants explained 51.2% of the variance in users' BI to use the application. PU was the strongest determinant, followed by FC, while PEU and PP were found to be insignificant.
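The reported analysis (regressing Behavioural Intention on the four proposed determinants and reading off the explained variance) can be reproduced in outline as below. The data file and column names are hypothetical; only the modelling step mirrors what the abstract describes.

```python
# Hedged sketch of the multiple linear regression used in UTAUT-style studies.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("utaut_survey.csv")              # hypothetical file of Likert-scale scores
X = sm.add_constant(df[["PU", "PEU", "FC", "PP"]])  # the four proposed determinants
model = sm.OLS(df["BI"], X).fit()

print(model.rsquared)     # proportion of variance in BI explained (the paper reports 0.512)
print(model.summary())    # per-determinant coefficients and significance
```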
Augmented reality user interface for mobile ground robots with manipulator arms
NASA Astrophysics Data System (ADS)
Vozar, Steven; Tilbury, Dawn M.
2011-01-01
Augmented Reality (AR) is a technology in which real-world visual data is combined with an overlay of computer graphics, enhancing the original feed. AR is an attractive tool for teleoperated unmanned ground vehicle (UGV) user interfaces (UIs), as it can improve communication between robots and users via an intuitive spatial and visual dialogue, thereby increasing operator situational awareness. The successful operation of UGVs often relies upon both chassis navigation and manipulator arm control, and since existing literature usually focuses on one task or the other, there is a gap in mobile robot UIs that take advantage of AR for both applications. This work describes the development and analysis of an AR UI system for a UGV with an attached manipulator arm. The system supplements a video feed shown to an operator with information about geometric relationships within the robot task space to improve the operator's situational awareness. Previous studies on AR systems and preliminary analyses indicate that such an implementation of AR for a mobile robot with a manipulator arm is anticipated to improve operator performance. A full user study can determine if this hypothesis is supported by performing an analysis of variance on common test metrics associated with UGV teleoperation.
Learning Science Using AR Book: A Preliminary Study on Visual Needs of Deaf Learners
NASA Astrophysics Data System (ADS)
Megat Mohd. Zainuddin, Norziha; Badioze Zaman, Halimah; Ahmad, Azlina
Augmented Reality (AR) is a technology that is projected to have an increasingly significant role in teaching and learning, particularly in visualising abstract concepts in the learning process. AR is a visually oriented technology, and it is therefore suitable for deaf learners since they are generally classified as visual learners. Realising the importance of a visual learning style for deaf learners in learning Science, this paper reports on a preliminary study, part of ongoing research, on the problems faced by deaf learners in learning the topic of Microorganisms. Being visual learners, they have problems with current text books that are more text-based than graphic-based. In this preliminary study, a qualitative approach using an ethnographic observational technique was used to allow interaction with the three deaf learners who are participants throughout this study (they are also involved actively in the design and development of the AR Book). Interviews with their teacher and doctor were also conducted to identify their learning and medical problems respectively. Preliminary findings have confirmed the need to design and develop a special Augmented Reality Book called AR-Science for Deaf Learners (AR-SiD).
Transduction between worlds: using virtual and mixed reality for earth and planetary science
NASA Astrophysics Data System (ADS)
Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.
2017-12-01
Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.
Investigating Student Attitudes toward Augmented Reality
ERIC Educational Resources Information Center
Sirakaya, Mustafa; Kiliç Çakmak, Ebru
2018-01-01
This study aimed at identifying the attitudes of secondary school students toward AR applications and to investigate the change in these attitudes according to different variables. The study also aspired to determine the relationship between attitudes toward AR and achievement. The general survey model was used in the study. The study group was…
A Context-Aware Ubiquitous Learning Environment for Language Listening and Speaking
ERIC Educational Resources Information Center
Liu, T.-Y.
2009-01-01
This paper reported the results of a study that aimed to construct a sensor and handheld augmented reality (AR)-supported ubiquitous learning (u-learning) environment called the Handheld English Language Learning Organization (HELLO), which is geared towards enhancing students' language learning. The HELLO integrates sensors, AR, ubiquitous…
Analysis of Mental Workload in Online Shopping: Are Augmented and Virtual Reality Consistent?
Zhao, Xiaojun; Shi, Changxiu; You, Xuqun; Zong, Chenming
2017-01-01
A market research company (Nielsen) reported that consumers in the Asia-Pacific region have become the most active group in online shopping. Focusing on augmented reality (AR), which is one of three major techniques used to change the method of shopping in the future, this study used a mixed design to discuss the influences of the method of online shopping, user gender, cognitive style, product value, and sensory channel on mental workload in virtual reality (VR) and AR situations. The results showed that males' mental workloads were significantly higher than females'. For males, high-value products' mental workload was significantly higher than that of low-value products. In the VR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference was reduced under audio-visual conditions. In the AR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference increased under audio-visual conditions. This study provides a psychological examination of online shopping with AR and VR technology and its future applications. Based on the perspective of embodied cognition, AR online shopping may be a potential focus of research and market application. For the future design of online shopping platforms and the updating of user experience, this study provides a reference.
The effectiveness of virtual and augmented reality in health sciences and medical anatomy.
Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan
2017-11-01
Although cadavers constitute the gold standard for teaching anatomy to medical and health science students, there are substantial financial, ethical, and supervisory constraints on their use. In addition, although anatomy remains one of the fundamental areas of medical education, universities have decreased the hours allocated to teaching gross anatomy in favor of applied clinical work. The release of virtual (VR) and augmented reality (AR) devices allows learning to occur through hands-on immersive experiences. The aim of this research was to assess whether learning structural anatomy utilizing VR or AR is as effective as tablet-based (TB) applications, and whether these modes allowed enhanced student learning, engagement and performance. Participants (n = 59) were randomly allocated to one of the three learning modes: VR, AR, or TB and completed a lesson on skull anatomy, after which they completed an anatomical knowledge assessment. Student perceptions of each learning mode and any adverse effects experienced were recorded. No significant differences were found between mean assessment scores in VR, AR, or TB. During the lessons however, VR participants were more likely to exhibit adverse effects such as headaches (25% in VR P < 0.05), dizziness (40% in VR, P < 0.001), or blurred vision (35% in VR, P < 0.01). Both VR and AR are as valuable for teaching anatomy as tablet devices, but also promote intrinsic benefits such as increased learner immersion and engagement. These outcomes show great promise for the effective use of virtual and augmented reality as means to supplement lesson content in anatomical education. Anat Sci Educ 10: 549-559. © 2017 American Association of Anatomists.
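For readers wanting to mirror the between-group comparison of assessment scores across the VR, AR and TB conditions, a one-way ANOVA is one plausible analysis and is sketched below with placeholder data; the abstract does not state which test the authors used, so this is only an illustrative assumption.

```python
# Hedged sketch: compare assessment scores across three learning modes.
from scipy import stats

# Placeholder per-participant scores, not the study's data.
vr_scores = [14, 15, 13, 16, 14]
ar_scores = [15, 14, 16, 13, 15]
tb_scores = [14, 16, 15, 14, 13]

f_stat, p_value = stats.f_oneway(vr_scores, ar_scores, tb_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p >= 0.05 would indicate no significant difference
```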
Holographic Rovers: Augmented Reality and the Microsoft HoloLens
NASA Technical Reports Server (NTRS)
Toler, Laura
2017-01-01
Augmented Reality is an emerging field in technology, and encompasses Head Mounted Displays, smartphone apps, and even projected images. HMDs include the Meta 2, Magic Leap, Avegant Light Field, and the Microsoft HoloLens, which is evaluated specifically. The Microsoft HoloLens is designed to be used as an AR personal computer, and is being optimized with that goal in mind. Microsoft allied with the Unity3D game engine to create an SDK for interested application developers that can be used in the Unity environment.
Cutolo, Fabrizio; Meola, Antonio; Carbone, Marina; Sinceri, Sara; Cagnazzo, Federico; Denaro, Ennio; Esposito, Nicola; Ferrari, Mauro; Ferrari, Vincenzo
2017-12-01
Benefits of minimally invasive neurosurgery mandate the development of ergonomic paradigms for neuronavigation. Augmented Reality (AR) systems can overcome the shortcomings of commercial neuronavigators. The aim of this work is to apply a novel AR system, based on a head-mounted stereoscopic video see-through display, as an aid in complex neurological lesion targeting. Effectiveness was investigated on a newly designed patient-specific head mannequin featuring an anatomically realistic brain phantom with embedded synthetically created tumors and eloquent areas. A two-phase evaluation process was adopted in a simulated small tumor resection adjacent to Broca's area. Phase I involved nine subjects without neurosurgical training in performing spatial judgment tasks. In Phase II, three surgeons were involved in assessing the effectiveness of the AR-neuronavigator in performing brain tumor targeting on a patient-specific head phantom. Phase I revealed the ability of the AR scene to evoke depth perception under different visualization modalities. Phase II confirmed the potentialities of the AR-neuronavigator in aiding the determination of the optimal surgical access to the surgical target. The AR-neuronavigator is intuitive, easy-to-use, and provides three-dimensional augmented information in a perceptually-correct way. The system proved to be effective in guiding skin incision, craniotomy, and lesion targeting. The preliminary results encourage a structured study to prove clinical effectiveness. Moreover, our testing platform might be used to facilitate training in brain tumour resection procedures.
Realistic Real-Time Outdoor Rendering in Augmented Reality
Kolivand, Hoshang; Sunar, Mohd Shahrizal
2014-01-01
Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. Firstly, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480
[VR and AR Applications in Medical Practice and Education].
Hsieh, Min-Chai; Lin, Yu-Hsuan
2017-12-01
As technology advances, mobile devices have gradually turned into wearable devices. Furthermore, virtual reality (VR), augmented reality (AR), and mixed reality (MR) are being increasingly applied in medical fields such as medical education and training, surgical simulation, neurological rehabilitation, psychotherapy, and telemedicine. Research results demonstrate the ability of VR, AR, and MR to ameliorate the inconveniences that are often associated with traditional medical care, reduce incidents of medical malpractice caused by unskilled operations, and reduce the cost of medical education and training. What is more, the application of these technologies has enhanced the effectiveness of medical education and training, raised the level of diagnosis and treatment, improved the doctor-patient relationship, and boosted the efficiency of medical execution. The present study introduces VR, AR, and MR applications in medical practice and education with the aim of helping health professionals better understand the applications and use these technologies to improve the quality of medical care.
Performance Of The IEEE 802.15.4 Protocol As The Marker Of Augmented Reality In Museum
NASA Astrophysics Data System (ADS)
Kurniawan Saputro, Adi; Sumpeno, Surya; Hariadi, Mochamad
2018-04-01
A museum is a place that keeps historic objects and serves as a historical education center to introduce the nation's culture. Utilizing technology to turn a museum into part of a smart city is a challenge. The Internet of Things (IoT) is a technological advance in information and communication technology (ICT) that can be applied in the museum. Current ICT developments are not limited to transmission media; Augmented Reality technology is also being developed. Currently, Augmented Reality technology creates virtual objects in the real world using markers or images. In this study, the researchers used signals to make virtual objects appear in the real world, employing the IEEE 802.15.4 protocol in place of the Augmented Reality marker. RSSI and triangulation are used to provide microlocation for the AR objects as a substitute for markers. The results show that the performance of the wireless sensor network is suitable for data transmission in the museum. LOS measurements at a distance of 15 meters with 1000 ms delay found a 1.4% error rate, and NLOS measurements found a 2.3% error rate. It can therefore be concluded that IoT technology, using wireless sensor network signals as a replacement for augmented reality markers, can be used in the museum.
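The marker-free idea above rests on converting RSSI readings from fixed IEEE 802.15.4 nodes into distances and then triangulating the visitor's position. The sketch below shows one standard way to do this with a log-distance path-loss model and least-squares trilateration; the transmit power, path-loss exponent and readings are assumed values, not the authors' calibration.

```python
# Hedged sketch: RSSI-based ranging and 2D trilateration for microlocation.
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power=-45.0, n=2.5):
    """Log-distance path-loss model: estimated range in metres from an RSSI reading."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * n))

def trilaterate(anchors, distances):
    """Least-squares 2D position from >= 3 anchor positions and ranges."""
    (x1, y1), d1 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        # Linearized circle equations, each referenced to the first anchor
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos  # (x, y) used to decide which exhibit's AR object to display

anchors = [(0, 0), (10, 0), (0, 10)]              # fixed 802.15.4 node positions (assumed)
rssi = [-60, -70, -72]                            # hypothetical readings in dBm
print(trilaterate(anchors, [rssi_to_distance(r) for r in rssi]))
```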
A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note.
Abe, Yuichiro; Sato, Shigenobu; Kato, Koji; Hyakumachi, Takahiko; Yanagibashi, Yasushi; Ito, Manabu; Abumi, Kuniyoshi
2013-10-01
Augmented reality (AR) is an imaging technology by which virtual objects are overlaid onto images of real objects captured in real time by a tracking camera. This study aimed to introduce a novel AR guidance system called virtual protractor with augmented reality (VIPAR) to visualize a needle trajectory in 3D space during percutaneous vertebroplasty (PVP). The AR system used for this study comprised a head-mounted display (HMD) with a tracking camera and a marker sheet. An augmented scene was created by overlaying the preoperatively generated needle trajectory path onto a marker detected on the patient using AR software, thereby providing the surgeon with augmented views in real time through the HMD. The accuracy of the system was evaluated by using a computer-generated simulation model in a spine phantom and also evaluated clinically in 5 patients. In the 40 spine phantom trials, the error of the insertion angle (EIA), defined as the difference between the attempted angle and the insertion angle, was evaluated using 3D CT scanning. Computed tomography analysis of the 40 spine phantom trials showed that the EIA in the axial plane significantly improved when VIPAR was used compared with when it was not used (0.96° ± 0.61° vs 4.34° ± 2.36°, respectively). The same held true for EIA in the sagittal plane (0.61° ± 0.70° vs 2.55° ± 1.93°, respectively). In the clinical evaluation of the AR system, 5 patients with osteoporotic vertebral fractures underwent VIPAR-guided PVP from October 2011 to May 2012. The postoperative EIA was evaluated using CT. The clinical results of the 5 patients showed that the EIA in all 10 needle insertions was 2.09° ± 1.3° in the axial plane and 1.98° ± 1.8° in the sagittal plane. There was no pedicle breach or leakage of polymethylmethacrylate. VIPAR was successfully used to assist in needle insertion during PVP by providing the surgeon with an ideal insertion point and needle trajectory through the HMD. The findings indicate that AR guidance technology can become a useful assistive device during spine surgeries requiring percutaneous procedures.
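The error of insertion angle (EIA) defined above is simply the difference between the attempted and achieved angles, and the phantom comparison amounts to testing two groups of such errors. The sketch below illustrates that computation with placeholder angles and Welch's t-test; the paper does not specify its exact statistical test, so this is only an assumed analysis.

```python
# Hedged sketch: compute per-trial EIA and compare guided vs. freehand groups.
import numpy as np
from scipy import stats

attempted = np.array([20.0, 22.0, 18.0, 25.0])      # planned axial angles (deg), placeholder
inserted_guided = np.array([20.8, 21.4, 18.5, 25.9])  # with AR guidance, placeholder
inserted_free = np.array([24.1, 18.0, 22.9, 29.8])    # without guidance, placeholder

eia_guided = np.abs(attempted - inserted_guided)
eia_free = np.abs(attempted - inserted_free)

t, p = stats.ttest_ind(eia_guided, eia_free, equal_var=False)  # Welch's t-test (assumed choice)
print(eia_guided.mean(), eia_free.mean(), p)
```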
Pokémon Go: An Unexpected Inspiration for Next Generation Learning Environments
ERIC Educational Resources Information Center
Nigaglioni, Irene
2017-01-01
Although mobile applications and games often seem isolating and somewhat stationary, last year's augmented reality (AR) gaming craze Pokémon Go demonstrated how technology has the potential to promote socialization, collaboration, and physical activity while still engaging users. Pokémon Go's use of AR technology, which superimposes…
Flow Experience and Educational Effectiveness of Teaching Informatics Using AR
ERIC Educational Resources Information Center
Giasiranis, Stefanos; Sofos, Loizos
2017-01-01
The purpose of this study was the investigation of the added value of technology of augmented reality (AR) in education and, particularly, whether this contributes to both student performance improvement, as well as the appearance of the psychological condition of Flow, which according to research, has had a positive effect on their performance…
Augmented reality-based electrode guidance system for reliable electroencephalography.
Song, Chanho; Jeon, Sangseo; Lee, Seongpung; Ha, Ho-Gun; Kim, Jonghyun; Hong, Jaesung
2018-05-24
In longitudinal electroencephalography (EEG) studies, repeatable electrode positioning is essential for reliable EEG assessment. Conventional methods use anatomical landmarks as fiducial locations for the electrode placement. Since the landmarks are manually identified, the EEG assessment is inevitably unreliable because of individual variations among the subjects and the examiners. To overcome this unreliability, an augmented reality (AR) visualization-based electrode guidance system was proposed. The proposed electrode guidance system is based on AR visualization to replace the manual electrode positioning. After scanning and registration of the facial surface of a subject by an RGB-D camera, an AR view of the initial electrode positions, serving as reference positions, is overlaid on the current electrode positions in real time. Thus, it can guide the position of the subsequently placed electrodes with high repeatability. The experimental results with the phantom show that the repeatability of the electrode positioning was improved compared to that of the conventional 10-20 positioning system. The proposed AR guidance system improves the electrode positioning performance with a cost-effective setup, which uses only an RGB-D camera. This system can be used as an alternative to the international 10-20 system.
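A key step implied above is registering the stored reference face scan to the newly acquired RGB-D scan so that the saved electrode positions can be re-projected into today's view. As a hedged sketch, the snippet below does this with Open3D's point-to-point ICP; the file names, correspondence threshold and electrode coordinates are placeholders, and the authors' actual registration method may differ.

```python
# Hedged sketch: surface registration of two face scans, then re-projection of
# stored electrode coordinates into the new session's frame.
import numpy as np
import open3d as o3d

reference = o3d.io.read_point_cloud("face_reference.ply")  # first-session scan (hypothetical file)
current = o3d.io.read_point_cloud("face_current.ply")      # today's scan (hypothetical file)

result = o3d.pipelines.registration.registration_icp(
    reference, current, 0.01, np.eye(4),                   # 1 cm correspondence threshold (assumed)
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

# Transform the stored electrode coordinates (homogeneous) into today's frame
electrodes_ref = np.array([[0.02, 0.05, 0.10, 1.0]])        # placeholder electrode position
electrodes_now = (result.transformation @ electrodes_ref.T).T
print(result.fitness, electrodes_now[:, :3])
```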
NASA Astrophysics Data System (ADS)
Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong
2015-03-01
Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field because conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using the augmented reality (AR) technique. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.
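To illustrate what the database module might look like at its simplest, the sketch below defines a borehole table in SQLite (Python's built-in driver) and a bounding-box query for boreholes near the user, which the AR and map modules could then display. The schema and column names are assumptions for illustration, not the standard database designed by the authors.

```python
# Hedged sketch of a minimal borehole store with a "nearby boreholes" query.
import sqlite3

con = sqlite3.connect("boreholes.db")
con.execute("""CREATE TABLE IF NOT EXISTS borehole (
                   id INTEGER PRIMARY KEY,
                   name TEXT, lat REAL, lon REAL,
                   elevation_m REAL, depth_m REAL, log_xml TEXT)""")

def nearby(lat, lon, half_deg=0.01):
    """Boreholes inside a small lat/lon window around the user's position."""
    cur = con.execute(
        "SELECT id, name, lat, lon, depth_m FROM borehole "
        "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (lat - half_deg, lat + half_deg, lon - half_deg, lon + half_deg))
    return cur.fetchall()
```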
Hand gesture guided robot-assisted surgery based on a direct augmented reality interface.
Wen, Rong; Tay, Wei-Liang; Nguyen, Binh P; Chng, Chin-Boon; Chui, Chee-Kong
2014-09-01
Radiofrequency (RF) ablation is a good alternative to hepatic resection for treatment of liver tumors. However, accurate needle insertion requires precise hand-eye coordination and is also affected by the difficulty of RF needle navigation. This paper proposes a cooperative surgical robot system, guided by hand gestures and supported by an augmented reality (AR)-based surgical field, for robot-assisted percutaneous treatment. It establishes a robot-assisted natural AR guidance mechanism that incorporates the advantages of the following three aspects: AR visual guidance information, surgeon's experiences and accuracy of robotic surgery. A projector-based AR environment is directly overlaid on a patient to display preoperative and intraoperative information, while a mobile surgical robot system implements specified RF needle insertion plans. Natural hand gestures are used as an intuitive and robust method to interact with both the AR system and surgical robot. The proposed system was evaluated on a mannequin model. Experimental results demonstrated that hand gesture guidance was able to effectively guide the surgical robot, and the robot-assisted implementation was found to improve the accuracy of needle insertion. This human-robot cooperative mechanism is a promising approach for precise transcutaneous ablation therapy. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-02-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative simulation, but has different strengths and limitations than MUVEs. Within a design-based research project, the researchers conducted multiple qualitative case studies across two middle schools (6th and 7th grade) and one high school (10th grade) in the northeastern United States to document the affordances and limitations of AR simulations from the student and teacher perspective. The researchers collected data through formal and informal interviews, direct observations, web site posts, and site documents. Teachers and students reported that the technology-mediated narrative and the interactive, situated, collaborative problem solving affordances of the AR simulation were highly engaging, especially among students who had previously presented behavioral and academic challenges for the teachers. However, while the AR simulation provided potentially transformative added value, it simultaneously presented unique technological, managerial, and cognitive challenges to teaching and learning.
Duan, Liya; Guan, Tao; Yang, Bo
2009-01-01
Augmented reality (AR) is a field of computer research which deals with the combination of real-world and computer-generated data. Registration is one of the most difficult problems currently limiting the usability of AR systems. In this paper, we propose a novel natural-feature-tracking-based registration method for AR applications. The proposed method has the following advantages: (1) it is simple and efficient, as no man-made markers are needed for either indoor or outdoor AR applications; moreover, it can work with arbitrary geometric shapes, including planar, near-planar and non-planar structures, which greatly enhances the usability of AR systems. (2) Thanks to the reduced-SIFT-based augmented optical flow tracker, the virtual scene can still be augmented on the specified areas even under occlusion and large changes in viewpoint during the entire process. (3) It is easy to use, because the adaptive classification tree based matching strategy gives fast and accurate initialization, even when the initial camera view differs from the reference image to a large degree. Experimental evaluations validate the performance of the proposed method for online pose tracking and augmentation.
Khor, Wee Sim; Baker, Benjamin; Amin, Kavit; Chan, Adrian; Patel, Ketan; Wong, Jason
2016-12-01
The continuing enhancement of the surgical environment in the digital age has led to a number of innovations being highlighted as potential disruptive technologies in the surgical workplace. Augmented reality (AR) and virtual reality (VR) are rapidly becoming increasingly available, accessible and, importantly, affordable, hence their application in healthcare to enhance the medical use of data is certain. Whether it relates to anatomy, intraoperative surgery, or post-operative rehabilitation, applications are already being investigated for their role in the surgeon's armamentarium. Here we provide an introduction to the technology and the potential areas of development in the surgical arena.
Using augmented reality to teach and learn biochemistry.
Vega Garzón, Juan Carlos; Magrini, Marcio Luiz; Galembeck, Eduardo
2017-09-01
Understanding metabolism and metabolic pathways constitutes one of the central aims for students of the biological sciences. Learning metabolic pathways should be focused on the understanding of general concepts and core principles. New technologies such as Augmented Reality (AR) have shown potential to improve assimilation of abstract biochemistry concepts because students can manipulate 3D molecules in real time. Here we describe an application named Augmented Reality Metabolic Pathways (ARMET), which allows students to visualize the 3D molecular structure of substrates and products, thus perceiving the changes in each molecule. The structural modification of molecules shows students the flow and exchange of compounds and energy through metabolism. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(5):417-420, 2017.
Augmented reality in a tumor resection model.
Chauvet, Pauline; Collins, Toby; Debize, Clement; Novais-Gameiro, Lorraine; Pereira, Bruno; Bartoli, Adrien; Canis, Michel; Bourdel, Nicolas
2018-03-01
Augmented Reality (AR) guidance is a technology that allows a surgeon to see sub-surface structures by overlaying pre-operative imaging data on a live laparoscopic video. Our objectives were to evaluate a state-of-the-art AR guidance system in a tumor surgical resection model, comparing the accuracy of the resection with and without the system. Our system has three phases. Phase 1: using the MRI images, the kidney's and pseudotumor's surfaces are segmented to construct a 3D model. Phase 2: the intra-operative 3D model of the kidney is computed. Phase 3: the pre-operative and intra-operative models are registered, and the laparoscopic view is augmented with the pre-operative data. We performed a prospective experimental study on ex vivo porcine kidneys. Alginate was injected into the parenchyma to create pseudotumors measuring 4-10 mm. The kidneys were then analyzed by MRI. Next, the kidneys were placed into pelvic trainers, and the pseudotumors were laparoscopically resected. The AR guidance system allows the surgeon to see tumors and margins using classical laparoscopic instruments and a classical screen. The resection margins were measured microscopically to evaluate the accuracy of resection. Ninety tumors were segmented: 28 were used to optimize the AR software, and 62 were used to randomly compare surgical resection: 29 tumors were resected using AR and 33 without AR. The analysis of our pathological results showed 4 failures (tumors with positive margins) (13.8%) in the AR group, and 10 (30.3%) in the non-AR group. There was no complete miss in the AR group, while there were 4 complete misses in the non-AR group. In total, 14 (42.4%) tumors were completely missed or had a positive margin in the non-AR group. Our AR system enhances the accuracy of surgical resection, particularly for small tumors. Crucial information such as resection margins and vascularization could also be displayed.
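The headline comparison (4 of 29 resections with positive margins in the AR group versus 10 of 33 in the non-AR group) is a 2x2 outcome table, and one plausible way to test it is Fisher's exact test, sketched below; the paper's actual statistical analysis is not reproduced here.

```python
# Hedged sketch: exact test on the margin-outcome contingency table.
from scipy import stats

#                positive margin, clear margin
table = [[4, 29 - 4],      # AR group
         [10, 33 - 10]]    # non-AR group
odds_ratio, p_value = stats.fisher_exact(table)
print(odds_ratio, p_value)
```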
Applied Operations Research: Augmented Reality in an Industrial Environment
NASA Technical Reports Server (NTRS)
Cole, Stuart K.
2015-01-01
Augmented reality is the application of computer-generated data or graphics onto a real-world view. Its use provides the operator additional information or heightened situational awareness. While advancements have been made in automation and diagnostics of high value critical equipment (HVCE) to improve readiness, reliability and maintenance, the need for assistance and support to Operations and Maintenance staff persists. AR can improve the human-machine interface, where computer capabilities maximize the human experience and analysis capabilities. NASA operates multiple facilities with complex ground-based HVCE in support of national aeronautics and space exploration, and the need exists to improve operational support and close a gap related to capability sustainment, where key and experienced staff consistently rotate work assignments and reach their expiration of term of service. The initiation of an AR capability to augment and improve human abilities and training experience in the industrial environment requires planning and the establishment of a goal and objectives for the systems and specific applications. This paper explored the use of AR in support of Operations staff in real-time operation of HVCE and its maintenance. The results include the identification of specific goals and objectives, and of challenges related to availability and computer system infrastructure.
Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin
2017-03-27
Augmented reality (AR), which uses computer-generated projected information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges regarding the optical system in an AR system: registration, vision correction, and readability under strong ambient light. Here, we solve these three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated in an optical see-through AR system. One of the LC lenses is used to electrically adjust the position of the projected virtual image, a process known as registration. The other LC lens, with a larger aperture and a polarization-independent characteristic, is in charge of vision correction, such as myopia and presbyopia. The linearity of the lens powers of the two LC lenses is also discussed. The readability of virtual images under strong ambient light is addressed by the electrically switchable transmittance of the LC attenuator, which originates from light scattering and light absorption. The concept demonstrated in this paper could be further extended to other electro-optical devices as long as the devices exhibit the capability of phase modulation and amplitude modulation.
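As a back-of-the-envelope illustration (not taken from the paper, and ignoring sign conventions and lens separation), thin lenses in close contact add approximately in power, so the registration shift and the vision correction can be budgeted separately in diopters:

```latex
% Illustrative numbers only: thin lenses in close contact add approximately
% in power (diopters), so registration and vision correction stack.
\[
P_{\text{total}} \approx P_{\text{HMD}} + P_{\text{reg}} + P_{\text{corr}},
\qquad
\Delta P_{\text{reg}} = \frac{1}{d_2} - \frac{1}{d_1}.
\]
% Example: pulling the virtual image from d_1 = 2 m in to d_2 = 0.5 m needs
% \Delta P_reg = 2 - 0.5 = 1.5 D; a -2 D myope needs P_corr of about -2 D.
```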
Application of real-time single camera SLAM technology for image-guided targeting in neurosurgery
NASA Astrophysics Data System (ADS)
Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng
2012-10-01
In this paper, we propose an application of augmented reality technology for targeting tumors or anatomical structures inside the skull. The application is a combination of the technologies of MonoSLAM (Single Camera Simultaneous Localization and Mapping) and computer graphics. A stereo vision system is developed to construct geometric data of the human face for registration with CT images. Reliability and accuracy of the application are enhanced by the use of fiduciary markers fixed to the skull. MonoSLAM keeps track of the current location of the camera with respect to an augmented reality (AR) marker using the extended Kalman filter. The fiduciary markers provide reference when the AR marker is invisible to the camera. The relationship between the markers on the face and the augmented reality marker is obtained through a registration procedure performed by the stereo vision system and is updated on-line. A commercially available Android-based tablet PC equipped with a 320×240 front-facing camera was used for implementation. The system is able to provide a live view of the patient overlaid by the solid models of tumors or anatomical structures, as well as the part of the tool hidden inside the skull.
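The abstract does not detail the MonoSLAM/EKF implementation; as a rough illustration of only the marker-based pose step, the sketch below estimates the camera pose relative to a square AR marker of known size from its four detected corners via PnP. The intrinsics, corner pixels and marker side length are hypothetical.

```python
import numpy as np
import cv2

# Known geometry of a square AR marker (side 50 mm), corners in marker coordinates.
side = 50.0
marker_3d = np.array([[0, 0, 0], [side, 0, 0],
                      [side, side, 0], [0, side, 0]], dtype=np.float32)

# Hypothetical intrinsics for a 320x240 camera; distortion neglected for the sketch.
K = np.array([[300.0, 0.0, 160.0],
              [0.0, 300.0, 120.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Four detected marker corners in the image, ordered as in marker_3d (made up here).
marker_2d = np.array([[150, 100], [210, 102],
                      [208, 162], [148, 160]], dtype=np.float32)

ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_2d, K, dist)
R, _ = cv2.Rodrigues(rvec)          # marker-to-camera rotation
print("camera translation (mm):", tvec.ravel())
```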
Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion
Fang, Wei; Zheng, Lianyu; Deng, Huanjun; Zhang, Hongbo
2017-01-01
In mobile augmented/virtual reality (AR/VR), real-time 6-Degree-of-Freedom (DoF) motion tracking is essential for the registration between virtual scenes and the real world. However, due to the limited computational capacity of today's mobile terminals, the latency between consecutive arriving poses damages the user experience in mobile AR/VR. Thus, a visual-inertial real-time motion tracking method for mobile AR/VR is proposed in this paper. By means of the high-frequency, passive outputs of the inertial sensor, real-time delivery of poses for mobile AR/VR is achieved. In addition, to alleviate the jitter phenomenon during visual-inertial fusion, an adaptive filter framework is established to cope with different motion situations automatically, enabling real-time 6-DoF motion tracking by balancing jitter and latency. Moreover, the robustness of traditional visual-only motion tracking is enhanced, giving rise to better mobile AR/VR performance when motion blur is encountered. Finally, experiments are carried out to demonstrate the proposed method, and the results show that this work is capable of providing smooth and robust 6-DoF motion tracking for mobile AR/VR in real time. PMID:28475145
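The paper's adaptive filter is not specified in the abstract; the sketch below is only a crude single-axis complementary filter with a motion-dependent gain, meant to illustrate the jitter-versus-latency trade-off of fusing a fast gyroscope with slower visual poses. The gain schedule and all numbers are my own assumptions.

```python
import numpy as np

def fuse_orientation(gyro_rate, visual_angle, dt, angle, base_gain=0.02):
    """One complementary-filter step for a single rotation axis.

    gyro_rate    : angular rate from the IMU (rad/s), high rate, low latency
    visual_angle : latest absolute angle from visual tracking (rad), may be stale
    angle        : current fused estimate (rad)
    The gain grows when the prediction and the visual angle disagree strongly
    (fast motion), accepting a little jitter in exchange for less drift/latency.
    """
    predicted = angle + gyro_rate * dt                       # dead-reckon with the gyro
    error = visual_angle - predicted
    gain = min(1.0, base_gain * (1.0 + 10.0 * abs(error)))   # crude adaptive gain
    return predicted + gain * error

# Hypothetical loop: 100 Hz gyro samples, visual angle updated less often and noisy.
angle = 0.0
for k in range(100):
    gyro = 0.5 + np.random.randn() * 0.01                    # rad/s
    visual = 0.5 * (k * 0.01) + np.random.randn() * 0.002    # rad
    angle = fuse_orientation(gyro, visual, dt=0.01, angle=angle)
print(round(angle, 3))
```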
Augmented reality for personalized nanomedicines.
Lee, Yugyung; Lee, Chi H
As our understanding of the onset and progression of diseases at the genetic and molecular level rapidly progresses, the potential of advanced technologies, such as 3D printing, Socially-Assistive Robots (SARs) or augmented reality (AR), applied to personalized nanomedicines (PNMs) to alleviate pathological conditions, has become more prominent. Among these advanced technologies, AR in particular has the greatest potential to address the associated challenges and facilitate the translation of PNMs into formidable clinical applications of personalized therapy. As AR is about to adopt additional new methods, such as speech and voice recognition, eye tracking and motion tracking, to enable interaction with host responses or biological systems in 3D space, a combination of multiple approaches to accommodate varying environmental conditions, such as public noise and ambient brightness, will be explored to improve its therapeutic outcomes in clinical applications. For instance, AR glasses currently under development by Facebook and Microsoft will serve as a new platform that can provide people with the health information they are interested in, or various measures through which they can interact with medical services. This review addresses the current progress and impact of AR on PNMs and its application to the biomedical field. Special emphasis is placed on the application of AR-based PNMs to treatment strategies for senior care, drug addiction and medication adherence. Published by Elsevier Inc.
A brief review of augmented reality science learning
NASA Astrophysics Data System (ADS)
Gopalan, Valarmathie; Bakar, Juliana Aida Abu; Zulkifli, Abdul Nasir
2017-10-01
This paper reviews the literature concerning theories and models that could be applied to science motivation for upper secondary school learners (16-17 years old), in order to make the learning experience more engaging and useful. The embedment of AR in science could bring an awe-inspiring transformation in learners' viewpoints towards the respective subject matter. Augmented Reality is able to present real and virtual learning experiences together, with the addition of multiple media, without replacing the real environment. Due to this unique feature, AR has attracted considerable attention from researchers seeking to implement it in science learning. This impressive technology offers learners rich visualization and provides an engaging and transparent learning experience by bringing to light unseen perspectives of the learning content. This paper will be of interest to researchers in the related field as well as academicians in the related disciplines. It aims to propose related theoretical guidance that could be applied to science motivation to transform learning in an effective way.
A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.
Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis
2018-03-01
Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
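The skylight model itself is not reproduced in the abstract; the sketch below only illustrates the Sun-position part of such a pipeline (elevation and azimuth from latitude, day of year and local solar time) using a standard low-accuracy approximation. This is my own stand-in, not the authors' implementation, which additionally fuses gyroscope, compass and accelerometer data.

```python
import math

def sun_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth (degrees, azimuth from north,
    eastward positive). Low-accuracy declination formula; enough to show how
    location and time of day drive outdoor AR shadow direction."""
    lat = math.radians(lat_deg)
    decl = math.radians(-23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))

    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elev = math.asin(sin_elev)

    az = math.atan2(-math.sin(hour_angle),
                    math.tan(decl) * math.cos(lat) - math.sin(lat) * math.cos(hour_angle))
    return math.degrees(elev), math.degrees(az) % 360.0

# Hypothetical example: latitude 41.5 N, mid-June, 3 pm solar time.
print(sun_position(41.5, 170, 15.0))   # roughly 48 deg elevation, west-southwest
```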
Augmented reality visualization of deformable tubular structures for surgical simulation.
Ferrari, Vincenzo; Viglialoro, Rosanna Maria; Nicoli, Paola; Cutolo, Fabrizio; Condino, Sara; Carbone, Marina; Siesto, Mentore; Ferrari, Mauro
2016-06-01
Surgical simulation based on augmented reality (AR), mixing the benefits of physical and virtual simulation, represents a step forward in surgical training. However, available systems are unable to update the virtual anatomy following deformations impressed on actual anatomy. A proof-of-concept solution is described providing AR visualization of hidden deformable tubular structures using nitinol tubes sensorized with electromagnetic sensors. This system was tested in vitro on a setup comprised of sensorized cystic, left and right hepatic, and proper hepatic arteries. In the trial session, the surgeon deformed the tubular structures with surgical forceps in 10 positions. The mean, standard deviation, and maximum misalignment between virtual and real arteries were 0.35, 0.22, and 0.99 mm, respectively. The alignment accuracy obtained demonstrates the feasibility of the approach, which can be adopted in advanced AR simulations, in particular as an aid to the identification and isolation of tubular structures. Copyright © 2015 John Wiley & Sons, Ltd.
Interacting with Visual Poems through AR-Based Digital Artwork
ERIC Educational Resources Information Center
Lin, Hao-Chiang Koong; Hsieh, Min-Chai; Liu, Eric Zhi-Feng; Chuang, Tsung-Yen
2012-01-01
In this study, an AR-based digital artwork called "Mind Log" was designed and evaluated. The augmented reality technique was employed to create digital artwork that would present interactive poems. A digital poem was generated via the interplay between a video film and a text-based poem. This artwork was created following a rigorous design flow,…
Location-Based Augmented Reality for Mobile Learning: Algorithm, System, and Implementation
ERIC Educational Resources Information Center
Tan, Qing; Chang, William; Kinshuk
2015-01-01
AR technology can be considered as mainly consisting of two aspects: identification of the real-world object and display of computer-generated digital content related to the identified real-world object. The technical challenge of mobile AR is to identify the real-world object that the mobile device's camera aims at. In this paper, we will present a…
NASA Astrophysics Data System (ADS)
Jenkins, H. S.; Gant, R.; Hopkins, D.
2014-12-01
Teaching natural science in a technologically advancing world requires that our methods reach beyond the traditional computer interface. Innovative 3D visualization techniques and real-time augmented user interfaces enable students to create realistic environments to understand the world around them. Here, we present a series of laboratory activities that utilize an Augmented Reality Sandbox to teach basic concepts of hydrology, geology, and geography to undergraduates at Harvard University and the University of Redlands. The Augmented Reality (AR) Sandbox utilizes a real sandbox that is overlain by a digital projection of topography and a color elevation map. A Microsoft Kinect 3D camera feeds altimetry data into a software program that maps this information onto the sand surface using a digital projector. Students can then manipulate the sand and observe as the Sandbox augments their manipulations with projections of contour lines, an elevation color map, and a simulation of water. The idea for the AR Sandbox was conceived at MIT by the Tangible Media Group in 2002 and the simulation software used here was written and developed by Dr. Oliver Kreylos of the University of California - Davis as part of the NSF funded LakeViz3D project. Between 2013 and 2014, we installed AR Sandboxes at Harvard and the University of Redlands and developed laboratory exercises to teach flooding hazard, erosion and watershed development in undergraduate earth and environmental science courses. In 2013, we introduced a series of AR Sandbox laboratories in Introductory Geology, Hydrology, and Natural Disasters courses. We found that laboratories utilizing the AR Sandbox at both universities allowed students to become quickly immersed in the learning process, enabling a more intuitive understanding of the processes that govern the natural world. The physical interface of the AR Sandbox reduces barriers to learning, can be used to rapidly illustrate basic concepts of geology, geography and hydrology, and enables our undergraduate students to understand topography intuitively. We therefore find the AR Sandbox to be a novel teaching tool and an effective demonstration of the capabilities of 3D visualization and real-time augmented user interfaces that enable students to better understand environmental processes.
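The actual AR Sandbox software (by O. Kreylos) runs in real time on the GPU; the toy sketch below, with a synthetic array standing in for a Kinect depth frame, only illustrates the basic idea of turning a depth map into the projected color-elevation map, contour lines and a flood level.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for a Kinect depth frame over the sandbox (mm from sensor).
x, y = np.meshgrid(np.linspace(-1, 1, 320), np.linspace(-1, 1, 240))
depth = (1000
         - 120 * np.exp(-((x + 0.3) ** 2 + y ** 2) * 6)
         - 80 * np.exp(-((x - 0.5) ** 2 + (y - 0.4) ** 2) * 10))

elevation = depth.max() - depth            # shallower sand = higher terrain
water_level = 30.0                         # hypothetical flood level (mm)

plt.imshow(elevation, cmap="terrain", origin="lower")          # color elevation map
plt.contour(elevation, levels=10, colors="k", linewidths=0.5)  # contour lines
plt.contourf(elevation, levels=[-1, water_level],              # "water" in the lows
             colors=["royalblue"], alpha=0.5)
plt.title("Projected overlay: elevation colors, contours, simulated water")
plt.show()
```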
Chen, Xiaojun; Xu, Lu; Wang, Yiping; Wang, Huixiang; Wang, Fang; Zeng, Xiangsen; Wang, Qiugen; Egger, Jan
2015-06-01
The surgical navigation system has experienced tremendous development over the past decades, minimizing the risks and improving the precision of surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In an AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) is developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of surgery. With this system, after the calibration of instruments, registration, and the calibration of the HMD, the 3D virtual critical anatomical structures shown in the head-mounted display are aligned with the actual structures of the patient in the real-world scenario during the intra-operative motion tracking process. The accuracy verification experiment demonstrated that the mean distance and angular errors were respectively 0.809±0.05 mm and 1.038°±0.05°, which is sufficient to meet clinical requirements. Copyright © 2015 Elsevier Inc. All rights reserved.
An Augmented Lecture Feedback System to Support Learner and Teacher Communication
ERIC Educational Resources Information Center
Zarraonandia, Telmo; Aedo, Ignacio; Diaz, Paloma; Montero, Alvaro
2013-01-01
In this paper, it is advocated that the feedback loop between learners and teachers could be improved by making use of augmented reality (AR) techniques. The bidirectional communication between teacher and learners is sometimes hampered by students' fear of showing themselves up in front of their classmates. In order to overcome this problem, a…
Marker Registration Technique for Handwritten Text Marker in Augmented Reality Applications
NASA Astrophysics Data System (ADS)
Thanaborvornwiwat, N.; Patanukhom, K.
2018-04-01
Marker registration is a fundamental process for estimating camera poses in marker-based Augmented Reality (AR) systems. We developed an AR system that creates corresponding virtual objects on handwritten text markers. This paper presents a new method for registration that is robust to low-content text markers, variation of camera poses, and variation of handwriting styles. The proposed method uses Maximally Stable Extremal Regions (MSER) and polygon simplification for feature point extraction. The experiment shows that we need to extract only five feature points per image, which provides the best registration results. An exhaustive search is used to find the best matching pattern of the feature points in two images. We also compared the performance of the proposed method to some existing registration methods and found that the proposed method provides better accuracy and time efficiency.
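The paper's exact feature selection and exhaustive matching are not described in the abstract; the sketch below only shows the general MSER-plus-polygon-simplification idea with OpenCV, reducing the largest detected region to a handful of candidate points. The file name, region choice and point count are assumptions for illustration.

```python
import cv2
import numpy as np

def marker_feature_points(gray, n_points=5):
    """Detect MSER regions on a handwritten text marker and reduce the largest
    region's outline to a few candidate feature points via polygon simplification."""
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)
    if not regions:
        return np.empty((0, 2), dtype=np.float32)
    largest = max(regions, key=len)                      # (N, 2) pixel coordinates
    hull = cv2.convexHull(largest.reshape(-1, 1, 2))
    eps = 0.01 * cv2.arcLength(hull, True)
    poly = cv2.approxPolyDP(hull, eps, True).reshape(-1, 2)
    return poly[:n_points].astype(np.float32)            # keep at most n_points vertices

# Hypothetical usage on a camera frame containing the handwritten marker.
frame = cv2.imread("marker_frame.png", cv2.IMREAD_GRAYSCALE)
if frame is not None:
    print("feature points:\n", marker_feature_points(frame))
```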
Santana-Fernández, Javier; Gómez-Gil, Jaime; del-Pozo-San-Cirilo, Laura
2010-01-01
Current commercial tractor guidance systems present to the driver information to perform agricultural tasks in the best way. This information generally includes a treated-zones map referenced to the tractor's position. Unlike current guidance systems, where the tractor driver must mentally associate treated-zone maps and the plot layout, this paper presents a guidance system that, using Augmented Reality (AR) technology, allows the tractor driver to see the real plot through eye monitor glasses with the treated zones shown in a different color. The paper includes a description of the system hardware and software, a real test with image captures of what the tractor driver sees, and a discussion predicting that the historical evolution of guidance systems could involve the use of AR technology in agricultural guidance and monitoring systems. PMID:22163479
Evaluating the use of augmented reality to support undergraduate student learning in geomorphology
NASA Astrophysics Data System (ADS)
Ockelford, A.; Bullard, J. E.; Burton, E.; Hackney, C. R.
2016-12-01
Augmented Reality (AR) supports the understanding of complex phenomena by providing unique visual and interactive experiences that combine real and virtual information and help communicate abstract problems to learners. With AR, designers can superimpose virtual graphics over real objects, allowing users to interact with digital content through physical manipulation. One of the most significant pedagogic features of AR is that it provides an essentially student-centred and flexible space in which students can learn. By actively engaging participants using a design-thinking approach, this technology has the potential to provide a more productive and engaging learning environment than real or virtual learning environments alone. AR is increasingly being used in support of undergraduate learning and public engagement activities across engineering, medical and humanities disciplines, but it is not widely used across the geosciences disciplines despite the obvious applicability. This paper presents preliminary results from a multi-institutional project which seeks to evaluate the benefits and challenges of using an augmented reality sandbox to support undergraduate learning in geomorphology. The sandbox enables users to create and visualise topography. As the sand is sculpted, contours are projected onto the miniature landscape. By hovering a hand over the box, users can make it 'rain' over the landscape, and the water 'flows' down into rivers and valleys. At undergraduate level, the sandbox is an ideal focus for problem-solving exercises, for example exploring how geomorphology controls hydrological processes, how such processes can be altered and the subsequent impacts of the changes for environmental risk. It is particularly valuable for students who favour a visual or kinesthetic learning style. Results presented in this paper discuss how the sandbox provides a complex interactive environment that encourages communication, collaboration and co-design.
Optical methods for enabling focus cues in head-mounted displays for virtual and augmented reality
NASA Astrophysics Data System (ADS)
Hua, Hong
2017-05-01
Developing head-mounted displays (HMDs) that offer uncompromised optical pathways to both digital and physical worlds without encumbrance and discomfort confronts many grand challenges, both from technological and human-factors perspectives. Among the many challenges, minimizing visual discomfort is one of the key obstacles. One of the key contributing factors to visual discomfort is the lack of the ability to render proper focus cues in HMDs to stimulate natural eye accommodation responses, which leads to the well-known accommodation-convergence cue discrepancy problem. In this paper, I will provide a summary of the various optical approaches toward enabling focus cues in HMDs for both virtual reality (VR) and augmented reality (AR).
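As a simple worked illustration (my own, not from the paper), the discrepancy can be expressed in diopters as the difference between the vergence demand of the virtual object and the fixed accommodation demand of the display optics:

```latex
% Fixed focal plane at d_f, virtual object rendered at vergence distance d_v:
% the accommodation-convergence mismatch in diopters is
\[
\Delta = \left| \frac{1}{d_v} - \frac{1}{d_f} \right|,
\qquad
d_f = 2\,\mathrm{m},\; d_v = 0.5\,\mathrm{m}
\;\Rightarrow\;
\Delta = |2 - 0.5| = 1.5\ \mathrm{D}.
\]
```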
Mass production of holographic transparent components for augmented and virtual reality applications
NASA Astrophysics Data System (ADS)
Russo, Juan Manuel; Dimov, Fedor; Padiyar, Joy; Coe-Sullivan, Seth
2017-06-01
Diffractive optics such as holographic optical elements (HOEs) can provide transparent and narrow-band components with arbitrary incident and diffracted angles for near-to-eye commercial electronic products for augmented reality (AR), virtual reality (VR), and smart glass applications. In this paper, we will summarize the operational parameters and general optical geometries relevant for near-to-eye displays, the holographic substrates available for these applications, and their performance characteristics and ease of manufacture. We will compare the holographic substrates available in terms of fabrication, manufacturability, and end-user performance characteristics. Luminit is currently establishing the manufacturing capacity to serve this market, and this paper will discuss the capabilities and limitations of this unique facility.
Augmented Reality Cues and Elderly Driver Hazard Perception
Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew
2013-01-01
Objective: Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background: Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods: Twenty elderly licensed drivers (mean = 73 years, SD = 5 years) with a range of cognitive abilities measured by a speed-of-processing (SOP) composite participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios, and the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining safe distance behind a lead vehicle. Results: AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair the ability to maintain safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion: AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037
Invisible marker based augmented reality system
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Park, Jong-Il
2005-07-01
Augmented reality (AR) has recently gained significant attention. Previous AR techniques usually need a fiducial marker with known geometry, or objects whose structure can be easily estimated, such as a cube. Placing a marker in the workspace of the user can be intrusive. To overcome this limitation, we present an AR system using invisible markers which are created/drawn with an infrared (IR) fluorescent pen. Two cameras are used: an IR camera and a visible camera, positioned on either side of a cold mirror so that their optical centers coincide with each other. We track the invisible markers using the IR camera and visualize AR in the view of the visible camera. Additional algorithms are employed so that the system performs reliably against cluttered backgrounds. Experimental results are given to demonstrate the viability of the proposed system. As an application of the proposed system, the invisible marker can act as a Vision-Based Identity and Geometry (VBIG) tag, which can significantly extend the functionality of RFID. The invisible tag is the same as RFID in that it is not perceivable, while more powerful in that the tag information can be presented to the user by direct projection using a mobile projector or by visualizing AR on the screen of a mobile PDA.
ServAR: An augmented reality tool to guide the serving of food.
Rollo, Megan E; Bucher, Tamara; Smith, Shamus P; Collins, Clare E
2017-05-12
Accurate estimation of food portion size is a difficult task. Visual cues are important mediators of portion size and therefore technology-based aids may assist consumers when serving and estimating food portions. The current study evaluated the usability and impact on estimation error of standard food servings of a novel augmented reality food serving aid, ServAR. Participants were randomised into one of three groups: 1) no information/aid (control); 2) verbal information on standard serving sizes; or 3) ServAR, an aid which overlaid virtual food servings over a plate using a tablet computer. Participants were asked to estimate the standard serving sizes of nine foods (broccoli, carrots, cauliflower, green beans, kidney beans, potato, pasta, rice, and sweetcorn) using validated food replicas. Wilcoxon signed-rank tests compared median served weights of each food to reference standard serving size weights. Percentage error was used to compare the estimation of serving size accuracy between the three groups. All participants also performed a usability test using the ServAR tool to guide the serving of one randomly selected food. Ninety adults (78.9% female; a mean (95% CI) age of 25.8 (24.9-26.7) years; BMI 24.2 (23.2-25.2) kg/m²) completed the study. The median servings were significantly different to the reference portions for five foods in the ServAR group, compared to eight foods in the information only group and seven foods for the control group. The cumulative proportion of total estimations per group within ±10%, ±25% and ±50% of the reference portion was greater for those using ServAR (30.7, 65.2 and 90.7%, respectively), compared to the information only group (19.6, 47.4 and 77.4%) and control group (10.0, 33.7 and 68.9%). Participants generally found the ServAR tool easy to use and agreed that it showed potential to support optimal portion size selection. However, some refinements to the ServAR tool are required to improve the user experience. Use of the augmented reality tool improved accuracy and consistency of estimating standard serve sizes compared to the information only and control conditions. ServAR demonstrates potential as a practical tool to guide the serving of food. Further evaluation across a broad range of foods, portion sizes and settings is warranted.
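For readers unfamiliar with the outcome measure, the sketch below shows how percentage error and the within-band proportions quoted above are typically computed; the served weights used here are invented, not the study data.

```python
import numpy as np

def serving_errors(served_g, reference_g):
    """Percentage error of served weights against the standard serving weight,
    plus the share of estimates falling within +/-10, 25 and 50% of the reference."""
    served = np.asarray(served_g, dtype=float)
    pct_error = (served - reference_g) / reference_g * 100.0
    within = {band: float(np.mean(np.abs(pct_error) <= band)) for band in (10, 25, 50)}
    return pct_error, within

# Hypothetical example: five servings of a food with a 100 g standard serve.
errors, within = serving_errors([70, 95, 120, 60, 180], reference_g=100.0)
print(errors)   # [-30.  -5.  20. -40.  80.]
print(within)   # {10: 0.2, 25: 0.4, 50: 0.8}
```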
Pratt, Philip; Ives, Matthew; Lawton, Graham; Simmons, Jonathan; Radev, Nasko; Spyropoulou, Liana; Amiras, Dimitri
2018-01-01
Precision and planning are key to reconstructive surgery. Augmented reality (AR) can bring the information within preoperative computed tomography angiography (CTA) imaging to life, allowing the surgeon to 'see through' the patient's skin and appreciate the underlying anatomy without making a single incision. This work has demonstrated that AR can assist the accurate identification, dissection and execution of vascular pedunculated flaps during reconstructive surgery. Separate volumes of osseous, vascular, skin, soft tissue structures and relevant vascular perforators were delineated from preoperative CTA scans to generate three-dimensional images using two complementary segmentation software packages. These were converted to polygonal models and rendered by means of a custom application within the HoloLens™ stereo head-mounted display. Intraoperatively, the models were registered manually to their respective subjects by the operating surgeon using a combination of tracked hand gestures and voice commands; AR was used to aid navigation and accurate dissection. Identification of the subsurface location of vascular perforators through AR overlay was compared to the positions obtained by audible Doppler ultrasound. Through a preliminary HoloLens-assisted case series, the operating surgeon was able to demonstrate precise and efficient localisation of perforating vessels.
Ben-Moussa, Maher; Rubo, Marius; Debracque, Coralie; Lange, Wolf-Gero
2017-01-01
The present paper explores the benefits and the capabilities of various emerging state-of-the-art interactive 3D and Internet of Things technologies and investigates how these technologies can be exploited to develop a more effective technology supported exposure therapy solution for social anxiety disorder. “DJINNI” is a conceptual design of an in vivo augmented reality (AR) exposure therapy mobile support system that exploits several capturing technologies and integrates the patient’s state and situation by vision-based, audio-based, and physiology-based analysis as well as by indoor/outdoor localization techniques. DJINNI also comprises an innovative virtual reality exposure therapy system that is adaptive and customizable to the demands of the in vivo experience and therapeutic progress. DJINNI follows a gamification approach where rewards and achievements are utilized to motivate the patient to progress in her/his treatment. The current paper reviews the state of the art of technologies needed for such a solution and recommends how these technologies could be integrated in the development of an individually tailored and yet feasible and effective AR/virtual reality-based exposure therapy. Finally, the paper outlines how DJINNI could be part of classical cognitive behavioral treatment and how to validate such a setup. PMID:28503155
Zhu, Ming; Chai, Gang; Lin, Li; Xin, Yu; Tan, Andy; Bogari, Melia; Zhang, Yan; Li, Qingfeng
2016-12-01
Augmented reality (AR) technology can superimpose a virtual image generated by a computer onto the real operating field to present an integral image and enhance surgical safety. The purpose of our study is to develop a novel AR-based navigation system for craniofacial surgery. We focus on orbital hypertelorism correction, because the surgery requires high precision and is considered difficult even for a senior craniofacial surgeon. Twelve patients with orbital hypertelorism were selected. The preoperative computed tomography data were imported into a 3-dimensional platform for preoperative design. The position and orientation of the virtual information and the real world were adjusted by an image registration process. The AR toolkits were used to realize the integral image. Afterward, computed tomography was also performed after the operation to compare the difference between the preoperative plan and the actual operative outcome. Our AR-based navigation system was successfully used in these patients, directly displaying 3-dimensional navigational information onto the surgical field. They all achieved a better appearance under the guidance of the navigation image. The differences in interdacryon distance and in the dacryon point of each side were not significant (P > 0.05) between the preoperative plan and the actual surgical outcome. This study reports an effective visualized approach for guiding orbital hypertelorism correction. Our AR-based navigation system may lay a foundation for craniofacial surgery navigation. The AR technology could be considered as a helpful tool for precise osteotomy in craniofacial surgery.
Stereoscopic augmented reality for laparoscopic surgery.
Kang, Xin; Azizian, Mahdi; Wilson, Emmanuel; Wu, Kyle; Martin, Aaron D; Kane, Timothy D; Peters, Craig A; Cleary, Kevin; Shekhar, Raj
2014-07-01
Conventional laparoscopes provide a flat representation of the three-dimensional (3D) operating field and are incapable of visualizing internal structures located beneath visible organ surfaces. Computed tomography (CT) and magnetic resonance (MR) images are difficult to fuse in real time with laparoscopic views due to the deformable nature of soft-tissue organs. Utilizing emerging camera technology, we have developed a real-time stereoscopic augmented-reality (AR) system for laparoscopic surgery by merging live laparoscopic ultrasound (LUS) with stereoscopic video. The system creates two new visual cues: (1) perception of true depth with improved understanding of 3D spatial relationships among anatomical structures, and (2) visualization of critical internal structures along with a more comprehensive visualization of the operating field. The stereoscopic AR system has been designed for near-term clinical translation with seamless integration into the existing surgical workflow. It is composed of a stereoscopic vision system, a LUS system, and an optical tracker. Specialized software processes streams of imaging data from the tracked devices and registers those in real time. The resulting two ultrasound-augmented video streams (one for the left and one for the right eye) give a live stereoscopic AR view of the operating field. The team conducted a series of stereoscopic AR interrogations of the liver, gallbladder, biliary tree, and kidneys in two swine. The preclinical studies demonstrated the feasibility of the stereoscopic AR system during in vivo procedures. Major internal structures could be easily identified. The system exhibited unobservable latency with acceptable image-to-video registration accuracy. We presented the first in vivo use of a complete system with stereoscopic AR visualization capability. This new capability introduces new visual cues and enhances visualization of the surgical anatomy. The system shows promise to improve the precision and expand the capacity of minimally invasive laparoscopic surgeries.
Gravbox - The First Augmented Reality Sandbox for Gravitational Dynamics
NASA Astrophysics Data System (ADS)
Isbell, Jacob; Deam, Sophie; Reed, Mason; Bettis, Wyatt; Lu, Jianbo; Luppen, Zachary; Maier, Erin; McCurdy, Ross; Moore, Sadie; Fu, Hai
2018-01-01
Gravitational effects are an overarching theme in astronomy education, yet existing classroom demonstrations are insufficient in elucidating complex gravitational interactions. Inspired by the augmented reality (AR) sandbox developed by geologists, we have developed Gravbox, the first AR sandbox to demonstrate gravitational dynamics. The arbitrary topography of the sand surface represents the mass distribution of a two-dimensional universe. The computer reads the topography with a Kinect camera, calculates the orbit of a test particle with user-defined position and velocity, and projects the topography contour map and orbit animation with an overhead projector, all within a duty cycle of one second. This creates an interactive and intuitive tool to help students at all levels understand gravitational effects. In this contribution, we will describe the development of the Gravbox prototype and show its current capabilities. The Gravbox software will be publicly available along with a building tutorial.
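Gravbox's solver is not described beyond the abstract; the sketch below is a simple stand-in showing the general idea of treating a height map as a 2D mass distribution, computing the softened gravitational acceleration on a test particle, and integrating its orbit with a leapfrog step. Grid size, softening, units and initial conditions are all arbitrary choices for illustration.

```python
import numpy as np

def acceleration(pos, cell_xy, cell_mass, G=1.0, soft=0.05):
    """Gravitational acceleration on a test particle from a 2D mass grid
    (sand topography interpreted as surface density); softening avoids
    singularities near individual cells."""
    d = cell_xy - pos                                  # (N, 2) separations
    r2 = np.sum(d * d, axis=1) + soft ** 2
    return G * np.sum((cell_mass / r2 ** 1.5)[:, None] * d, axis=0)

# Hypothetical "sandbox": a 32x32 grid with a central mass concentration.
xs, ys = np.meshgrid(np.linspace(-1, 1, 32), np.linspace(-1, 1, 32))
density = np.exp(-(xs ** 2 + ys ** 2) * 20)            # stands in for sand height
cells = np.column_stack([xs.ravel(), ys.ravel()])
mass = density.ravel() / density.sum()                 # normalize total mass to 1

# Leapfrog (kick-drift-kick) integration of a user-defined test particle.
pos, vel, dt = np.array([0.6, 0.0]), np.array([0.0, 0.9]), 0.01
for _ in range(500):
    vel += 0.5 * dt * acceleration(pos, cells, mass)
    pos += dt * vel
    vel += 0.5 * dt * acceleration(pos, cells, mass)
print(pos)
```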
NASA Astrophysics Data System (ADS)
Yamauchi, Makoto; Iwamoto, Kazuyo
2010-05-01
Line heating is a skilled task in shipbuilding to shape the outer plates of ship hulls. Real-time information on the deformation of the plates during the task would be helpful to workers performing this process. Therefore, we herein propose an interactive scheme for supporting workers performing line heating; the system provides such information through an optical shape measurement instrument combined with an augmented reality (AR) system. The instrument was designed and fabricated so that the measured data were represented using coordinates based on fiducial markers. Since the markers were simultaneously used in the AR system for the purpose of positioning, the data could then be displayed to the workers through a head-mounted display as a virtual image overlaid on the plates. Feedback of the shape measurement results was thus performed in real time using the proposed system.
Inexpensive Monocular Pico-Projector-based Augmented Reality Display for Surgical Microscope
Shi, Chen; Becker, Brian C.; Riviere, Cameron N.
2013-01-01
This paper describes an inexpensive pico-projector-based augmented reality (AR) display for a surgical microscope. The system is designed for use with Micron, an active handheld surgical tool that cancels hand tremor of surgeons to improve microsurgical accuracy. Using the AR display, virtual cues can be injected into the microscope view to track the movement of the tip of Micron, show the desired position, and indicate the position error. Cues can be used to maintain high performance by helping the surgeon to avoid drifting out of the workspace of the instrument. Also, boundary information such as the view range of the cameras that record surgical procedures can be displayed to tell surgeons the operation area. Furthermore, numerical, textual, or graphical information can be displayed, showing such things as tool tip depth in the work space and on/off status of the canceling function of Micron. PMID:25264542
Development and human factors analysis of neuronavigation vs. augmented reality.
Pandya, Abhilash; Siadat, Mohammad-Reza; Auner, Greg; Kalash, Mohammad; Ellis, R Darin
2004-01-01
This paper is focused on the human factors analysis comparing a standard neuronavigation system with an augmented reality system. We use a passive articulated arm (Microscribe, Immersion technology) to track a calibrated end-effector-mounted video camera. In real time, we superimpose the live video view with the synchronized graphical view of CT-derived segmented object(s) of interest within a phantom skull. Using the same robotic arm, we have developed a neuronavigation system able to show the end-effector of the arm on orthogonal CT scans. Both the AR and the neuronavigation systems have been shown to be accurate to within 3 mm. A human factors study was conducted in which subjects were asked to draw craniotomies and answer questions to gauge their understanding of the phantom objects. The human factors study included 21 subjects and indicated that the subjects performed faster, with greater accuracy and fewer errors, using the Augmented Reality interface.
A method for real-time generation of augmented reality work instructions via expert movements
NASA Astrophysics Data System (ADS)
Bhattacharya, Bhaskar; Winer, Eliot
2015-03-01
Augmented Reality (AR) offers tremendous potential for a wide range of fields including entertainment, medicine, and engineering. AR allows digital models to be integrated with a real scene (typically viewed through a video camera) to provide useful information in a variety of contexts. The difficulty in authoring and modifying scenes is one of the biggest obstacles to widespread adoption of AR. 3D models must be created, textured, oriented and positioned to create the complex overlays viewed by a user. This often requires using multiple software packages in addition to performing model format conversions. In this paper, a new authoring tool is presented which uses a novel method to capture product assembly steps performed by a user with a depth+RGB camera. Through a combination of computer vision and image processing techniques, each individual step is decomposed into objects and actions. The objects are matched to those in a predetermined geometry library and the actions turned into animated assembly steps. The subsequent instruction set is then generated with minimal user input. A proof of concept is presented to establish the method's viability.
Augmented reality enabling intelligence exploitation at the edge
NASA Astrophysics Data System (ADS)
Kase, Sue E.; Roy, Heather; Bowman, Elizabeth K.; Patton, Debra
2015-05-01
Today's Warfighters need to make quick decisions while interacting in densely populated environments comprised of friendly, hostile, and neutral host nation locals. However, there is a gap in the real-time processing of big data streams for edge intelligence. We introduce a big data processing pipeline called ARTEA that ingests, monitors, and performs a variety of analytics including noise reduction, pattern identification, and trend and event detection in the context of an area of operations (AOR). Results of the analytics are presented to the Soldier via an augmented reality (AR) device Google Glass (Glass). Non-intrusive AR devices such as Glass can visually communicate contextually relevant alerts to the Soldier based on the current mission objectives, time, location, and observed or sensed activities. This real-time processing and AR presentation approach to knowledge discovery flattens the intelligence hierarchy enabling the edge Soldier to act as a vital and active participant in the analysis process. We report preliminary observations testing ARTEA and Glass in a document exploitation and person of interest scenario simulating edge Soldier participation in the intelligence process in disconnected deployment conditions.
Anastassova, Margarita; Burkhardt, Jean-Marie
2009-07-01
The paper presents an ergonomic analysis carried out in the early phases of an R&D project. The purpose was to investigate the functioning of the training of today's Automotive Service Technicians (ASTs) in order to inform the design of an Augmented Reality (AR) teaching aid. The first part of the paper presents a literature review of some major problems encountered by ASTs today. The benefits of AR as a technological aid are also introduced. Then, the methodology and the results of two case studies are presented. The first study is based on interviews with trainers and trainees; the second one on observations in real training settings. The results support the assumption that today's AST training could be regarded as a community of practice (CoP). Therefore, AR could be useful as a collaboration tool, offering a shared virtual representation of real vehicle parts which are normally invisible unless dismantled (e.g. the parts of a hydraulic automatic transmission). We conclude on the methods and the technologies to support the automotive CoP.
Augmented reality-guided artery-first pancreatico-duodenectomy.
Marzano, Ettore; Piardi, Tullio; Soler, Luc; Diana, Michele; Mutter, Didier; Marescaux, Jacques; Pessaux, Patrick
2013-11-01
Augmented Reality (AR) in surgery consists of the fusion of synthetic computer-generated images (a 3D virtual model) obtained from the medical imaging preoperative work-up with real-time patient images, with the aim of visualizing unapparent anatomical details. The potential of AR navigation as a tool to improve the safety of the surgical dissection is presented in a case of pancreatico-duodenectomy (PD). A 77-year-old male patient underwent an AR-assisted PD. The 3D virtual anatomical model was obtained from a thoraco-abdominal CT scan using custom software (VR-RENDER®, IRCAD). The virtual model was superimposed on the operative field using an exoscope (VITOM®, Karl Storz, Tuttlingen, Germany) as well as different visible landmarks (inferior vena cava, left renal vein, aorta, superior mesenteric vein, inferior margin of the pancreas). A computer scientist manually registered virtual and real images in real time using a video mixer (MX 70; Panasonic, Secaucus, NJ). Dissection of the superior mesenteric artery and the hanging maneuver were performed under AR guidance along the hanging plane. AR allowed for precise and safe recognition of all the important vascular structures. Operative time was 360 min. AR display and fine registration were performed within 6 min. The postoperative course was uneventful. The pathology was positive for ampullary adenocarcinoma; the final stage was pT1N0 (0/43 retrieved lymph nodes) with clear surgical margins. AR is a valuable navigation tool that can enhance the ability to achieve a safe surgical resection during PD.
Inattentional blindness increased with augmented reality surgical navigation.
Dixon, Benjamin J; Daly, Michael J; Chan, Harley H L; Vescan, Allan; Witterick, Ian J; Irish, Jonathan C
2014-01-01
Augmented reality (AR) surgical navigation systems, designed to increase accuracy and efficiency, have been shown to negatively impact on attention. We wished to assess the effect "head-up" AR displays have on attention, efficiency, and accuracy, while performing a surgical task, compared with the same information being presented on a submonitor (SM). Fifty experienced otolaryngology surgeons (n = 42) and senior otolaryngology trainees (n = 8) performed an endoscopic surgical navigation exercise on a predissected cadaveric model. Computed tomography-generated anatomic contours were fused with the endoscopic image to provide an AR view. Subjects were randomized to perform the task with a standard endoscopic monitor with the AR navigation displayed on an SM or with AR as a single display. Accuracy, task completion time, and the recognition of unexpected findings (a foreign body and a critical complication) were recorded. Recognition of the foreign body was significantly better in the SM group (15/25 [60%]) compared with the AR alone group (8/25 [32%]; p = 0.02). There was no significant difference in task completion time (p = 0.83) or accuracy (p = 0.78) between the two groups. Providing identical surgical navigation on a SM, rather than on a single head-up display, reduced the level of inattentional blindness as measured by detection of unexpected findings. These gains were achieved without any measurable impact on efficiency or accuracy. AR displays may distract the user and we caution injudicious adoption of this technology for medical procedures.
Augmented Reality as a Telemedicine Platform for Remote Procedural Training
Wang, Shiyao; Parsons, Michael; Stone-McLean, Jordan; Rogers, Peter; Boyd, Sarah; Hoover, Kristopher; Meruvia-Pastor, Oscar; Gong, Minglun; Smith, Andrew
2017-01-01
Traditionally, rural areas in many countries are limited by a lack of access to health care due to the inherent challenges associated with recruitment and retention of healthcare professionals. Telemedicine, which uses communication technology to deliver medical services over distance, is an economical and potentially effective way to address this problem. In this research, we develop a new telepresence application using an Augmented Reality (AR) system. We explore the use of the Microsoft HoloLens to facilitate and enhance remote medical training. Intrinsic advantages of AR systems enable remote learners to perform complex medical procedures such as Point of Care Ultrasound (PoCUS) without visual interference. This research uses the HoloLens to capture the first-person view of a simulated rural emergency room (ER) through mixed reality capture (MRC) and serves as a novel telemedicine platform with remote pointing capabilities. The mentor’s hand gestures are captured using a Leap Motion and virtually displayed in the AR space of the HoloLens. To explore the feasibility of the developed platform, twelve novice medical trainees were guided by a mentor through a simulated ultrasound exploration in a trauma scenario, as part of a pilot user study. The study explores the utility of the system from the trainees, mentor, and objective observers’ perspectives and compares the findings to that of a more traditional multi-camera telemedicine solution. The results obtained provide valuable insight and guidance for the development of an AR-supported telemedicine platform. PMID:28994720
Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.
Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico
2017-01-01
Although Augmented Reality (AR) has been around for almost five decades, only recently we have witnessed AR systems and applications entering in our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable device significantly improved the performance of all the considered tasks. Moreover, subjects significantly preferred conditions providing wearable haptic feedback.
Habanapp: Havana's Architectural Heritage a Click Away
NASA Astrophysics Data System (ADS)
Morganti, C.; Bartolomei, C.
2018-05-01
The research addresses the application of technologies related to augmented and virtual reality in the architectural and historical context of the city of Havana, Cuba, on the basis of historical studies and range-imaging techniques applied to the buildings bordering the old city's five main squares. The specific aim is to transfer all of the data obtained through the most recent mobile apps for Augmented Reality (AR) and Virtual Reality (VR), in order to create an innovative App never seen before in Cuba. The "Oficina del Historiador de la ciudad de La Habana", the institution supervising architectural and cultural assets in Cuba, is widely interested in the topic in order to develop a new educational, cultural and artistic tool to be used both online and offline.
Gibby, Jacob T; Swenson, Samuel A; Cvetko, Steve; Rao, Raj; Javan, Ramin
2018-06-22
Augmented reality has potential to enhance surgical navigation and visualization. We determined whether head-mounted display augmented reality (HMD-AR) with superimposed computed tomography (CT) data could allow the wearer to percutaneously guide pedicle screw placement in an opaque lumbar model with no real-time fluoroscopic guidance. CT imaging was obtained of a phantom composed of L1-L3 Sawbones vertebrae in opaque silicone. Preprocedural planning was performed by creating virtual trajectories of appropriate angle and depth for ideal approach into the pedicle, and these data were integrated into the Microsoft HoloLens using the Novarad OpenSight application allowing the user to view the virtual trajectory guides and CT images superimposed on the phantom in two and three dimensions. Spinal needles were inserted following the virtual trajectories to the point of contact with bone. Repeat CT revealed actual needle trajectory, allowing comparison with the ideal preprocedural paths. Registration of AR to phantom showed a roughly circular deviation with maximum average radius of 2.5 mm. Users took an average of 200 s to place a needle. Extrapolation of needle trajectory into the pedicle showed that of 36 needles placed, 35 (97%) would have remained within the pedicles. Needles placed approximated a mean distance of 4.69 mm in the mediolateral direction and 4.48 mm in the craniocaudal direction from pedicle bone edge. To our knowledge, this is the first peer-reviewed report and evaluation of HMD-AR with superimposed 3D guidance utilizing CT for spinal pedicle guide placement for the purpose of cannulation without the use of fluoroscopy.
NASA Astrophysics Data System (ADS)
Figl, Michael; Birkfellner, Wolfgang; Watzinger, Franz; Wanschitz, Felix; Hummel, Johann; Hanel, Rudolf A.; Ewers, Rolf; Bergmann, Helmar
2002-05-01
Two main concepts of head-mounted displays (HMD) for augmented reality (AR) visualization exist: the optical see-through and the video see-through type. Several research groups have pursued both approaches for utilizing HMDs in computer-aided surgery. While the hardware requirements for a video see-through HMD to achieve an acceptable time delay and frame rate seem to be enormous, the clinical acceptance of such a device is doubtful from a practical point of view. Starting from previous work on displaying additional computer-generated graphics in operating microscopes, we have adapted a miniature head-mounted operating microscope for AR by integrating two very small computer displays. To calibrate the projection parameters of this so-called Varioscope AR we used Tsai's algorithm for camera calibration. Connection to a surgical navigation system was achieved by defining an open interface to the control unit of the Varioscope AR. The control unit consists of a standard PC with a dual-head graphics adapter to render and display the desired augmentation of the scene. We connected this control unit to a computer-aided surgery (CAS) system via a TCP/IP interface. In this paper we present the control unit for the HMD and its software design. We tested two different optical tracking systems: the Flashpoint (Image Guided Technologies, Boulder, CO), which provided about 10 frames per second, and the Polaris (Northern Digital, Ontario, Canada), which provided at least 30 frames per second, both with a time delay of one frame.
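The projection-parameter calibration step can be illustrated with a rough sketch. The authors used Tsai's algorithm; the snippet below instead uses OpenCV's standard checkerboard calibration as a stand-in, since it estimates the same intrinsic projection parameters. The board geometry, square size, and image folder are assumptions for illustration.

```python
import glob
import cv2
import numpy as np

# Checkerboard geometry (assumed: 9x6 inner corners, 25 mm squares)
pattern = (9, 6)
square_mm = 25.0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

obj_points, img_points = [], []
# Assumes at least one calibration image is present in this hypothetical folder.
for path in glob.glob("calib_images/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Estimate intrinsics (camera matrix, distortion) and per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error (px):", rms)
print("Camera matrix:\n", K)
```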
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Mandal, Avikarsha; Javahiraly, Nicolas; Curticapean, Dan
2016-09-01
Practical exercises are a crucial part of many curricula. Even simple exercises can improve understanding of the underlying subject. Most experimental setups require special hardware. To carry out, e.g., a lens experiment, students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let students prepare with a simulated experimental setup. Within the context of our intended blended-learning concept we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of allowing interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that allows for the tracking of the user's hands and fingers in three dimensions. It is conceivable to let users interact with the simulation's virtual elements directly through their hand position, movement and gestures. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and point out useful and less useful application scenarios.
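The interaction idea can be sketched without reproducing the actual LEAP Motion SDK (whose API is not shown here): tracked fingertip coordinates, however they are obtained, drive the position of a virtual lens on a simulated optical bench, and the thin-lens equation gives the resulting image distance. Bench length, coordinate offset, and focal length are assumptions.

```python
import numpy as np

def lens_position_from_fingertip(fingertip_mm, bench_length_mm=1000.0):
    """Map a tracked fingertip position (tracker frame, mm) to the position
    of a virtual lens along a simulated optical bench.

    The tracker's x axis is assumed to run along the bench, centered on it,
    so the fingertip's x coordinate is re-centered and clamped to the bench.
    """
    return float(np.clip(fingertip_mm[0] + bench_length_mm / 2.0, 0.0, bench_length_mm))

def image_distance(object_distance_mm, focal_length_mm=100.0):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i used by the simulated experiment."""
    if object_distance_mm == focal_length_mm:
        return float("inf")  # object at the focal point: image at infinity
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# With the light source at bench position 0, the object distance equals the
# lens position chosen by the user's hand.
lens_x = lens_position_from_fingertip((120.0, 40.0, -15.0))
print(f"lens at {lens_x:.0f} mm, image {image_distance(lens_x):.1f} mm behind the lens")
```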
Ogawa, Hiroyuki; Hasegawa, Seiichirou; Tsukada, Sachiyuki; Matsubara, Masaaki
2018-06-01
We developed an acetabular cup placement device, the AR-HIP system, using augmented reality (AR). The AR-HIP system allows the surgeon to view an acetabular cup image superimposed in the surgical field through a smartphone. The smartphone also shows the placement angle of the acetabular cup. This preliminary study was performed to assess the accuracy of the AR-HIP system for acetabular cup placement during total hip arthroplasty (THA). We prospectively measured the placement angles using both a goniometer and AR-HIP system in 56 hips of 54 patients undergoing primary THA. We randomly determined the order of intraoperative measurement using the 2 devices. At 3 months after THA, the placement angle of the acetabular cup was measured on computed tomography images. The primary outcome was the absolute value of the difference between intraoperative and postoperative computed tomography measurements. The measurement angle using AR-HIP was significantly more accurate in terms of radiographic anteversion than that using a goniometer (2.7° vs 6.8°, respectively; mean difference 4.1°; 95% confidence interval, 3.0-5.2; P < .0001). There was no statistically significant difference in terms of radiographic inclination (2.1° vs 2.6°; mean difference 0.5°; 95% confidence interval, -1.1 to 0.1; P = .13). In this pilot study, the AR-HIP system provided more accurate information regarding acetabular cup placement angle than the conventional method. Further studies are required to confirm the utility of the AR-HIP system as a navigation tool. Copyright © 2018 Elsevier Inc. All rights reserved.
Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.
Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K
2013-05-01
Augmented reality (AR) is a technology which enables users to see the real world with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were acquired using a 64-channel multidetector scanner. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group), and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire administered after using the simulator. The group trained using the AR simulator was more proficient at IV injection in real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
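The segmentation-to-stereolithography step described above can be sketched in a few lines: an iso-surface is extracted from a segmented CT volume and written to an STL file. This is a minimal illustration using scikit-image and the numpy-stl package, not the authors' pipeline; the synthetic mask, spacing, and filename are placeholders.

```python
import numpy as np
from skimage import measure
from stl import mesh  # numpy-stl package

def volume_to_stl(volume, iso_level, out_path, spacing=(1.0, 1.0, 1.0)):
    """Extract an iso-surface from a segmented CT volume and save it as STL.

    volume    -- 3D numpy array (e.g. a binary mask from segmentation)
    iso_level -- surface threshold (0.5 for a binary mask)
    spacing   -- voxel size in mm, normally taken from the CT header (assumed here)
    """
    verts, faces, _, _ = measure.marching_cubes(volume, level=iso_level, spacing=spacing)
    surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
    for i, tri in enumerate(faces):
        surface.vectors[i] = verts[tri]
    surface.save(out_path)

# Example with a synthetic binary mask standing in for a segmented vessel
dummy = np.zeros((64, 64, 64), dtype=np.uint8)
dummy[20:40, 20:40, 20:40] = 1
volume_to_stl(dummy, 0.5, "segmented_vessel.stl", spacing=(0.5, 0.5, 0.5))
```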
Augmented reality 3D display based on integral imaging
NASA Astrophysics Data System (ADS)
Deng, Huan; Zhang, Han-Le; He, Min-Yang; Wang, Qiong-Hua
2017-02-01
Integral imaging (II) is a good candidate for augmented reality (AR) display, since it provides various physiological depth cues so that viewers can freely change accommodation and convergence between the virtual three-dimensional (3D) images and the real-world scene without feeling any visual discomfort. We propose two AR 3D display systems based on the theory of II. In the first AR system, a micro II display unit reconstructs a micro 3D image, and the micro 3D image is magnified by a convex lens. The lateral and depth distortions of the magnified 3D image are analyzed and resolved by pitch scaling and depth scaling. The magnified 3D image and the real 3D scene are overlapped by using a half-mirror to realize AR 3D display. The second AR system uses a micro-lens-array holographic optical element (HOE) as an image combiner. The HOE is a volume holographic grating which functions as a micro-lens array for Bragg-matched light and as transparent glass for Bragg-mismatched light. A reference beam can reproduce a virtual 3D image from one side of the micro-lens-array HOE, and a reference beam with conjugated phase can reproduce a second 3D image from the other side, which provides a double-sided 3D display capability.
Kenngott, Hannes Götz; Preukschas, Anas Amin; Wagner, Martin; Nickel, Felix; Müller, Michael; Bellemann, Nadine; Stock, Christian; Fangerau, Markus; Radeleff, Boris; Kauczor, Hans-Ulrich; Meinzer, Hans-Peter; Maier-Hein, Lena; Müller-Stich, Beat Peter
2018-06-01
Augmented reality (AR) systems are currently being explored by a broad spectrum of industries, mainly for improving point-of-care access to data and images. Especially in surgery, and especially for timely decisions in emergency cases, fast and comprehensive access to images at the patient bedside is mandatory. Currently, imaging data are accessed at a distance from the patient both in time and space, i.e., at a specific workstation. Mobile technology and 3-dimensional (3D) visualization of radiological imaging data promise to overcome these restrictions by making bedside AR feasible. In this project, AR was realized in a surgical setting by fusing a 3D representation of structures of interest with live camera images on a tablet computer using marker-based registration. The intent of this study was to focus on a thorough evaluation of AR. Feasibility, robustness, and accuracy were thus evaluated consecutively in a phantom model and a porcine model. Additionally, feasibility was evaluated in one male volunteer. In the phantom model (n = 10), AR visualization was feasible in 84% of the visualization space with high accuracy (mean reprojection error ± standard deviation (SD): 2.8 ± 2.7 mm; 95th percentile = 6.7 mm). In the porcine model (n = 5), AR visualization was feasible in 79% with high accuracy (mean reprojection error ± SD: 3.5 ± 3.0 mm; 95th percentile = 9.5 mm). Furthermore, AR was successfully used and proved feasible in the male volunteer. Mobile, real-time, point-of-care AR for clinical purposes proved feasible, robust, and accurate in the phantom, animal, and single-trial human models in this study. Consequently, AR following a similar implementation proved robust and accurate enough to be evaluated in clinical trials assessing accuracy, robustness in clinical reality, and integration into the clinical workflow. If these further studies prove successful, AR might revolutionize data access at the patient bedside.
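The study reports mean and 95th-percentile reprojection errors (in millimetres, in patient space) for marker-based registration. The sketch below shows the analogous computation in image space with OpenCV: a marker model is registered to its detected image positions, re-projected, and the per-point errors summarized. All coordinates and intrinsics are hypothetical, and the result here is in pixels rather than millimetres.

```python
import numpy as np
import cv2

def reprojection_errors(points_3d, points_2d, rvec, tvec, K, dist):
    """Per-point distance between detected marker positions and the
    registered model's projection into the camera image."""
    projected, _ = cv2.projectPoints(points_3d, rvec, tvec, K, dist)
    return np.linalg.norm(projected.reshape(-1, 2) - points_2d, axis=1)

# Hypothetical data: four marker corners in model and image coordinates
pts3d = np.array([[0, 0, 0], [50, 0, 0], [50, 50, 0], [0, 50, 0]], dtype=np.float32)
pts2d = np.array([[320, 240], [420, 242], [421, 341], [319, 339]], dtype=np.float32)
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(pts3d, pts2d, K, dist)  # register model to camera
err = reprojection_errors(pts3d, pts2d, rvec, tvec, K, dist)
print("mean:", err.mean(), "95th percentile:", np.percentile(err, 95))
```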
Developing an Augmented Reality Environment for Earth Science Education
NASA Astrophysics Data System (ADS)
Pratt, M. J.; Skemer, P. A.; Arvidson, R. E.
2017-12-01
The emerging field of augmented reality (AR) provides new and exciting ways to explore geologic phenomena for research and education. The primary advantage of AR is that it allows users to physically explore complex three-dimensional structures that were previously inaccessible, for example a remote geologic outcrop or a mineral structure at the atomic scale. AR is used, for example, with the OnSight software during tactical operations to plan the Mars Curiosity rover's traverses, providing virtual views in which users can walk through the terrain and around the rover at true scale. This mode of physical exploration allows users more freedom to investigate and understand 3D structure than is possible on a flat computer screen or within a static PowerPoint presentation during a classroom lecture. The Microsoft HoloLens headset provides the most advanced mobile AR platform currently available to developers. The Fossett Laboratory for Virtual Planetary Exploration at Washington University in St. Louis has applied this technology, coupled with photogrammetric software and the Unity 3D gaming engine, to develop photorealistic environments of 3D geologic outcrops from around the world. The untethered HoloLens provides an ideal platform for a classroom setting, as it allows for shared experiences of the holograms of interest, projecting them in the same location for all users to explore. Furthermore, the HoloLens allows for face-to-face communication during use, which is important in teaching and which virtual reality does not allow. Our development of an AR application includes the design of an online database of photogrammetric outcrop models curated for the current limitations of AR technology. This database will be accessible to those wishing to submit models and free to those wishing to use the application for teaching, outreach or research purposes.
Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?
Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.
2007-01-01
Background: Virtual reality (VR) is an emerging modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation approach offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects while using video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods: The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators scored on a 5-point Likert scale. The participants were allocated to 3 groups depending on their experience: experts, intermediates and novices. Significant differences were calculated with the paired t-test. Results: There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 vs. 2.18, P < 0.000) and the suturing task (mean 4.15 vs. 1.85, P < 0.000). The ProMIS is regarded as having better haptic feedback (mean 3.92 vs. 1.92, P < 0.000) and as being more useful for training surgical residents (mean 4.51 vs. 2.94, P < 0.000). Conclusions: In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356
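The per-group comparison of Likert ratings described above rests on a paired t-test, which can be illustrated with a short sketch; the scores below are hypothetical stand-ins for one group's realism ratings of the same task on both simulators, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant realism ratings (1-5 Likert) for the same
# basic-skills task performed on both simulators.
ar_scores = np.array([5, 4, 4, 5, 4, 4, 5, 4, 4, 5])
vr_scores = np.array([2, 2, 3, 2, 2, 1, 3, 2, 2, 3])

t_stat, p_value = stats.ttest_rel(ar_scores, vr_scores)  # paired t-test
print(f"AR mean {ar_scores.mean():.2f}, VR mean {vr_scores.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```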
NASA Astrophysics Data System (ADS)
Tang, Qiang; Chen, Yan; Gale, Alastair G.
2017-03-01
Appropriate feedback plays an important role in optimising mammographic interpretation training whilst also ensuring good interpretation performance. The traditional keyboard, mouse and workstation approach has a critical limitation in providing supplementary image-related information and complex feedback in real time. Augmented Reality (AR) offers a possibly superior approach in this situation, as feedback can be provided directly overlaying the displayed mammographic images, making it a generic approach which can also be vendor neutral. In this study, radiological feedback was dynamically remapped virtually into the real world, using perspective transformation, in order to provide a richer user experience in mammographic interpretation training. This is an initial attempt at an AR approach to dynamically superimpose pre-defined feedback information of a DICOM image on top of a radiologist's view whilst the radiologist is examining images on a clinical workstation. The study demonstrates the feasibility of the approach, although there are limitations on interactive operations due to the hardware used. The results of this fully functional approach provide appropriate feedback/image correspondence in a simulated mammographic interpretation environment. Thus, it is argued that employing AR is a feasible way to provide rich feedback in the delivery of mammographic interpretation training.
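The perspective-transformation step can be sketched as a planar homography warp: a rendered feedback image is mapped onto the workstation monitor's corners as seen by the viewer-facing camera, so the annotations appear locked to the displayed mammogram. This is an OpenCV illustration under stated assumptions, not the authors' implementation; how the monitor corners are tracked is assumed given.

```python
import numpy as np
import cv2

def overlay_feedback(scene, feedback, monitor_quad):
    """Warp a rendered feedback image onto the monitor as seen in the
    camera frame, using a planar perspective transform.

    scene        -- BGR frame from the viewer-facing camera
    feedback     -- BGR image holding the pre-defined feedback annotations
    monitor_quad -- 4x2 float32 array of the monitor's corners in the frame,
                    ordered TL, TR, BR, BL (obtained from tracking; assumed given)
    """
    h, w = feedback.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(src, monitor_quad)
    warped = cv2.warpPerspective(feedback, H, (scene.shape[1], scene.shape[0]))
    mask = warped.sum(axis=2) > 0  # overlay only where feedback pixels exist
    out = scene.copy()
    out[mask] = warped[mask]
    return out
```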
NASA Astrophysics Data System (ADS)
Chen, Cheng-ping; Wang, Chang-Hwa
2015-12-01
Studies have shown that merging hands-on and online learning can result in an enhanced experience in learning science. In contrast to traditional online learning, multiple in-classroom activities may be involved in an augmented-reality (AR)-embedded e-learning process, which could reduce the effects of individual differences. Using a three-stage AR-embedded instructional process, we conducted an experiment to investigate the influence of individual differences on junior high school students' learning of the earth science phenomena of "day, night, and seasons". A mixed-methods sequential explanatory design was employed. In the quantitative phase, the factors of learning style and ICT competence were examined alongside overall learning achievement. Independent t-tests and ANCOVAs were employed for inferential statistics. The results showed that overall learning achievement was significant for the AR-embedded instruction. Nevertheless, neither of the two learner factors exhibited a significant effect on learning achievement. In the qualitative phase, we analyzed student interview records, and a wide variation in students' preferred instructional stages was revealed. These findings could provide an alternative rationale for developing ICT-supported instruction, as our three-stage AR-embedded comprehensive e-learning scheme could enhance the adaptiveness of instruction and mitigate the disparities arising from individual differences between learners.
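The ANCOVA mentioned above (testing a group effect on achievement while controlling for a covariate) can be illustrated with a small sketch using statsmodels; the pretest/posttest scores and group labels are invented for illustration and are not the study's data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical achievement data: pretest covariate, posttest outcome,
# and a learning-style group label for each student.
df = pd.DataFrame({
    "pre":   [55, 60, 48, 70, 62, 51, 66, 58, 73, 49],
    "post":  [78, 82, 70, 88, 85, 72, 86, 80, 90, 71],
    "group": ["accommodator", "assimilator"] * 5,
})

# ANCOVA: effect of learning style on posttest scores, controlling for pretest.
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```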
Use of augmented reality in laparoscopic gynecology to visualize myomas.
Bourdel, Nicolas; Collins, Toby; Pizarro, Daniel; Debize, Clement; Grémeau, Anne-Sophie; Bartoli, Adrien; Canis, Michel
2017-03-01
To report the use of augmented reality (AR) in gynecology. AR is a surgical guidance technology that enables important hidden subsurface structures to be visualized in endoscopic images. AR has been used for other organs, but never in gynecology and never with a very mobile organ like the uterus. We have developed a new AR approach specifically for uterine surgery and demonstrated its use for myomectomy. Tertiary university hospital. Three patients with one, two, and multiple myomas, respectively. AR was used during laparoscopy to localize the myomas. Three-dimensional (3D) models of each patient's uterus and myomas were constructed before surgery from T2-weighted magnetic resonance imaging. The intraoperative 3D shape of the uterus was determined. These models were automatically aligned and "fused" with the laparoscopic video in real time. The live fused video makes the uterus appear semitransparent, and the surgeon can see the location of the myoma in real time while moving the laparoscope and the uterus. With this information, the surgeon can easily and quickly decide how best to access the myoma. We developed an AR system for gynecologic surgery and have used it to improve laparoscopic myomectomy. Technically, the software we developed is very different from approaches tried for other organs, and it can handle significant challenges, including image blur, fast motion, and partial views of the organ. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hua, Hong
2017-02-01
Head-mounted light field displays render a true 3D scene by sampling either the projections of the 3D scene at different depths or the directions of the light rays apparently emitted by the 3D scene and viewed from different eye positions. They are capable of rendering correct or nearly correct focus cues and addressing the very well-known vergence-accommodation mismatch problem in conventional virtual and augmented reality displays. In this talk, I will focus on reviewing recent advancements of head-mounted light field displays for VR and AR applications. I will demonstrate examples of HMD systems developed in my group.
Experiential learning in soil science: Use of an augmented reality sandbox
NASA Astrophysics Data System (ADS)
Vaughan, Karen; Vaughan, Robert; Seeley, Janel; Brevik, Eric
2017-04-01
It is widely known that greater learning occurs when students are active participants. Novel technologies give instructors the opportunity to create interactive activities through which undergraduate students gain comprehension of complex landscape processes. We incorporated an Augmented Reality (AR) Sandbox into the Introductory Soil Science course at the University of Wyoming to facilitate an experiential learning experience in pedology. The AR Sandbox was developed by researchers at the University of California, Davis as part of a project on informal science education in freshwater lakes and watershed science. It is a hands-on display that allows users to create topography models by shaping sand, which is augmented in real time by a colored elevation map, topographic contour lines, and simulated water. It uses a 3-dimensional motion-sensing camera that detects changes in the distance between the sand surface and the camera sensor. A short-throw projector then displays the elevation model and contour lines in real time. Undergraduate students enrolled in the Introductory Soil Science course were tasked with creating a virtual landscape and then predicting where particular soils would form on the various landforms. All participants reported a greater comprehension of surface water flow, erosion, and soil formation as a result of this exercise. They provided suggestions for future activities using the AR Sandbox, including its incorporation into lessons on watershed hydrology, land management, soil water, and soil genesis.
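The depth-camera-to-projection loop of an AR sandbox can be sketched in a few lines: a depth frame of the sand surface is converted to elevation, color-mapped, and contoured. The snippet below is a minimal illustration with matplotlib standing in for the projector output; the camera height, contour interval, and synthetic depth frame are assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

def render_elevation(depth_mm, camera_height_mm=1000.0, contour_step_mm=20.0):
    """Turn a depth-camera frame of the sand surface into a colored
    elevation map with contour lines (as projected back onto the sandbox).

    depth_mm         -- 2D array of distances from the sensor to the sand (mm)
    camera_height_mm -- sensor height above the sandbox floor (assumed)
    contour_step_mm  -- vertical spacing between contour lines (assumed)
    """
    elevation = camera_height_mm - depth_mm  # closer sand = higher terrain
    levels = np.arange(elevation.min(), elevation.max(), contour_step_mm)
    plt.imshow(elevation, cmap="terrain")
    plt.contour(elevation, levels=levels, colors="k", linewidths=0.5)
    plt.axis("off")
    plt.show()

# Synthetic frame standing in for Kinect-style depth data (a single sand mound)
y, x = np.mgrid[0:240, 0:320]
render_elevation(900.0 - 80.0 * np.exp(-((x - 160) ** 2 + (y - 120) ** 2) / 5000.0))
```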
Połap, Dawid; Kęsik, Karolina; Książek, Kamil; Woźniak, Marcin
2017-12-04
Augmented reality (AR) is becoming increasingly popular due to its numerous applications. This is especially evident in games, medicine, education, and other areas that support our everyday activities. Moreover, this kind of computer system not only improves our vision and our perception of the world that surrounds us, but also adds additional elements, modifies existing ones, and gives additional guidance. In this article, we focus on a real-time evaluation of the surrounding environment to inform the user about impending obstacles. The proposed solution is based on a hybrid architecture that is capable of estimating as much of the incoming information as possible. The proposed solution has been tested, and the advantages and disadvantages of different options for this type of vision are discussed.
An indoor augmented reality mobile application for simulation of building evacuation
NASA Astrophysics Data System (ADS)
Sharma, Sharad; Jerripothula, Shanmukha
2015-03-01
Augmented Reality enables people to remain connected with the physical environment they are in, and invites them to look at the world from new and alternative perspectives. There has been increasing interest in emergency evacuation applications for mobile devices. Nearly all smartphones these days are Wi-Fi and GPS enabled. In this paper, we propose a novel emergency evacuation system that will help people safely evacuate a building in case of an emergency. It will further enhance knowledge and understanding of where the exits are in the building and of safe evacuation procedures. We applied mobile augmented reality (mobile AR) to create an application with the Unity 3D gaming engine. We show how the mobile AR application is able to display a 3D model of the building and an animation of people evacuating, using markers and a web camera. The system gives a visual representation of the building in 3D space, allowing people to see where the exits are through the use of a smartphone or tablet. Pilot studies conducted with the system showed partial success and demonstrated the effectiveness of the application for emergency evacuation. Our computer vision methods give good results when the markers are close to the camera, but accuracy decreases when the markers are far from the camera.
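The marker-plus-webcam tracking loop that anchors the 3D building model can be sketched with OpenCV's ArUco module; the paper does not state which marker library it used, so this is only an analogous illustration, written against the legacy cv2.aruco API found in opencv-contrib-python releases before 4.7 (newer releases renamed these calls). Intrinsics and marker size are assumed.

```python
import cv2
import numpy as np

aruco = cv2.aruco
dictionary = aruco.Dictionary_get(aruco.DICT_6X6_250)
params = aruco.DetectorParameters_create()

K = np.array([[900.0, 0, 320], [0, 900.0, 240], [0, 0, 1]])  # assumed intrinsics
dist = np.zeros(5)
marker_len_m = 0.05  # printed marker side length (assumed)

cap = cv2.VideoCapture(0)  # web camera, as in the described setup
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = aruco.detectMarkers(gray, dictionary, parameters=params)
    if ids is not None:
        # Pose of each marker gives where to anchor the 3D building model.
        rvecs, tvecs, _ = aruco.estimatePoseSingleMarkers(corners, marker_len_m, K, dist)
        aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("evacuation AR prototype", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
```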
Implementation of Augmented Reality Technology in Sangiran Museum with Vuforia
NASA Astrophysics Data System (ADS)
Purnomo, F. A.; Santosa, P. I.; Hartanto, R.; Pratisto, E. H.; Purbayu, A.
2018-03-01
Archaeological objects are evidence of ancient life, with ages of up to millions of years. Ancient objects discovered by the Sangiran Museum are preserved and protected from potential damage. This research develops an Augmented Reality application for the museum that displays virtual information about the ancient objects on display. The content includes information as text, audio, and animated 3D models representing the ancient objects. This study emphasizes the 3D markerless recognition process using the Vuforia Augmented Reality (AR) system, so that visitors can view the exhibition objects from different viewpoints. Based on the test results, by registering image targets at 25° angle intervals, 3D markerless keypoint features can be detected from different viewpoints. To run the application, the device must meet minimum specifications of a dual-core 1.2 GHz processor, a Power VR SG5X GPU, an 8 MP autofocus camera and 1 GB of memory. The average success rate of the AR application in detecting museum exhibition objects was 40% for 3D markerless detection with a single view, 86% for markerless multiview over angles of 0°-180°, and 100% over 0°-360°. The application's detection distance ranges from 23 cm up to 540 cm, and the average response time to detect the 3D markerless target is 12 seconds.
AR4VI: AR as an Accessibility Tool for People with Visual Impairments
Coughlan, James M.; Miele, Joshua
2017-01-01
Although AR technology has been largely dominated by visual media, a number of AR tools using both visual and auditory feedback have been developed specifically to assist people with low vision or blindness – an application domain that we term Augmented Reality for Visual Impairment (AR4VI). We describe two AR4VI tools developed at Smith-Kettlewell, as well as a number of pre-existing examples. We emphasize that AR4VI is a powerful tool with the potential to remove or significantly reduce a range of accessibility barriers. Rather than being restricted to use by people with visual impairments, AR4VI is a compelling universal design approach offering benefits for mainstream applications as well. PMID:29303163
Telescopic multi-resolution augmented reality
NASA Astrophysics Data System (ADS)
Jenkins, Jeffrey; Frenchi, Christopher; Szu, Harold
2014-05-01
To ensure a self-consistent scaling approximation, the underlying microscopic fluctuation components can naturally influence macroscopic means, which may give rise to emergent observable phenomena. In this paper, we describe a consistent macroscopic (cm-scale), mesoscopic (micron-scale), and microscopic (nano-scale) approach to introduce Telescopic Multi-Resolution (TMR) into current Augmented Reality (AR) visualization technology. We propose to couple TMR-AR by introducing an energy-matter interaction engine framework that is based on known physics, biology, and chemistry principles. An immediate payoff of TMR-AR is a self-consistent approximation of the interaction between microscopic observables and their direct effect on the macroscopic system, driven by real-world measurements. Such an interdisciplinary approach enables us not only to achieve multi-scale, telescopic visualization of real and virtual information, but also to conduct thought experiments through AR. As a result of this consistency, the framework allows us to explore a large-dimensional parameter space of measured and unmeasured regions. Toward this goal, we explore how to build learnable libraries of biological, physical, and chemical mechanisms. Fusing analytical sensors with TMR-AR libraries provides a robust framework to optimize testing and evaluation through data-driven or virtual synthetic simulations. Visualizing mechanisms of interaction requires identification of observable image features that can indicate the presence of information in multiple spatial and temporal scales of analog data. The AR methodology was originally developed to enhance pilot training, as well as the "make believe" entertainment industry, in a user-friendly digital environment. We believe TMR-AR can someday help us conduct thought experiments scientifically, to be pedagogically visualized in zoom-in-and-out, consistent, multi-scale approximations.
Rochlen, Lauryn R.; Levine, Robert; Tait, Alan R.
2016-01-01
Introduction: The value of simulation in medical education and procedural skills training is well recognized. Despite this, many mannequin-based trainers are limited by the inability of the trainee to view the internal anatomical structures. This study evaluates the usability and feasibility of a first-person point-of-view (POV) augmented reality (AR) trainer for needle insertion as a component of central venous catheter (CVC) placement. Methods: Forty subjects, including medical students and anesthesiology residents and faculty, participated. AR glasses were provided through which the relevant internal anatomical landmarks were projected. Following a practice period, participants were asked to place the needle in the mannequin without the benefit of the AR-projected internal anatomy. The ability of the trainees to correctly place the needle was documented. Participants also completed a short survey describing their perceptions of the AR technology. Results: Participants reported that the AR technology was realistic (77.5%) and that the ability to view the internal anatomy was helpful (92.5%). Furthermore, 85% and 82.1%, respectively, believed that the AR technology promoted learning and should be incorporated into medical training. The ability to successfully place the needle was similar between experienced and non-experienced participants; however, less experienced participants were more likely to inadvertently puncture the carotid artery. Conclusions: Results of this pilot study demonstrated the usability and feasibility of AR technology as a potentially important adjunct to simulated medical skills training. Further development and evaluation of this innovative technology under a variety of simulated medical training settings would be an important next step. PMID:27930431
Augmented reality in healthcare education: an integrative review.
Zhu, Egui; Hadadgar, Arash; Masiello, Italo; Zary, Nabil
2014-01-01
Background. The effective development of healthcare competencies poses great educational challenges. A possible approach to providing learning opportunities is the use of augmented reality (AR), in which virtual learning experiences can be embedded in a real physical context. The aim of this study was to provide a comprehensive overview of the current state of the art in terms of user acceptance, the AR applications developed, and the effect of AR on the development of competencies in healthcare. Methods. We conducted an integrative review. Integrative reviews are the broadest type of research review method, allowing for the inclusion of various research designs to more fully understand a phenomenon of concern. Our review included multi-disciplinary research publications in English reported until 2012. Results. 2529 research papers were found in ERIC, CINAHL, Medline, PubMed, Web of Science and SpringerLink. Three qualitative, 20 quantitative and 2 mixed studies were included. Using a thematic analysis, we described three aspects related to research, technology and education. This study showed that AR has been applied to a wide range of topics in healthcare education. Furthermore, acceptance of AR as a learning technology was reported among the learners, as was its potential for improving different types of competencies. Discussion. AR is still considered a novelty in the literature. Most of the studies reported early prototypes. Also, the AR applications designed lacked an explicit pedagogical theoretical framework. Finally, the learning strategies adopted were of the traditional 'see one, do one, teach one' style and did not integrate clinical competencies to ensure patient safety.
NASA Astrophysics Data System (ADS)
Ribeiro, Allan; Santos, Helen
With the advent of new information and communication technologies (ICTs), communicative interaction changes the way people are and act, and at the same time changes work activities related to education. Within the range of possibilities provided by the advancement of computational resources, virtual reality (VR) and augmented reality (AR) stand out as new forms of information visualization in computer applications. While VR allows user interaction with a totally computer-generated virtual environment, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as nanoscale and low-dimensional systems, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications developed with the use of modeling and simulation, in computational approaches to topics related to nanoscience and nanotechnology, and articulated with innovative pedagogical practices.
OLED microdisplays in near-to-eye applications: challenges and solutions
NASA Astrophysics Data System (ADS)
Vogel, Uwe; Richter, Bernd; Wartenberg, Philipp; König, Peter; Hild, Olaf R.; Fehse, Karsten; Schober, Matthias; Bodenstein, Elisabeth; Beyer, Beatrice
2017-06-01
Wearable augmented reality (AR) has already started to be used productively, mainly in the manufacturing industry and logistics. The next step will be to move wearable AR from "professionals to citizens" by enabling networked, everywhere augmented reality (in-/outdoor localisation, scene recognition, cloud access, ...) which is non-intrusive, exhibits intuitive user interaction, is safe and secure to use at any time, and considers personal privacy protection (the user's and others'). Various hardware improvements (e.g., low power, seamless interactivity, small form factor, ergonomics, ...), as well as connectivity and network integration, will become vital for consumer adoption. Smart glasses (i.e., near-to-eye (NTE) displays) have evolved as major devices for wearable AR and hold the potential to be adopted by consumers soon. Tiny microdisplays are a key component of smart glasses, e.g., creating images with organic light-emitting diodes (OLEDs), which have become popular in mobile phone displays. All microdisplay technologies on the market comprise an image-creating pixel modulation, but only the emissive ones (for example, OLED and LED) combine the image and light source in a single device, and therefore do not require an external light source. This minimizes system size and power consumption, while providing exceptional contrast and color space. These advantages make OLED microdisplays a perfect fit for near-eye applications. A low-power active-matrix CMOS backplane architecture, embedded sensors, emission spectra outside the visible range, and high-resolution sub-pixel micro-patterning address some of the application challenges (e.g., long battery life, sunlight readability, user interaction modes) and enable advanced features for OLED microdisplays in near-to-eye displays, e.g., upcoming connected augmented reality smart glasses. This report analyzes the challenges in addressing those features and discusses solutions.
Bourdel, Nicolas; Collins, Toby; Pizarro, Daniel; Bartoli, Adrien; Da Ines, David; Perreira, Bruno; Canis, Michel
2017-01-01
Augmented Reality (AR) is a technology that can allow a surgeon to see subsurface structures. It works by overlaying information from another modality, such as MRI, and fusing it in real time with the endoscopic images. AR has never been developed for a very mobile organ like the uterus and has never been used in gynecology. Myomas are not always easy to localize in laparoscopic surgery when they do not significantly change the surface of the uterus or when they are at multiple locations. The aim was to study the accuracy of myoma localization using a new AR system compared to MRI-only localization. Ten residents were asked to localize six myomas (on a uterine model inside a laparoscopic box trainer) either using AR or under conditions simulating the standard method (only the MRI was available). Myomas were randomly divided into two groups: the control group (MRI only, AR not activated) and the AR group (AR activated). Software was used to automatically measure the distance between the point of contact on the uterine surface and the myoma. We compared these distances to the true shortest distance to obtain accuracy measures. The time taken to perform the task was measured, and an assessment of the complexity was performed. The mean accuracy in the control group was 16.80 mm [0.1-52.2] versus 0.64 mm [0.01-4.71] with AR. In the control group, the mean time to perform the task was 18.68 [6.4-47.1] s compared to 19.6 [3.9-77.5] s with AR. The mean difficulty score (evaluated for each myoma) was 2.36 [1-4] versus 0.87 [0-4], respectively, for the control and the AR group. We developed an AR system for a very mobile organ. This is the first user study to quantitatively evaluate an AR system for improving a surgical task. In our model, AR improves localization accuracy.
Systematic review on the effectiveness of augmented reality applications in medical training.
Barsom, E Z; Graafland, M; Schijven, M P
2016-10-01
Computer-based applications are increasingly used to support the training of medical professionals. Augmented reality applications (ARAs) render an interactive virtual layer on top of reality. The use of ARAs is of real interest to medical education because they blend digital elements with the physical learning environment, resulting in new educational opportunities. The aim of this systematic review is to investigate to what extent augmented reality applications are currently used to validly support the training of medical professionals. PubMed, Embase, INSPEC and PsycINFO were searched using predefined inclusion criteria for relevant articles up to August 2015. All study types were considered eligible. Articles concerning AR applications used to train or educate medical professionals were evaluated. Twenty-seven studies were found relevant, describing a total of seven augmented reality applications. The applications were assigned to three different categories. The first category is directed toward laparoscopic surgical training, the second toward mixed-reality training of neurosurgical procedures, and the third toward training in echocardiography. Statistical pooling of data could not be performed due to heterogeneity of study designs. Face, construct and concurrent validity were proven for two applications directed at laparoscopic training, face and construct validity for the neurosurgical procedures, and face, content and construct validity for echocardiography training. In the literature, none of the ARAs completed a full validation process for their purpose of use. Augmented reality applications that support blended learning in medical training have gained public and scientific interest. In order to be of value, applications must be able to transfer information to the user. Although promising, the literature to date is lacking in evidence to support this.
AR Based App for Tourist Attraction in ESKİ ÇARŞI (Safranbolu)
NASA Astrophysics Data System (ADS)
Polat, Merve; Rakıp Karaş, İsmail; Kahraman, İdris; Alizadehashrafi, Behnam
2016-10-01
This research deals with the 3D modeling of historical and heritage landmarks of Safranbolu that are registered by UNESCO. It is an Augmented Reality (AR)-based project intended to trigger virtual three-dimensional (3D) models, cultural music, historical photos, artistic features and animated text information. The aim is to propose a GIS-based approach with these features and to add them to the system as attribute data in a relational database. The database will be made available in an AR-based application to provide information for tourists.
Navigation surgery using an augmented reality for pancreatectomy.
Okamoto, Tomoyoshi; Onda, Shinji; Yasuda, Jungo; Yanaga, Katsuhiko; Suzuki, Naoki; Hattori, Asaki
2015-01-01
The aim of this study was to evaluate the utility of navigation surgery using augmented reality technology (AR-based NS) for pancreatectomy. The 3D reconstructed images from CT were created by segmentation. The initial registration was performed using an optical location sensor. The reconstructed images were superimposed onto the real organs in the monitor display. Of the 19 patients who had undergone hepatobiliary and pancreatic surgery using AR-based NS, the accuracy, visualization ability, and utility of our system were assessed in the five cases with pancreatectomy. The position of each organ in the surface-rendered image corresponded closely to that of the actual organ. Reference to the displayed image allowed for safe dissection while preserving the adjacent vessels or organs. The locations of the lesions and the resection line on the targeted organ were overlaid on the operating field. The initial mean registration error was improved to approximately 5 mm by our refinements. However, several problems such as registration accuracy, portability and cost still remain. AR-based NS contributed to accurate and effective surgical resection in pancreatectomy. The pancreas appears to be a suitable organ for further investigations. This technology is promising for improving surgical quality, training, and education. © 2015 S. Karger AG, Basel.
Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J.; Latorre, José M.; Rodriguez-Jimenez, Roberto
2017-01-01
This perspective paper addresses the future of alternative treatments that take advantage of a social and cognitive approach with regard to pharmacological therapy of auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH are the perception of voices in the absence of auditory stimulation and represent a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain-computer interfaces (BCI) are technologies that are being used more and more in different medical and psychological applications. Our position is that their combined use in computer-based therapies offers still unforeseen possibilities for the treatment of physical and mental disabilities. For this reason, the paper anticipates that researchers and clinicians will follow a pathway toward human-avatar symbiosis for AVH by taking full advantage of new technologies. This outlook entails addressing challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and in the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis. PMID:29209193
A Fog Computing and Cloudlet Based Augmented Reality System for the Industry 4.0 Shipyard.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Vilar-Montesinos, Miguel
2018-06-02
Augmented Reality (AR) is one of the key technologies pointed out by Industry 4.0 as a tool for enhancing the next generation of automated and computerized factories. AR can also help shipbuilding operators, since they usually need to interact with information (e.g., product datasheets, instructions, maintenance procedures, quality control forms) that could be handled easily and more efficiently through AR devices. This is the reason why Navantia, one of the 10 largest shipbuilders in the world, is studying the application of AR (among other technologies) in different shipyard environments in a project called "Shipyard 4.0". This article presents Navantia's industrial AR (IAR) architecture, which is based on cloudlets and on the fog computing paradigm. Both technologies are ideal for supporting physically-distributed, low-latency and QoS-aware applications that decrease the network traffic and the computational load of traditional cloud computing systems. The proposed IAR communications architecture is evaluated in real-world scenarios with payload sizes according to demanding Microsoft HoloLens applications and when using a cloud, a cloudlet and a fog computing system. The results show that, in terms of response delay, the fog computing system is the fastest when transferring small payloads (less than 128 KB), while for larger file sizes, the cloudlet solution is faster than the others. Moreover, under high loads (with many concurrent IAR clients), the cloudlet in some cases is more than four times faster than the fog computing system in terms of response delay.
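The response-delay comparison between cloud, cloudlet and fog deployments can be sketched with a simple measurement client that POSTs payloads of different sizes and averages the round-trip time. The endpoints, payload sizes, and repetition count below are placeholders, not the architecture or workloads evaluated in the article.

```python
import time
import statistics
import requests

# Hypothetical endpoints standing in for the cloud, cloudlet and fog nodes.
ENDPOINTS = {
    "cloud":    "http://cloud.example.com/iar/upload",
    "cloudlet": "http://cloudlet.local:8080/iar/upload",
    "fog":      "http://fog-gateway.local:5000/iar/upload",
}

def measure_response_delay(url, payload_kb, repetitions=20):
    """Return the mean round-trip delay (ms) for POSTing a payload of the given size."""
    payload = b"x" * (payload_kb * 1024)
    delays = []
    for _ in range(repetitions):
        start = time.perf_counter()
        requests.post(url, data=payload, timeout=10)
        delays.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(delays)

for name, url in ENDPOINTS.items():
    for size_kb in (32, 128, 512):  # small vs. larger IAR payloads
        print(name, size_kb, "KB:", round(measure_response_delay(url, size_kb), 1), "ms")
```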
Onda, Shinji; Okamoto, Tomoyoshi; Kanehira, Masaru; Suzuki, Fumitake; Ito, Ryusuke; Fujioka, Shuichi; Suzuki, Naoki; Hattori, Asaki; Yanaga, Katsuhiko
2014-04-01
In pancreaticoduodenectomy (PD), early ligation of the inferior pancreaticoduodenal artery (IPDA) before the efferent veins has been advocated to decrease blood loss caused by congestion of the pancreatic head to be resected. In this study, we report the utility of early identification of the IPDA using an augmented reality (AR)-based navigation system (NS). Seven non-consecutive patients underwent PD using the AR-based NS. After paired-point matching registration, the reconstructed image obtained from preoperative computed tomography (CT) was fused with the real-time operative-field image and displayed on 3D monitors. The vascular reconstructed images, including the superior mesenteric artery, jejunal artery, and IPDA, were visualized to facilitate image-guided surgical procedures. We compared the operating time and intraoperative blood loss of six patients who successfully underwent identification of the IPDA using the AR-based NS (group A) with nine patients who underwent early ligation of the IPDA without AR (group B) and 18 patients who underwent conventional PD (group C). The IPDA or the jejunal artery was rapidly identified and ligated in six patients. The mean operating time and intraoperative blood loss in group A were 415 min and 901 ml, respectively. There was no significant difference in operating time and intraoperative blood loss among the groups. The AR-based NS provided precise anatomical information, which allowed the surgeons to rapidly identify and perform early ligation of the IPDA in PD. © 2013 Japanese Society of Hepato-Biliary-Pancreatic Surgery.
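Paired-point matching registration of this kind is commonly solved as a rigid alignment of corresponding fiducials via singular value decomposition (the Kabsch method). The sketch below illustrates that generic computation under stated assumptions; it is not the navigation system's implementation, and the fiducial coordinates are invented.

```python
import numpy as np

def paired_point_registration(model_pts, patient_pts):
    """Rigid (rotation + translation) alignment of preoperative model points
    to their intraoperatively digitized counterparts, via SVD (Kabsch method).

    model_pts, patient_pts -- (N, 3) arrays of corresponding fiducial positions
    Returns R (3x3), t (3,) such that R @ model + t approximates patient.
    """
    mc, pc = model_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (model_pts - mc).T @ (patient_pts - pc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = pc - R @ mc
    return R, t

# Hypothetical fiducials: CT-model coordinates and their tracked positions
model = np.array([[0, 0, 0], [100, 0, 0], [0, 80, 0], [0, 0, 60]], dtype=float)
patient = (model @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T) + np.array([10, 5, -3])
R, t = paired_point_registration(model, patient)
fre = np.linalg.norm((model @ R.T + t) - patient, axis=1)
print("Fiducial registration error (mm):", fre.round(3))
```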
2015-09-01
ARB: Aircraft Recovery Bulletins; AR: Augmented Reality; CAG: Carrier Air Group; CATCC: Carrier Air Traffic Control Center. The use of MOVLAS on an aircraft carrier represents a direct communication link between the LSO and pilot, serving as a backup landing aid system.
Towards cybernetic surgery: robotic and augmented reality-assisted liver segmentectomy.
Pessaux, Patrick; Diana, Michele; Soler, Luc; Piardi, Tullio; Mutter, Didier; Marescaux, Jacques
2015-04-01
Augmented reality (AR) in surgery consists of the fusion of synthetic computer-generated images (a 3D virtual model), obtained from the preoperative medical imaging workup, with real-time patient images, in order to visualize unapparent anatomical details. The 3D model can also be used for preoperative planning of the procedure. The potential of AR navigation as a tool to improve the safety of surgical dissection is outlined for robotic hepatectomy. Three patients underwent a fully robotic and AR-assisted hepatic segmentectomy. The 3D virtual anatomical model was obtained from a thoracoabdominal CT scan using dedicated software (VR-RENDER®, IRCAD). The model was then processed using a VR-RENDER® plug-in application, the Virtual Surgical Planning (VSP®, IRCAD), to delineate surgical resection planes, including the elective ligature of vascular structures. Deformations associated with pneumoperitoneum were also simulated. The virtual model was superimposed on the operative field. A computer scientist manually registered virtual and real images in real time using a video mixer (MX 70; Panasonic, Secaucus, NJ). Two fully robotic AR-assisted segmentectomies of segment V and one of segment VI were performed. AR allowed for the precise and safe recognition of all major vascular structures during the procedure. The total time required to obtain AR was 8 min (range 6-10 min). Each registration (alignment of the vascular anatomy) required a few seconds. Hepatic pedicle clamping was never performed. At the end of the procedure, the remnant liver was correctly vascularized. Resection margins were negative in all cases. The postoperative period was uneventful and without perioperative transfusion. AR is a valuable navigation tool which may enhance the ability to achieve safe surgical resection during robotic hepatectomy.
Learning Application of Astronomy Based Augmented Reality using Android Platform
NASA Astrophysics Data System (ADS)
Maleke, B.; Paseru, D.; Padang, R.
2018-02-01
Astronomy is a branch of science involving observations of celestial bodies such as stars, planets, nebulae, comets, star clusters, and galaxies, as well as natural phenomena occurring outside the Earth's atmosphere. Astronomy can be learned in various ways, for example from books or by direct observation with a telescope. Both approaches have shortcomings: books present the material only in the form of 2D drawings, while learning with a telescope requires fairly expensive equipment. This study presents a more engaging way of learning, namely through an Augmented Reality (AR) application on the Android platform. Augmented Reality is a computer-generated combination of the virtual and the real world. Virtual objects can be text, animations, 3D models or videos that are combined with the actual environment so that users perceive the virtual objects as part of their surroundings. By using the Android platform, this application makes the learning method more engaging because it can be used on a wide range of Android smartphones, so that learning can take place anytime and anywhere. The methodology used in building the application is the Multimedia Lifecycle, with the C# language for the AR programming and flowcharts as a modelling tool. Tests with users showed that the application runs well and can serve as a more engaging alternative way of learning Astronomy.
Augmented Reality Mentor for Training Maintenance Procedures: Interim Assessment
2014-08-01
Appendix C: AR Mentor usability questionnaire; Appendix D: AR Mentor usability interview protocol; Appendix F: instructor interview protocol; Appendix G: mechanic interview protocol. Such systems free up the opportunity for instruction around higher-order problem solving. They may also reduce the burden on peer "helpers".
Participatory Scaling through Augmented Reality Learning through Local Games
ERIC Educational Resources Information Center
Martin, John; Dikkers, Seann; Squire, Kurt; Gagnon, David
2014-01-01
The proliferation of broadband mobile devices, which many students bring to school with them as mobile phones, makes the widespread adoption of AR pedagogies a possibility, but pedagogical, distribution, and training models are needed to make this innovation an integrated part of education. This paper employs Social Construction of Technology…
Developing Mobile Support System for Augmented Reality in Clinical Constraint Teaching.
Weng, Wei Kai; Hsu, Han Jen
2018-01-01
How to improve the efficiency of education has always been a major topic. Reply teaching and image-based teaching have consistently performed better than textbooks, and adopting AR technology is a promising way to reinforce them further. That is the main purpose of this project.
Application of Virtual and Augmented reality to geoscientific teaching and research.
NASA Astrophysics Data System (ADS)
Hodgetts, David
2017-04-01
The geological sciences are ideal candidates for the application of Virtual Reality (VR) and Augmented Reality (AR). Digital data collection techniques such as laser scanning, digital photogrammetry and the increasing use of Unmanned Aerial Vehicle (UAV) or Small Unmanned Aircraft (SUA) technology allow us to collect large datasets efficiently and ever more affordably. This, linked with the recent resurgence in VR and AR technologies, makes these 3D digital datasets even more valuable. These advances in VR and AR have been further supported by rapid improvements in graphics card technologies and by the development of high-performance software applications. Visualising data in VR is more complex than normal 3D rendering: consideration needs to be given to latency, frame rate and the comfort of the viewer to enable reasonably long immersion times. Each frame has to be rendered from two viewpoints (one for each eye), requiring twice the rendering effort of normal monoscopic views. Any unnatural effects (e.g. incorrect lighting) can lead to an uncomfortable VR experience, so these have to be minimised. With large digital outcrop datasets comprising tens to hundreds of millions of triangles, this is challenging but achievable. Apart from the obvious "wow factor" of VR, there are some serious applications. It is often the case that users of digital outcrop data do not appreciate the size of the features they are dealing with. This is not the case when using correctly scaled VR, where a true sense of scale can be achieved. In addition, VR provides an excellent way of performing quality control on 3D models and interpretations, as errors are much more easily visible. VR models can then be used to create content for AR applications, closing the loop and taking interpretations back into the field.
A novel augmented reality simulator for skills assessment in minimal invasive surgery.
Lahanas, Vasileios; Loukas, Constantinos; Smailis, Nikolaos; Georgiou, Evangelos
2015-08-01
Over the past decade, simulation-based training has come to the foreground as an efficient method for training and assessment of surgical skills in minimally invasive surgery. Box trainers and virtual reality (VR) simulators have been introduced in teaching curricula and have substituted to some extent the traditional model of training based on animals or cadavers. Augmented reality (AR) is a new technology that allows blending of VR elements and real objects within a real-world scene. In this paper, we present a novel AR simulator for assessment of basic laparoscopic skills. The components of the proposed system include a box trainer, a camera and a set of laparoscopic tools equipped with custom-made sensors that allow interaction with VR training elements. Three AR tasks were developed, focusing on basic skills such as perception of depth of field, hand-eye coordination and bimanual operation. The construct validity of the system was evaluated via a comparison between two experience groups: novices with no experience in laparoscopic surgery and experienced surgeons. The observed metrics included task execution time, tool pathlength and two task-specific errors. The study also included a feedback questionnaire requiring participants to evaluate the face validity of the system. Between-group comparison demonstrated highly significant differences (P < 0.01) in all performance metrics and tasks, denoting the simulator's construct validity. Qualitative analysis of the instruments' trajectories highlighted differences between novices and experts regarding smoothness and economy of motion. Subjects' ratings on the feedback questionnaire supported the face validity of the training system. The results highlight the potential of the proposed simulator to discriminate between groups with different expertise, providing a proof of concept for the potential use of AR as a core technology for laparoscopic simulation training.
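The construct-validity analysis described above boils down to a non-parametric comparison of each performance metric between the novice and expert groups. As a rough illustration (with invented task-time values, not the study's data), such a comparison can be run as follows.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical task execution times in seconds (not the study's data)
novices = np.array([95, 120, 110, 132, 101, 125, 140, 118])
experts = np.array([62, 70, 58, 75, 66, 71, 69, 64])

# Two-sided Mann-Whitney U test on a single performance metric
stat, p_value = mannwhitneyu(novices, experts, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")  # p < 0.01 would support construct validity
```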
FlyAR: augmented reality supported micro aerial vehicle navigation.
Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard
2014-04-01
Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context, automatic flight path planning and autonomous flying are often applied, but so far they cannot fully replace the human in the loop who supervises the flight on-site to ensure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle's position between 2D map positions and the physical environment, and the complicated depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user's view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints supporting the user in understanding the spatial relationship of virtual waypoints in the physical world and investigate the effect of these visualization techniques on spatial understanding.
Design of Mobile Augmented Reality in Health Care Education: A Theory-Driven Framework.
Zhu, Egui; Lilienthal, Anneliese; Shluzas, Lauren Aquino; Masiello, Italo; Zary, Nabil
2015-09-18
Augmented reality (AR) is increasingly used across a range of subject areas in health care education as health care settings partner to bridge the gap between knowledge and practice. As the first contact with patients, general practitioners (GPs) are important in the battle against a global health threat, the spread of antibiotic resistance. AR has potential as a practical tool for GPs to combine learning and practice in the rational use of antibiotics. This paper was driven by learning theory to develop a mobile augmented reality education (MARE) design framework. The primary goal of the framework is to guide the development of AR educational apps. This study focuses on (1) identifying suitable learning theories for guiding the design of AR education apps, (2) integrating learning outcomes and learning theories to support health care education through AR, and (3) applying the design framework in the context of improving GPs' rational use of antibiotics. The design framework was first constructed with the conceptual framework analysis method. Data were collected from multidisciplinary publications and reference materials and were analyzed with directed content analysis to identify key concepts and their relationships. Then the design framework was applied to a health care educational challenge. The proposed MARE framework consists of three hierarchical layers: the foundation, function, and outcome layers. Three learning theories-situated, experiential, and transformative learning-provide foundational support based on differing views of the relationships among learning, practice, and the environment. The function layer depends upon the learners' personal paradigms and indicates how health care learning could be achieved with MARE. The outcome layer analyzes different learning abilities, from knowledge to the practice level, to clarify learning objectives and expectations and to avoid teaching pitched at the wrong level. Suggestions for learning activities and the requirements of the learning environment form the foundation for AR to fill the gap between learning outcomes and medical learners' personal paradigms. With the design framework, the expected rational use of antibiotics by GPs is described and is easy to execute and evaluate. The comparison of specific expected abilities with the GP personal paradigm helps solidify the GP practical learning objectives and helps design the learning environment and activities. The learning environment and activities were supported by learning theories. This paper describes a framework for guiding the design, development, and application of mobile AR for medical education in the health care setting. The framework is theory driven with an understanding of the characteristics of AR and specific medical disciplines toward helping medical education improve professional development from knowledge to practice. Future research will use the framework as a guide for developing AR apps in practice to validate and improve the design framework.
NASA Astrophysics Data System (ADS)
Maurer, Calvin R., Jr.; Sauer, Frank; Hu, Bo; Bascle, Benedicte; Geiger, Bernhard; Wenzel, Fabian; Recchi, Filippo; Rohlfing, Torsten; Brown, Christopher R.; Bakos, Robert J.; Maciunas, Robert J.; Bani-Hashemi, Ali R.
2001-05-01
We are developing a video see-through head-mounted display (HMD) augmented reality (AR) system for image-guided neurosurgical planning and navigation. The surgeon wears a HMD that presents him with the augmented stereo view. The HMD is custom fitted with two miniature color video cameras that capture a stereo view of the real-world scene. We are concentrating specifically at this point on cranial neurosurgery, so the images will be of the patient's head. A third video camera, operating in the near infrared, is also attached to the HMD and is used for head tracking. The pose (i.e., position and orientation) of the HMD is used to determine where to overlay anatomic structures segmented from preoperative tomographic images (e.g., CT, MR) on the intraoperative video images. Two SGI 540 Visual Workstation computers process the three video streams and render the augmented stereo views for display on the HMD. The AR system operates in real time at 30 frames/sec with a temporal latency of about three frames (100 ms) and zero relative lag between the virtual objects and the real-world scene. For an initial evaluation of the system, we created AR images using a head phantom with actual internal anatomic structures (segmented from CT and MR scans of a patient) realistically positioned inside the phantom. When using shaded renderings, many users had difficulty appreciating overlaid brain structures as being inside the head. When using wire frames and texture-mapped dot patterns, most users correctly visualized brain anatomy as being internal and could generally appreciate spatial relationships among various objects. The 3D perception of these structures is based on both stereoscopic depth cues and kinetic depth cues, with the user looking at the head phantom from varying positions. The perception of the augmented visualization is natural and convincing. The brain structures appear rigidly anchored in the head, manifesting little or no apparent swimming or jitter. The initial evaluation of the system is encouraging, and we believe that AR visualization might become an important tool for image-guided neurosurgical planning and navigation.
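The overlay step described above amounts to projecting the segmented, registered 3D structures into each HMD camera using the tracked pose. A minimal sketch of that projection (pinhole model; the intrinsics, pose and points below are illustrative values, not the authors' calibration) could look like this.

```python
import numpy as np

def project_points(points_world: np.ndarray,
                   R: np.ndarray, t: np.ndarray,
                   K: np.ndarray) -> np.ndarray:
    """Project 3-D points (N x 3, world frame) into pixel coordinates.

    R, t: pose of the world in the camera frame (from the head tracker).
    K:    3x3 camera intrinsics of the HMD video camera.
    """
    cam = (R @ points_world.T + t.reshape(3, 1))   # world -> camera frame
    uv = K @ cam                                    # camera -> image plane
    return (uv[:2] / uv[2]).T                       # perspective divide -> (N, 2)

# Toy intrinsic matrix and an identity orientation, purely for illustration
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R, t = np.eye(3), np.array([0.0, 0.0, 0.5])
pts = np.array([[0.0, 0.0, 0.0], [0.05, 0.0, 0.0]])   # metres
print(project_points(pts, R, t, K))
```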
Towards multi-platform software architecture for Collaborative Teleoperation
NASA Astrophysics Data System (ADS)
Domingues, Christophe; Otmane, Samir; Davesne, Frederic; Mallem, Malik
2009-03-01
Augmented Reality (AR) can provide a Human Operator (HO) with real help in achieving complex tasks, such as remote control of robots and cooperative teleassistance. Using appropriate augmentations, the HO can interact faster, safer and more easily with the remote real world. In this paper, we present an extension of an existing distributed software and network architecture for collaborative teleoperation based on networked human-scaled mixed reality and a mobile platform. The first teleoperation system was composed of a VR application and a Web application; however, the two systems could not be used together, making it impossible to control a distant robot simultaneously. Our goal is to update the teleoperation system to permit heterogeneous collaborative teleoperation between the two platforms. An important feature of this interface is the use of different Virtual Reality platforms and different mobile platforms to control one or many robots.
A Framework for Low-Cost Multi-Platform VR and AR Site Experiences
NASA Astrophysics Data System (ADS)
Wallgrün, J. O.; Huang, J.; Zhao, J.; Masrur, A.; Oprean, D.; Klippel, A.
2017-11-01
Low-cost consumer-level immersive solutions have the potential to revolutionize education and research in many fields by providing virtual experiences of sites that are either inaccessible, too dangerous, or too expensive to visit, or by augmenting in-situ experiences using augmented and mixed reality methods. We present our approach for creating low-cost multi-platform virtual and augmented reality site experiences of real world places for education and research purposes, making extensive use of Structure-from-Motion methods as well as 360° photography and videography. We discuss several example projects, for the Mayan City of Cahal Pech, Iceland's Thrihnukar volcano, the Santa Marta informal settlement in Rio, and for the Penn State Campus, and we propose a framework for creating and maintaining such applications by combining declarative content specification methods with a central linked-data based spatio-temporal information system.
Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens
NASA Astrophysics Data System (ADS)
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-09-01
In the fields of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress in recent years and have also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted displays, which allow one to embed virtual objects into the real surroundings, leading to a Mixed Reality (MR) experience. In such an environment, digital and real objects do not only coexist but are also able to interact with each other in real time. These concepts can be used to merge human perception of reality with digitally visualized sensor data, thereby making the invisible visible. As a first example, alongside the basic idea of this column, we introduce in this paper an MR experiment in thermodynamics for a laboratory course for freshman students in physics and other science and engineering subjects, which uses physical data from mobile devices to analyze and display physical phenomena to students.
Augmented reality system for CT-guided interventions: system description and initial phantom trials
NASA Astrophysics Data System (ADS)
Sauer, Frank; Schoepf, Uwe J.; Khamene, Ali; Vogt, Sebastian; Das, Marco; Silverman, Stuart G.
2003-05-01
We are developing an augmented reality (AR) image guidance system in which information derived from medical images is overlaid onto a video view of the patient. The interventionalist wears a head-mounted display (HMD) that presents him with the augmented stereo view. The HMD is custom fitted with two miniature color video cameras that capture the stereo view of the scene. A third video camera, operating in the near IR, is also attached to the HMD and is used for head tracking. The system achieves real-time performance of 30 frames per second. The graphics appear firmly anchored in the scene, without any noticeable swimming, jitter or time lag. For the application of CT-guided interventions, we extended our original prototype system to include tracking of a biopsy needle to which we attached a set of optical markers. The AR visualization provides very intuitive guidance for planning and placement of the needle and reduces radiation to patient and radiologist. We used an interventional abdominal phantom with simulated liver lesions to perform an initial set of experiments. The users were consistently able to locate the target lesion with the first needle pass. These results provide encouragement to move the system towards clinical trials.
Augmented Reality Robot-assisted Radical Prostatectomy: Preliminary Experience.
Porpiglia, Francesco; Fiori, Cristian; Checcucci, Enrico; Amparore, Daniele; Bertolo, Riccardo
2018-05-01
To present our preliminary experience with augmented reality robot-assisted radical prostatectomy (AR-RARP). From June to August 2017, patients who were candidates for RARP were enrolled and underwent high-resolution multiparametric magnetic resonance imaging (1-mm slices) according to a dedicated protocol. The resulting three-dimensional (3D) reconstruction was integrated into the robotic console to perform AR-RARP. According to the staging at magnetic resonance imaging or reconstruction, in cases of cT2 prostate cancer, intrafascial nerve sparing (NS) was performed: a mark was placed on the prostate capsule to indicate the virtual underlying intraprostatic lesion; in cases of cT3, standard NS AR-RARP was scheduled with AR-guided biopsy at the level of suspected extracapsular extension (ECE). Prostate specimens were scanned to assess the concordance of the 3D model. Sixteen patients underwent the intrafascial NS technique (cT2), whereas 14 underwent standard NS plus selective biopsy of suspected ECE (cT3). Final pathology confirmed clinical staging. The positive surgical margin rate was 30% (no positive surgical margins in pT2). In patients whose intraprostatic lesions were marked, final pathology confirmed the lesion location. In patients with suspected ECE, AR-guided selective biopsies confirmed the ECE location, with 11 of 14 biopsies (78%) positive for prostate cancer. Scanning of the prostate specimens showed good overlap. The mismatch between the 3D reconstruction and the scan ranged from 1 to 5 mm; across 85% of the entire surface, the mismatch was <3 mm. In our preliminary experience, AR-RARP seems to be safe and effective, and the accuracy of the 3D reconstruction seems promising. This technology still has limitations: the virtual models are manually oriented and rigid. Future collaborations with bioengineers will allow these limitations to be overcome. Copyright © 2018 Elsevier Inc. All rights reserved.
Implementation of augmented reality in operative dentistry learning.
Llena, C; Folguera, S; Forner, L; Rodríguez-Lozano, F J
2018-02-01
To evaluate the efficacy of augmented reality (AR) in the gaining of knowledge and skills amongst dental students in the design of cavity preparations, and to analyse their degree of satisfaction. AR cavity models were prepared for use with computers and mobile devices. Forty-one students were divided into two groups (traditional teaching methods vs AR). Questionnaires were designed to evaluate knowledge and skills, with the administration of a satisfaction questionnaire for those using AR. The degree of compliance with the standards in cavity design was assessed. The Mann-Whitney U-test was used to compare knowledge and skills between the two groups, and the Wilcoxon test was applied to compare intragroup differences. The chi-square test in turn was used to compare the qualitative parameters of the cavity designs between the groups. Statistical significance was considered for P<.05 in all cases. No significant differences were observed in level of knowledge before, immediately after or 6 months after teaching between the two groups (P>.05). Although the results corresponding to most of the studied skills parameters were better in the experimental group, significant differences (P<.05) were only found for cavity depth and extent in Class I cavities and for divergence of the buccal and lingual walls in Class II cavities. The experience was rated as favourable or very favourable by 100% of the participants. The students showed a preference for computers (60%) vs mobile devices (10%). The AR techniques favoured the gaining of knowledge and skills and were regarded as a useful tool by the students. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Ntourakis, Dimitrios; Memeo, Ricardo; Soler, Luc; Marescaux, Jacques; Mutter, Didier; Pessaux, Patrick
2016-02-01
Modern chemotherapy achieves such shrinking of colorectal cancer liver metastases (CRLM) that they may disappear from radiological imaging. Disappearing CRLM rarely represent a complete pathological remission and carry an important risk of recurrence. Augmented reality (AR) consists of fusing real-time patient images with a computer-generated 3D virtual patient model created from pre-operative medical imaging. The aim of this prospective pilot study is to investigate the potential of AR navigation as a tool to help locate and surgically resect missing CRLM. A 3D virtual anatomical model was created from thoracoabdominal CT scans using customary software (VR-RENDER®, IRCAD). The virtual model was superimposed onto the operative field using an exoscope (VITOM®, Karl Storz, Tüttlingen, Germany). Virtual and real images were manually registered in real time using a video mixer, based on external anatomical landmarks, with an estimated accuracy of 5 mm. This modality was tested in three patients, with four missing CRLM of 12 to 24 mm in size, undergoing laparotomy after receiving pre-operative oxaliplatin-based chemotherapy. AR display and fine registration were performed within 6 min. AR helped detect all four missing CRLM and guided their resection. In all cases the planned safety margin of 1 cm was clear and resections were confirmed to be R0 by pathology. There was no major postoperative morbidity or mortality. No local recurrence occurred during the follow-up period of 6-22 months. This initial experience suggests that AR may be a helpful navigation tool for the resection of missing CRLM.
Bringing Abstract Academic Integrity and Ethical Concepts into Real-Life Situations
ERIC Educational Resources Information Center
Kwong, Theresa; Wong, Eva; Yue, Kevin
2017-01-01
This paper reports the learning analytics on the initial stages of a large-scale, government-funded project which inducts university students in Hong Kong into consideration of academic integrity and ethics through mobile Augmented Reality (AR) learning trails--Trails of Integrity and Ethics (TIEs)--accessed on smart devices. The trails immerse…
Science Spots AR: A Platform for Science Learning Games with Augmented Reality
ERIC Educational Resources Information Center
Laine, Teemu H.; Nygren, Eeva; Dirin, Amir; Suk, Hae-Jung
2016-01-01
Lack of motivation and of real-world relevance have been identified as reasons for low interest in science among children. Game-based learning and storytelling are prominent methods for generating intrinsic motivation in learning. Real-world relevance requires connecting abstract scientific concepts with the real world. This can be done by…
Embodied information behavior, mixed reality and big data
NASA Astrophysics Data System (ADS)
West, Ruth; Parola, Max J.; Jaycen, Amelia R.; Lueg, Christopher P.
2015-03-01
A renaissance in the development of virtual (VR), augmented (AR), and mixed reality (MR) technologies with a focus on consumer and industrial applications is underway. As data becomes ubiquitous in our lives, a need arises to revisit the role of our bodies, explicitly in relation to data or information. Our observation is that VR/AR/MR technology development is a vision of the future framed in terms of promissory narratives. These narratives develop alongside the underlying enabling technologies and create new use contexts for virtual experiences. It is a vision rooted in the combination of responsive, interactive, dynamic, sharable data streams, and augmentation of the physical senses for capabilities beyond those normally humanly possible. In parallel to the varied definitions of information and approaches to elucidating information behavior, a myriad of definitions and methods of measuring and understanding presence in virtual experiences exist. These and other ideas will be tested by designers, developers and technology adopters as the broader ecology of head-worn devices for virtual experiences evolves in order to reap the full potential and benefits of these emerging technologies.
Providing IoT Services in Smart Cities through Dynamic Augmented Reality Markers.
Chaves-Diéguez, David; Pellitero-Rivero, Alexandre; García-Coego, Daniel; González-Castaño, Francisco Javier; Rodríguez-Hernández, Pedro Salvador; Piñeiro-Gómez, Óscar; Gil-Castiñeira, Felipe; Costa-Montenegro, Enrique
2015-07-03
Smart cities are expected to improve the quality of life of citizens by relying on new paradigms, such as the Internet of Things (IoT) and its capacity to manage and interconnect thousands of sensors and actuators scattered across the city. At the same time, mobile devices widely assist professional and personal everyday activities. A very good example of the potential of these devices for smart cities is their powerful support for intuitive service interfaces (such as those based on augmented reality (AR)) for non-expert users. In our work, we consider a scenario that combines IoT and AR within a smart city maintenance service to improve the accessibility of sensor and actuator devices in the field, where responsiveness is crucial. In it, depending on the location and needs of each service, data and commands will be transported by an urban communications network or consulted on the spot. Direct AR interaction with urban objects has already been described; it usually relies on 2D visual codes to deliver object identifiers (IDs) to the rendering device to identify object resources. These IDs allow information about the objects to be retrieved from a remote server. In this work, we present a novel solution that replaces static AR markers with dynamic markers based on LED communication, which can be decoded through cameras embedded in smartphones. These dynamic markers can directly deliver sensor information to the rendering device, on top of the object ID, without further network interaction.
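The abstract does not describe the LED decoding scheme in detail; as a purely schematic illustration, assuming simple on-off keying sampled once per camera frame and a known region of interest around the LED, the bit recovery could look like this. The region of interest, threshold and frame sizes are assumptions.

```python
import numpy as np

def decode_led_bits(frames, roi, threshold=128):
    """Recover an on-off-keyed bit sequence from camera frames.

    frames: iterable of grayscale frames (2-D uint8 arrays), one sample per bit.
    roi:    (row_slice, col_slice) around the LED in the image.
    Returns the decoded bit string.
    """
    bits = []
    for frame in frames:
        patch = frame[roi]                 # pixels around the LED
        bits.append(1 if patch.mean() > threshold else 0)
    return "".join(str(b) for b in bits)

# Synthetic example: 4 frames with the LED alternating on/off
roi = (slice(10, 20), slice(10, 20))
frames = []
for bit in (1, 0, 1, 1):
    f = np.zeros((64, 64), dtype=np.uint8)
    f[roi] = 255 if bit else 0
    frames.append(f)
print(decode_led_bits(frames, roi))   # -> "1011"
```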
NASA Astrophysics Data System (ADS)
Oh, Jihun; Kang, Xin; Wilson, Emmanuel; Peters, Craig A.; Kane, Timothy D.; Shekhar, Raj
2014-03-01
In laparoscopic surgery, live video provides visualization of the exposed organ surfaces in the surgical field, but is unable to show internal structures beneath those surfaces. Laparoscopic ultrasound is often used to visualize the internal structures, but its use is limited to intermittent confirmation because of the need for an extra hand to maneuver the ultrasound probe. Other limitations of ultrasound are the difficulty of interpretation and the need for an extra port. The size of the ultrasound transducer may also be too large for use in small children. In this paper, we report on an augmented reality (AR) visualization system that features continuous hands-free volumetric ultrasound scanning of the surgical anatomy and video imaging from a stereoscopic laparoscope. The acquisition of the volumetric ultrasound image is realized by precisely controlling a back-and-forth movement of an ultrasound transducer mounted on a linear slider, and the ultrasound volume is refreshed several times per minute. In the envisioned use scenario this scanner will sit outside the body and could even be integrated into the operating table. Overlaying the maximum intensity projection (MIP) of the ultrasound volume on the laparoscopic stereo video through geometric transformations yields an AR visualization system particularly suitable for children, because ultrasound is radiation-free and provides higher-quality images in small patients. The proposed AR representation promises to be better than an AR representation using ultrasound slice data.
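As a rough sketch of the two computational steps named above, computing a maximum intensity projection of the ultrasound volume and blending it onto an already registered and resampled video frame, the following illustration uses random arrays in place of real acquisitions; array sizes and the alpha value are assumptions.

```python
import numpy as np

def ultrasound_mip(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Maximum intensity projection of an ultrasound volume along one axis."""
    return volume.max(axis=axis)

def blend_on_video(video_frame: np.ndarray, mip: np.ndarray,
                   alpha: float = 0.5) -> np.ndarray:
    """Overlay a registered, resampled MIP image onto a laparoscopic frame.

    Both inputs are assumed to be single-channel float images in [0, 1]
    of the same size; the geometric transform that maps the ultrasound
    volume into the camera view is assumed to have been applied already.
    """
    return (1 - alpha) * video_frame + alpha * mip

# Toy example with random data standing in for real acquisitions
volume = np.random.rand(40, 128, 128)     # depth x rows x cols
frame = np.random.rand(128, 128)
overlay = blend_on_video(frame, ultrasound_mip(volume))
print(overlay.shape)
```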
Design, implementation and accuracy of a prototype for medical augmented reality.
Pandya, Abhilash; Siadat, Mohammad-Reza; Auner, Greg
2005-01-01
This paper is focused on prototype development and accuracy evaluation of a medical Augmented Reality (AR) system. The accuracy of such a system is of critical importance for medical use, and is hence considered in detail. We analyze the individual error contributions and the system accuracy of the prototype. A passive articulated arm is used to track a calibrated end-effector-mounted video camera. The live video view is superimposed in real time with the synchronized graphical view of CT-derived segmented object(s) of interest within a phantom skull. The AR accuracy mostly depends on the accuracy of the tracking technology, the registration procedure, the camera calibration, and the image scanning device (e.g., a CT or MRI scanner). The accuracy of the Microscribe arm was measured to be 0.87 mm. After mounting the camera on the tracking device, the AR accuracy was measured to be 2.74 mm on average (standard deviation = 0.81 mm). After using data from a 2-mm-thick CT scan, the AR error remained essentially the same at an average of 2.75 mm (standard deviation = 1.19 mm). For neurosurgery, the acceptable error is approximately 2-3 mm, and our prototype approaches these accuracy requirements. The accuracy could be increased with a higher-fidelity tracking system and improved calibration and object registration. The design and methods of this prototype device can be extrapolated to current medical robotics (due to the kinematic similarity) and neuronavigation systems.
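The accuracy figures quoted above (e.g., 2.74 mm mean, 0.81 mm standard deviation) are simply the mean and standard deviation of per-landmark overlay errors. A minimal sketch of that bookkeeping follows; the landmark coordinates are invented sample values, not the study's data.

```python
import numpy as np

def overlay_error_stats(measured: np.ndarray, reference: np.ndarray):
    """Mean and standard deviation of 3-D overlay error (in mm).

    measured:  N x 3 positions of landmarks as shown by the AR overlay.
    reference: N x 3 ground-truth positions of the same landmarks.
    """
    errors = np.linalg.norm(measured - reference, axis=1)
    return errors.mean(), errors.std(ddof=1)

# Hypothetical landmark measurements (mm), purely for illustration
ref = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 0]], float)
meas = ref + np.random.normal(scale=1.5, size=ref.shape)
mean_err, sd_err = overlay_error_stats(meas, ref)
print(f"mean error {mean_err:.2f} mm, SD {sd_err:.2f} mm")
```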
Nakata, Norio; Suzuki, Naoki; Hattori, Asaki; Hirai, Naoya; Miyamoto, Yukio; Fukuda, Kunihiko
2012-01-01
Although widely used as a pointing device on personal computers (PCs), the mouse was originally designed for control of two-dimensional (2D) cursor movement and is not suited to complex three-dimensional (3D) image manipulation. Augmented reality (AR) is a field of computer science that involves combining the physical world and an interactive 3D virtual world; it represents a new 3D user interface (UI) paradigm. A system for 3D and four-dimensional (4D) image manipulation has been developed that uses optical tracking AR integrated with a smartphone remote control. The smartphone is placed in a hard case (jacket) with a 2D printed fiducial marker for AR on the back. It is connected to a conventional PC with an embedded Web camera by means of WiFi. The touch screen UI of the smartphone is then used as a remote control for 3D and 4D image manipulation. Using this system, the radiologist can easily manipulate 3D and 4D images from computed tomography and magnetic resonance imaging in an AR environment with high-quality image resolution. Pilot assessment of this system suggests that radiologists will be able to manipulate 3D and 4D images in the reading room in the near future. Supplemental material available at http://radiographics.rsna.org/lookup/suppl/doi:10.1148/rg.324115086/-/DC1.
Augmented Reality Technology Using Microsoft HoloLens in Anatomic Pathology.
Hanna, Matthew G; Ahmed, Ishtiaque; Nine, Jeffrey; Prajapati, Shyam; Pantanowitz, Liron
2018-05-01
Context: Augmented reality (AR) devices such as the Microsoft HoloLens have not been well used in the medical field. Objective: To test the HoloLens for clinical and nonclinical applications in pathology. Design: A Microsoft HoloLens was tested for virtual annotation during autopsy, viewing 3D gross and microscopic pathology specimens, navigating whole slide images, telepathology, as well as real-time pathology-radiology correlation. Results: Pathology residents performing an autopsy wearing the HoloLens were remotely instructed with real-time diagrams, annotations, and voice instruction. 3D-scanned gross pathology specimens could be viewed as holograms and easily manipulated. Telepathology was supported during gross examination and at the time of intraoperative consultation, allowing users to remotely access a pathologist for guidance and to virtually annotate areas of interest on specimens in real time. The HoloLens permitted radiographs to be coregistered on gross specimens and thereby enhanced locating important pathologic findings. The HoloLens also allowed easy viewing and navigation of whole slide images, using an AR workstation, including multiple coregistered tissue sections facilitating volumetric pathology evaluation. Conclusions: The HoloLens is a novel AR tool with multiple clinical and nonclinical applications in pathology. The device was comfortable to wear, easy to use, provided sufficient computing power, and supported high-resolution imaging. It was useful for autopsy, gross and microscopic examination, and ideally suited for digital pathology. Unique applications include remote supervision and annotation, 3D image viewing and manipulation, telepathology in a mixed-reality environment, and real-time pathology-radiology correlation.
Model-Based Referenceless Quality Metric of 3D Synthesized Images Using Local Image Description.
Gu, Ke; Jakhetiya, Vinit; Qiao, Jun-Fei; Li, Xiaoli; Lin, Weisi; Thalmann, Daniel
2017-07-28
New challenges have emerged along with 3D-related technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). Free viewpoint video (FVV), which is based on the flexible selection of viewing direction and viewpoint and has applications in remote surveillance, remote education, and other areas, has been perceived as the development direction of next-generation video technologies and has drawn the attention of a wide range of researchers. Since FVV images are synthesized via a depth image-based rendering (DIBR) procedure in a "blind" environment (without reference images), a reliable real-time blind quality evaluation and monitoring system is urgently required. However, existing assessment metrics do not reflect human judgments faithfully, mainly because of the geometric distortions generated by DIBR. To this end, this paper proposes a novel referenceless quality metric for DIBR-synthesized images using an autoregression (AR)-based local image description. It was found that, after AR prediction, the reconstruction error between a DIBR-synthesized image and its AR-predicted image can accurately capture the geometric distortion. Visual saliency is then leveraged to improve the proposed blind quality metric by a sizable margin. Experiments validate the superiority of our no-reference quality method compared with prevailing full-, reduced- and no-reference models.
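The metric's core step, predicting each pixel by an autoregression over its local neighbours and using the prediction error to expose DIBR geometric distortions, can be sketched as follows. This is a simplified single-model illustration, not the authors' exact formulation (which also weights the error map by visual saliency); the toy image is invented.

```python
import numpy as np

def ar_prediction_error(img: np.ndarray) -> np.ndarray:
    """Predict each pixel from its 8-neighbourhood with one global set of
    autoregression coefficients (least squares) and return the absolute
    prediction-error map. Large errors tend to cluster around geometric
    distortions, which is the intuition behind the metric.
    """
    img = img.astype(float)
    h, w = img.shape
    # Design matrix: one row per interior pixel, 8 neighbour values as predictors
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    X = np.stack([img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].ravel()
                  for dy, dx in offsets], axis=1)
    y = img[1:-1, 1:-1].ravel()
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    err = np.zeros_like(img)
    err[1:-1, 1:-1] = np.abs(y - X @ coeffs).reshape(h - 2, w - 2)
    return err

# Toy image: a smooth ramp with a small "geometric" defect
img = np.tile(np.linspace(0, 255, 64), (64, 1))
img[30:34, 30:34] = 0
print(ar_prediction_error(img).max())
```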
ERIC Educational Resources Information Center
Ladd, Melissa
2016-01-01
This study strived to determine the effectiveness of the AR phonics program relative to the effectiveness of the scripted phonics program for developing the letter identification, sound verbalization, and blending abilities of kindergarten students considered at-risk based on state assessments. The researcher was interested in pretest and posttest…
Editorial: Challenges for the usability of AR and VR for clinical neurosurgical procedures.
de Ribaupierre, Sandrine; Eagleson, Roy
2017-10-01
There are a number of challenges that must be faced when trying to develop AR- and VR-based neurosurgical simulators, surgical navigation platforms, and "Smart OR" systems. Simulating an operating room environment and surgical tasks in Augmented and Virtual Reality is a challenge many are attempting to solve in order to train surgeons or help them operate. What are some of the needs of the surgeon, and what are the challenges encountered (human-computer interface, perception, workflow, etc.)? We discuss these tradeoffs and conclude with critical remarks.
Augmented reality for biomedical wellness sensor systems
NASA Astrophysics Data System (ADS)
Jenkins, Jeffrey; Szu, Harold
2013-05-01
Thanks to the commercial movie and gaming industries, Augmented Reality (AR) technology has matured. By the definition of AR, both artificial and real humans can be simultaneously present and realistically interact with one another. With the help of physics and physiology, we can build the AR tool together with real human day-and-night webcam inputs through simple interactions of heat transfer (getting hot), action and reaction (walking or falling), as well as physiology (sweating due to activity). Knowing the person's age, weight and the 3D coordinates of the joints in the body, we deduce the force, the torque, and the energy expenditure during real human movements and apply them to an AR human model. We wish to support this physics-physiology AR version, PP-AR, as a biomedical wellness (BMW) surveillance tool for seniors home alone (SHA). The functionality is to record a senior's walking and hand movements inside a home environment. Besides the fringe benefit of enabling more visits from grandchildren through AR video games, the PP-AR surveillance tool may serve as a means to screen patients in the home for potential falls at points around the house. Moreover, we anticipate that PP-AR may help analyze the behavior history of SHA, e.g. enhancing the Smartphone SHA Ubiquitous Care Program by discovering early symptoms such as candidate Alzheimer-like midnight excursions or Parkinson-like trembling motion when performing challenging muscular joint movements. Using a set of coordinates corresponding to 3D positions of human joint locations, we compute the kinetic energy (KE) generated by each body segment over time. The work is then calculated and converted into calories. Using common graphics rendering pipelines, one could invoke AR technology to provide more information about patients to caretakers. Alerts to caretakers can be prompted by a patient's departure from their personal baseline, and the patient's time-ordered joint information can be loaded into a graphics viewer allowing for high-definition digital reconstruction. An entire scene can then be viewed from any position in virtual space, and AR can display the measurement values that either constituted an alert or otherwise indicate signs of the transition from wellness to illness.
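The kinetic-energy bookkeeping mentioned above (KE per body segment from joint trajectories, then work converted to calories) can be sketched as follows; the segment mass, sampling rate and toy trajectory are invented for illustration, and rotational energy terms are ignored.

```python
import numpy as np

J_PER_KCAL = 4184.0

def segment_kinetic_energy(positions: np.ndarray, mass_kg: float,
                           dt: float) -> np.ndarray:
    """Translational kinetic energy of one body segment over time.

    positions: T x 3 array of the segment's 3-D position per frame (metres).
    Returns a length T-1 array of KE values (joules) from finite-difference
    velocities.
    """
    velocity = np.diff(positions, axis=0) / dt
    speed_sq = (velocity ** 2).sum(axis=1)
    return 0.5 * mass_kg * speed_sq

# Hypothetical forearm trajectory sampled at 30 Hz (not real sensor data)
t = np.arange(0, 2, 1 / 30.0)
traj = np.stack([0.3 * np.sin(2 * np.pi * t),
                 np.zeros_like(t),
                 np.ones_like(t)], axis=1)
ke = segment_kinetic_energy(traj, mass_kg=1.5, dt=1 / 30.0)
work_j = np.abs(np.diff(ke)).sum()          # crude positive-work proxy
print(f"~{work_j / J_PER_KCAL * 1000:.2f} cal over 2 s")
```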
Augmented reality cues to assist older drivers with gap estimation for left-turns.
Rusch, Michelle L; Schall, Mark C; Lee, John D; Dawson, Jeffrey D; Rizzo, Matthew
2014-10-01
The objective of this study was to assess the effects of augmented reality (AR) cues designed to assist middle-aged and older drivers, with a range of UFOV (useful field of view) impairments, in judging when to make left turns across oncoming traffic. Previous studies have shown that AR cues can help middle-aged and older drivers respond to potential roadside hazards by increasing hazard detection without interfering with other driving tasks. Intersections pose a critical challenge for cognitively impaired drivers, who are prone to misjudge time-to-contact with oncoming traffic. We investigated whether AR cues improve or interfere with hazard perception in left turns across oncoming traffic for drivers with age-related cognitive decline. Sixty-four middle-aged and older drivers with a range of UFOV impairment judged when it would be safe to turn left across oncoming traffic approaching the driver from the opposite direction in a rural, stop-sign-controlled intersection scenario implemented in a static-base driving simulator. Outcome measures used to evaluate the effectiveness of AR cueing included Time-to-Contact (TTC), Gap Time Variation (GTV), Response Rate, and Gap Response Variation (GRV). All drivers' estimated TTCs were shorter in cued than in uncued conditions. In addition, drivers responded more often in cued conditions than in uncued conditions, and GRV decreased for all drivers in scenarios that contained AR cues. For both TTC and response rate, drivers also appeared to adjust their behavior to be consistent with the cues, especially drivers with the poorest UFOV scores (matching their behavior to be close to that of middle-aged drivers). Driver ratings indicated that cueing was not considered distracting. Further, various conditions of reliability (e.g., a 15% miss rate) did not appear to affect performance or driver ratings. Copyright © 2014 Elsevier Ltd. All rights reserved.
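TTC in such gap-acceptance studies is essentially the remaining gap divided by the closing speed under a constant-speed assumption; the tiny sketch below uses invented numbers purely to illustrate the quantity being estimated.

```python
def time_to_contact(gap_distance_m: float, closing_speed_mps: float) -> float:
    """Time-to-contact (s) with an oncoming vehicle, assuming constant speed."""
    return gap_distance_m / closing_speed_mps

# Illustrative values only: a car 90 m away approaching at 25 m/s (~90 km/h)
ttc = time_to_contact(90.0, 25.0)
print(f"TTC = {ttc:.1f} s")   # 3.6 s; a turn taking longer than this is unsafe
```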
Tang, Rui; Ma, Longfei; Li, Ang; Yu, Lihan; Rong, Zhixia; Zhang, Xinjing; Xiang, Canhong; Liao, Hongen; Dong, Jiahong
2018-06-01
We applied augmented reality (AR) techniques to flexible choledochoscopy examinations. Enhanced computed tomography data of a patient with intrahepatic and extrahepatic biliary duct dilatation were collected to generate a hollow, 3-dimensional (3D) model of the biliary tree by 3D printing. The 3D printed model was placed in an opaque box. An electromagnetic (EM) sensor was internally installed in the choledochoscope instrument channel for tracking its movements through the passages of the 3D printed model, and an AR navigation platform was built using image overlay display. The porta hepatis was used as the reference marker with rigid image registration. The trajectories of the choledochoscope and the EM sensor were observed and recorded using the operator interface of the choledochoscope. Training choledochoscopy was performed on the 3D printed model. The choledochoscope was guided into the left and right hepatic ducts, the right anterior hepatic duct, the bile ducts of segment 8, the hepatic duct in subsegment 8, the right posterior hepatic duct, and the left and the right bile ducts of the caudate lobe. Although stability in tracking was less than ideal, the virtual choledochoscope images and EM sensor tracking were effective for navigation. AR techniques can be used to assist navigation in choledochoscopy examinations in bile duct models. Further research is needed to determine its benefits in clinical settings.
3D Building Reconstruction by Multiview Images and the Integrated Application with Augmented Reality
NASA Astrophysics Data System (ADS)
Hwang, Jin-Tsong; Chu, Ting-Chen
2016-10-01
This study presents an approach wherein photographs with a high degree of overlap are captured using a digital camera and used to generate three-dimensional (3D) point clouds via feature point extraction and matching. To reconstruct a building model, an unmanned aerial vehicle (UAV) is used to capture photographs from vertical shooting angles above the building. Multiview images are taken from the ground to eliminate the shielding effect of trees on the UAV images. Point clouds from the UAV and multiview images are generated via Pix4Dmapper. By merging the two sets of point clouds via tie points, the complete building model is reconstructed. The 3D models are reconstructed using AutoCAD 2016 to generate vectors from the point clouds; SketchUp Make 2016 is used to rebuild a complete building model with textures. To apply 3D building models in urban planning and design, a modern approach is to rebuild the digital models; however, replacing the landscape design and building distribution in real time is difficult as the frequency of building replacement increases. One potential solution to these problems is augmented reality (AR). Using Unity3D and Vuforia to design and implement a smartphone application service, a markerless AR view of the building model can be built. This study aims to provide technical and design skills related to urban planning, urban design, and building information retrieval using AR.
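Merging the UAV-derived and ground multiview point clouds via tie points amounts to estimating a transform that maps corresponding points onto each other. As a simplified sketch (rigid rotation and translation only; in practice a similarity transform with a scale factor is usually needed, and Pix4Dmapper handles this internally), a Kabsch/Procrustes solution looks like this, checked here on synthetic points.

```python
import numpy as np

def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src tie points onto dst.

    src, dst: N x 3 arrays of corresponding tie-point coordinates from the
    two point clouds (e.g., UAV-derived and ground multiview-derived).
    Implements the Kabsch/Procrustes solution via SVD.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])              # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: recover a known rotation/translation from 4 tie points
rng = np.random.default_rng(0)
src = rng.random((4, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([1.0, 2.0, 0.5])
R, t = rigid_align(src, dst)
print(np.allclose(R, R_true), np.allclose(t, [1.0, 2.0, 0.5]))
```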
NASA Technical Reports Server (NTRS)
Axholt, Magnus; Skoglund, Martin; Peterson, Stephen D.; Cooper, Matthew D.; Schoen, Thomas B.; Gustafsson, Fredrik; Ynnerman, Anders; Ellis, Stephen R.
2010-01-01
Augmented Reality (AR) is a technique by which computer-generated signals synthesize impressions that are made to coexist with the surrounding real world as perceived by the user. Human smell, taste, touch and hearing can all be augmented, but most commonly AR refers to human vision being overlaid with information otherwise not readily available to the user. A correct calibration is important on an application level, ensuring that e.g. data labels are presented at correct locations, but also on a system level to enable display techniques such as stereoscopy to function properly. Thus, calibration methodology is a research area vital to AR. While great achievements have already been made, some properties of current calibration methods for augmenting vision do not translate from their traditional use in automated camera calibration to use with a human operator. This paper uses a Monte Carlo simulation of a standard direct linear transformation camera calibration to investigate how user-introduced head-orientation noise affects the parameter estimation during a calibration procedure for an optical see-through head-mounted display.
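The abstract does not detail the simulation, so the following is only a schematic reading of it: project known 3-D calibration points with a ground-truth camera, perturb the image observations by an amount corresponding to a given head-orientation jitter, run a (non-normalized) direct linear transformation per trial, and summarize the resulting reprojection error. All numeric values are illustrative.

```python
import numpy as np

def dlt(pts3d, pts2d):
    """Estimate a 3x4 projection matrix from >= 6 3-D/2-D correspondences."""
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    return Vt[-1].reshape(3, 4)

def project(P, pts3d):
    Xh = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]

# Ground-truth camera: 800 px focal length, principal point (320, 240), offset pose
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], float)
P_true = K @ np.hstack([np.eye(3), [[0], [0], [2.0]]])

rng = np.random.default_rng(1)
pts3d = rng.uniform(-0.5, 0.5, size=(12, 3))
pix_true = project(P_true, pts3d)

# Monte Carlo: head-orientation jitter (deg) mapped to pixel noise via focal length
sigma_px = 800 * np.tan(np.deg2rad(0.2))
errs = []
for _ in range(500):
    noisy = pix_true + rng.normal(scale=sigma_px, size=pix_true.shape)
    P_est = dlt(pts3d, noisy)
    errs.append(np.linalg.norm(project(P_est, pts3d) - pix_true, axis=1).mean())
print(f"mean reprojection error: {np.mean(errs):.2f} px")
```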
Tackling the challenges of fully immersive head-mounted AR devices
NASA Astrophysics Data System (ADS)
Singer, Wolfgang; Hillenbrand, Matthias; Münz, Holger
2017-11-01
The optical requirements of fully immersive head-mounted AR devices are inherently determined by the human visual system. The etendue of the visual system is large. As a consequence, the requirements for fully immersive head-mounted AR devices exceed those of almost any high-end optical system. Two promising solutions for achieving the large etendue, and their challenges, are discussed. Head-mounted augmented reality devices have been developed for decades, mostly for application within aircraft and in combination with a heavy and bulky helmet. The established head-up displays for automotive applications typically utilize similar techniques. Recently, there is a vision of eyeglasses with built-in augmentation, offering a large field of view and being unobtrusively all-day wearable. There seems to be no simple solution that reaches the functional performance requirements: some known technical solution paths seem to be dead ends, while others offer promising perspectives, albeit with severe limitations. As an alternative, unobtrusively all-day wearable devices with a significantly smaller field of view are already possible.
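To make the etendue argument concrete, a rough order-of-magnitude estimate can be written down; the eyebox size and field of view below are illustrative values, not figures from the paper.

```latex
% Etendue of an eyebox-plus-field-of-view combination (order of magnitude only)
G \approx A_{\text{eyebox}} \cdot \Omega_{\text{FOV}}, \qquad
\Omega_{\text{FOV}} = 4\arcsin\!\left(\sin\tfrac{\theta_h}{2}\,\sin\tfrac{\theta_v}{2}\right)

% Example: a 10 mm x 10 mm eyebox and a 100 deg x 80 deg field of view
A_{\text{eyebox}} = 100\ \text{mm}^2, \qquad
\Omega_{\text{FOV}} = 4\arcsin(\sin 50^{\circ}\sin 40^{\circ}) \approx 2.1\ \text{sr}

G \approx 100\ \text{mm}^2 \times 2.1\ \text{sr} \approx 2\times 10^{2}\ \text{mm}^2\,\text{sr}
```

Even with these modest assumptions, the product is orders of magnitude larger than the etendue of a typical camera or projection objective, which is the core of the design challenge named above.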
Measuring the Usability of Augmented Reality e-Learning Systems: A User-Centered Evaluation Approach
NASA Astrophysics Data System (ADS)
Pribeanu, Costin; Balog, Alexandru; Iordache, Dragoş Daniel
The development of Augmented Reality (AR) systems is creating new challenges and opportunities for the designers of e-learning systems. The mix of real and virtual requires appropriate interaction techniques that have to be evaluated with users in order to avoid usability problems. Formative usability evaluation aims at finding usability problems as early as possible in the development life cycle and is well suited to supporting the development of such novel interactive systems. This work presents an approach to the user-centered usability evaluation of an e-learning scenario for Biology developed on an Augmented Reality educational platform. The evaluation was carried out during and after a summer school held within the ARiSE research project. The basic idea was to perform the usability evaluation twice. Accordingly, we conducted user testing with a small number of students during the summer school in order to get fast feedback from users with good knowledge of Biology. We then repeated the user testing under different conditions and with a relatively larger number of representative users. In this paper we describe both experiments and compare the usability evaluation results.
Optical augmented reality assisted navigation system for neurosurgery teaching and planning
NASA Astrophysics Data System (ADS)
Wu, Hui-Qun; Geng, Xing-Yun; Wang, Li; Zhang, Yuan-Peng; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng
2013-07-01
This paper proposes a convenient navigation system for neurosurgeons' preoperative planning and teaching based on an augmented reality (AR) technique, which maps three-dimensional reconstructed virtual anatomical structures onto a skull model. The system comprises two parts: a virtual reality system and a skull model scene. In our experiment, a 73-year-old right-handed man initially diagnosed with astrocytoma was selected as an example to verify the system. His imaging data from different modalities were registered, and the skull soft tissue, brain, internal vessels and tumor were reconstructed. The reconstructed models were then overlaid on the real scene. Our findings showed that the reconstructed tissues were augmented into the real scene and the registration results were in good alignment. The reconstructed brain tissue was well distributed in the skull cavity. The probe was used by a neurosurgeon to explore a surgical pathway that could reach the tumor directly without injuring important vessels. In this way, the learning cost for students and the effort of educating patients about surgical risks are reduced. Therefore, this system could be a selective protocol for image-guided surgery (IGS), and it is promising for neurosurgeons' preoperative planning and teaching.
Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos
2013-12-01
Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.
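For illustration, the Hough-Kalman shaft tracking described above can be sketched as a probabilistic Hough detection of the dominant shaft line combined with a constant-velocity Kalman filter over the line parameters. This is not the authors' implementation: the (rho, theta) state parameterisation, the OpenCV-based detection and the noise covariances below are illustrative assumptions.

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter over the shaft line parameters (rho, theta).
# State: [rho, theta, d_rho, d_theta]; measurement: [rho, theta].
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)

def detect_shaft_line(frame_gray):
    """Return (rho, theta) of the longest Hough segment, or None if no line is found."""
    edges = cv2.Canny(frame_gray, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                               minLineLength=100, maxLineGap=10)
    if segments is None:
        return None
    x1, y1, x2, y2 = max(segments[:, 0, :],
                         key=lambda s: np.hypot(s[2] - s[0], s[3] - s[1]))
    theta = np.arctan2(y2 - y1, x2 - x1) + np.pi / 2   # normal direction of the line
    rho = x1 * np.cos(theta) + y1 * np.sin(theta)
    return rho, theta

def track(frame_gray):
    """Predict the shaft line, correct with a detection when one is available."""
    predicted = kf.predict()
    meas = detect_shaft_line(frame_gray)
    if meas is not None:
        kf.correct(np.array([[meas[0]], [meas[1]]], dtype=np.float32))
        return meas
    # Fall back to the prediction when the shaft is occluded or missed.
    return float(predicted[0]), float(predicted[1])
```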
Loy Rodas, Nicolas; Barrera, Fernando; Padoy, Nicolas
2017-02-01
We present an approach to provide awareness of the harmful ionizing radiation generated during X-ray-guided minimally invasive procedures. A hand-held screen is used to display information related to radiation safety directly in the user's view in a mobile augmented reality (AR) manner. Instead of using markers, we propose a method to track the observer's viewpoint, which relies on the use of multiple RGB-D sensors and combines equipment detection for tracking initialization with a KinectFusion-like approach for frame-to-frame tracking. Two of the sensors are ceiling-mounted and a third one is attached to the hand-held screen. The ceiling cameras keep an updated model of the room's layout, which is used to exploit context information and improve the relocalization procedure. The system is evaluated on a multicamera dataset generated inside an operating room (OR) and containing ground-truth poses of the AR display. This dataset includes a wide variety of sequences with different scene configurations, occlusions, motion in the scene, and abrupt viewpoint changes. Qualitative results illustrating the different AR visualization modes for radiation awareness provided by the system are also presented. Our approach allows the user to benefit from a large AR visualization area and permits recovery from tracking failure caused by large motions or changes in the scene simply by looking at a piece of equipment. The system enables the user to see the 3-D propagation of radiation, the medical staff's exposure, and/or the doses deposited on the patient's surface as seen through his own eyes.
Liquid crystal true 3D displays for augmented reality applications
NASA Astrophysics Data System (ADS)
Li, Yan; Liu, Shuxin; Zhou, Pengcheng; Chen, Quanming; Su, Yikai
2018-02-01
Augmented reality (AR) technology, which integrates virtual computer-generated information into the real world scene, is believed to be the next-generation human-machine interface. However, most AR products adopt stereoscopic 3D display techniques, which cause the accommodation-vergence conflict. To solve this problem, we have proposed two approaches. The first is a multi-planar volumetric display using fast-switching polymer-stabilized liquid crystal (PSLC) films. By rapidly switching the films between scattering and transparent states while synchronizing with a high-speed projector, the 2D slices of a 3D volume can be displayed in time sequence. We developed high-performance PSLC films in both normal and reverse modes, and demonstrated four-depth AR images with correct accommodation cues. For the second approach, we realized a holographic AR display using digital blazed gratings and a 4f system to eliminate zero-order and higher-order noise. With a 4k liquid crystal on silicon device, we achieved a field of view (FOV) of 32 deg. Moreover, we designed a compact waveguide-based holographic 3D display. In the design, there are two holographic optical elements (HOEs), each of which functions as a diffractive grating and a Fresnel lens. Because of the grating effect, holographic 3D image light is coupled into and decoupled out of the waveguide by modifying incident angles. Because of the lens effect, the collimated zero-order light is focused to a point and filtered out. The optical power of the second HOE also helps enlarge the FOV.
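As background to the field-of-view figure quoted above, the attainable FOV of an SLM-based holographic display is bounded by the panel's maximum diffraction angle, which depends on pixel pitch and wavelength. The relation below is only the standard grating-equation bound; the abstract does not give the pixel pitch, wavelength or relay magnification, so no attempt is made here to reproduce the reported 32 degrees.

```latex
% Maximum diffraction half-angle of an SLM with pixel pitch p at wavelength \lambda,
% and the FOV bound after a 4f relay that demagnifies the hologram by M
% (effective pitch p/M):
\sin\theta_{\max} = \frac{\lambda}{2p}, \qquad
\mathrm{FOV} \;\le\; 2\,\theta_{\max} \;=\; 2\arcsin\!\left(\frac{\lambda M}{2p}\right)
```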
Video-guided calibration of an augmented reality mobile C-arm.
Chen, Xin; Naik, Hemal; Wang, Lejing; Navab, Nassir; Fallavollita, Pascal
2014-11-01
The augmented reality (AR) fluoroscope augments an X-ray image by video and provides the surgeon with a real-time in situ overlay of the anatomy. The overlay alignment is crucial for diagnostic and intra-operative guidance, so precise calibration of the AR fluoroscope is required. The first and most complex step of the calibration procedure is the determination of the X-ray source position. Currently, this is achieved using a biplane phantom with movable metallic rings on its top layer and fixed X-ray opaque markers on its bottom layer. The metallic rings must be moved to positions where at least two pairs of rings and markers are isocentric in the X-ray image. The current "trial and error" calibration process requires acquisition of many X-ray images, a task that is both time-consuming and radiation intensive. An improved process was developed and tested for C-arm calibration. Video guidance was used to drive the calibration procedure to minimize both X-ray exposure and the time involved. For this, a homography between X-ray and video images is estimated. This homography is valid for the plane at which the metallic rings are positioned and is employed to guide the calibration procedure. Eight users having varying calibration experience (i.e., 2 experts, 2 semi-experts, 4 novices) were asked to participate in the evaluation. The video-guided technique reduced the number of intra-operative X-ray calibration images by 89% and decreased the total time required by 59%. A video-based C-arm calibration method has been developed that improves the usability of the AR fluoroscope with a friendlier interface, reduced calibration time and clinically acceptable radiation doses.
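A minimal sketch of the homography step described above, assuming that at least four corresponding ring centres have already been identified in one video image and one X-ray image of the phantom's top plane; the point coordinates below are placeholders. OpenCV's findHomography then provides the plane-to-plane mapping used to guide ring placement from the live video.

```python
import cv2
import numpy as np

# Placeholder 2D correspondences (pixel coordinates) of the ring centres,
# detected once in the video image and once in the X-ray image of the top plane.
pts_video = np.array([[120, 95], [480, 88], [470, 390], [130, 400]], dtype=np.float32)
pts_xray  = np.array([[150, 110], [505, 105], [498, 415], [160, 420]], dtype=np.float32)

# Homography valid for the plane of the metallic rings.
H, inliers = cv2.findHomography(pts_video, pts_xray, method=cv2.RANSAC,
                                ransacReprojThreshold=3.0)

def video_to_xray(point_xy):
    """Map a point from the video image into X-ray image coordinates."""
    p = np.array([[point_xy]], dtype=np.float32)   # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(p, H)[0, 0]

# Example: where would the currently tracked ring appear in the X-ray image?
print(video_to_xray((300, 240)))
```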
Smart maintenance of riverbanks using a standard data layer and Augmented Reality
NASA Astrophysics Data System (ADS)
Pierdicca, Roberto; Frontoni, Emanuele; Zingaretti, Primo; Mancini, Adriano; Malinverni, Eva Savina; Tassetti, Anna Nora; Marcheggiani, Ernesto; Galli, Andrea
2016-10-01
Linear buffer strips (BS) along watercourses are commonly adopted to reduce run-off, the accumulation of bank-top sediments and the leaking of pesticides into fresh waters, which strongly increase water pollution. However, monitoring their condition is a difficult task because they are scattered over wide rural areas. This work demonstrates the benefits of using a standard data layer and Augmented Reality (AR) in watershed control and outlines the guidelines of a novel approach for the health-check of linear BS. We designed a mobile environmental monitoring system for smart maintenance of riverbanks by embedding AR technology within a Geographical Information System (GIS). From the technological point of view, the system's architecture consists of a cloud-based service for data sharing, using a standard data layer, and of a mobile device equipped with a GPS-based AR engine for augmented data visualization. The proposed solution aims to ease the overall inspection process by reducing the time required to run a survey. Indeed, ordinary operational surveys are usually performed in the field using only classical digitized maps. Our application enriches inspections by superimposing information on the device screen from the same point of view as the camera, providing an intuitive visualization of buffer strip locations. This way, the inspection officer can quickly and dynamically access relevant information, overlaying geographic features, comments and other content in real time. The solution has been tested in fieldwork to assess to what extent this cutting-edge technology contributes to effective monitoring over large territorial settings. The aim is to encourage officers, land managers and practitioners toward more effective monitoring and management practices.
Design and evaluation of an augmented reality simulator using leap motion.
Wright, Trinette; de Ribaupierre, Sandrine; Eagleson, Roy
2017-10-01
Advances in virtual and augmented reality (AR) are having an impact on the medical field in areas such as surgical simulation. Improvements to surgical simulation will provide students and residents with additional training and evaluation methods. This is particularly important for procedures such as the endoscopic third ventriculostomy (ETV), which residents perform regularly. Simulators such as NeuroTouch have been designed to aid in training for this procedure. The authors have designed an affordable and easily accessible ETV simulator and compare it with the existing NeuroTouch for usability and training effectiveness. The simulator was developed using Unity, Vuforia and the Leap Motion (LM) for an AR environment. The participants, 16 novices and two expert neurosurgeons, were asked to complete 40 targeting tasks. Participants used the NeuroTouch tool or a virtual hand controlled by the LM to select the position and orientation for these tasks. The time to complete each task was recorded, and the trajectory log files were used to calculate performance. The resulting speed and accuracy data from novices and experts are compared, and the objective training performance of each system is discussed in terms of targeting speed and accuracy.
Augmented Reality Image Guidance in Minimally Invasive Prostatectomy
NASA Astrophysics Data System (ADS)
Cohen, Daniel; Mayer, Erik; Chen, Dongbin; Anstee, Ann; Vale, Justin; Yang, Guang-Zhong; Darzi, Ara; Edwards, Philip 'Eddie'
This paper presents our work aimed at providing augmented reality (AR) guidance of robot-assisted laparoscopic prostatectomy (RALP) using the da Vinci system. There is a good clinical case for guidance due to the significant rate of complications and the steep learning curve for this procedure. Patients who were due to undergo robotic prostatectomy for organ-confined prostate cancer underwent preoperative 3T MRI scans of the pelvis. These were segmented and reconstructed to form 3D images of the pelvic anatomy. The reconstructed image was successfully overlaid onto screenshots of the recorded surgery post-procedure. Surgeons who perform minimally invasive prostatectomy took part in a user-needs analysis to determine the potential benefits of an image guidance system after viewing the overlaid images. All surgeons stated that the development would be useful at key stages of the surgery and could help to improve the learning curve of the procedure and improve functional and oncological outcomes. Establishing the clinical need in this way is a vital early step in the development of an AR guidance system. We have also identified relevant anatomy from preoperative MRI. Further work will be aimed at automated registration to account for tissue deformation during the procedure, using a combination of transrectal ultrasound and stereoendoscopic video.
Alaraj, Ali; Charbel, Fady T.; Birk, Daniel; Tobin, Mathew; Luciano, Cristian; Banerjee, Pat P.; Rizzi, Silvio; Sorenson, Jeff; Foley, Kevin; Slavin, Konstantin; Roitberg, Ben
2013-01-01
Recent studies have shown that mental script-based rehearsal and simulation-based training improve the transfer of surgical skills in various medical disciplines. Despite significant advances in technology and intraoperative techniques over the last several decades, surgical skills training for neurosurgical operations still carries a significant risk of serious morbidity or mortality. Potentially avoidable technical errors are well recognized as contributing to poor surgical outcome. Surgical education is undergoing overwhelming change, with reduced working hours and current trends toward a focus on patient safety and the linking of reimbursement to clinical outcomes, so there is a need for adjunctive means of neurosurgical training; recent advances in simulation technology offer such an adjunct. ImmersiveTouch (IT) is an augmented reality (AR) system that integrates a haptic device and a high-resolution stereoscopic display. This simulation platform utilizes multiple sensory modalities, recreating many of the environmental cues experienced during an actual procedure. Available modules include ventriculostomy, bone drilling and percutaneous trigeminal rhizotomy, in addition to simulated spinal modules such as pedicle screw placement, vertebroplasty, and lumbar puncture. We present our experience with the development of such AR neurosurgical modules and the feedback from neurosurgical residents. PMID:23254799
Design and evaluation of an augmented reality simulator using leap motion
de Ribaupierre, Sandrine; Eagleson, Roy
2017-01-01
Advances in virtual and augmented reality (AR) are having an impact on the medical field in areas such as surgical simulation. Improvements to surgical simulation will provide students and residents with additional training and evaluation methods. This is particularly important for procedures such as the endoscopic third ventriculostomy (ETV), which residents perform regularly. Simulators such as NeuroTouch have been designed to aid in training for this procedure. The authors have designed an affordable and easily accessible ETV simulator and compare it with the existing NeuroTouch for usability and training effectiveness. The simulator was developed using Unity, Vuforia and the Leap Motion (LM) for an AR environment. The participants, 16 novices and two expert neurosurgeons, were asked to complete 40 targeting tasks. Participants used the NeuroTouch tool or a virtual hand controlled by the LM to select the position and orientation for these tasks. The time to complete each task was recorded, and the trajectory log files were used to calculate performance. The resulting speed and accuracy data from novices and experts are compared, and the objective training performance of each system is discussed in terms of targeting speed and accuracy. PMID:29184667
Robust and efficient fiducial tracking for augmented reality in HD-laparoscopic video streams
NASA Astrophysics Data System (ADS)
Mueller, M.; Groch, A.; Baumhauer, M.; Maier-Hein, L.; Teber, D.; Rassweiler, J.; Meinzer, H.-P.; Wegner, In.
2012-02-01
Augmented Reality (AR) is a convenient way of porting information from medical images into the surgical field of view and can deliver valuable assistance to the surgeon, especially in laparoscopic procedures. In addition, high definition (HD) laparoscopic video devices are a great improvement over the previously used low resolution equipment. However, in AR applications that rely on real-time detection of fiducials from video streams, the demand for efficient image processing has increased due to the introduction of HD devices. We present an algorithm based on the well-known Conditional Density Propagation (CONDENSATION) algorithm which can satisfy these new demands. By incorporating a prediction around an already existing and robust segmentation algorithm, we can speed up the whole procedure while leaving the robustness of the fiducial segmentation untouched. For evaluation purposes we tested the algorithm on recordings from real interventions, allowing for a meaningful interpretation of the results. Our results show that we can accelerate the segmentation by a factor of 3.5 on average. Moreover, the prediction information can be used to compensate for fiducials that are temporarily occluded or out of scope, providing greater stability.
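The prediction step described above can be sketched as a small CONDENSATION-style particle filter whose weighting step calls the existing segmentation only in a region around each particle. The motion model, noise levels and the segment_roi placeholder below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                  # number of particles per fiducial

def init_particles(center_xy):
    """Particles carry position and velocity: [x, y, vx, vy]."""
    p = np.zeros((N, 4))
    p[:, :2] = center_xy + rng.normal(0, 2.0, size=(N, 2))
    return p

def predict(particles, dt=1.0, pos_noise=3.0, vel_noise=1.0):
    """Constant-velocity diffusion step of CONDENSATION."""
    particles[:, :2] += particles[:, 2:] * dt + rng.normal(0, pos_noise, size=(N, 2))
    particles[:, 2:] += rng.normal(0, vel_noise, size=(N, 2))
    return particles

def resample(particles, weights):
    idx = rng.choice(N, size=N, p=weights / weights.sum())
    return particles[idx]

def update(particles, frame, segment_roi):
    """Weight particles by the segmentation response in a small window, then
    resample.  `segment_roi(frame, x, y)` is a placeholder for the existing
    robust fiducial segmentation, evaluated only near (x, y)."""
    weights = np.array([segment_roi(frame, x, y) + 1e-9
                        for x, y in particles[:, :2]])
    particles = resample(particles, weights)
    estimate = particles[:, :2].mean(axis=0)   # fiducial position estimate
    return particles, estimate
```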
Noll, Christoph; von Jan, Ute; Raap, Ulrike; Albrecht, Urs-Vito
2017-09-14
Advantages of mobile Augmented Reality (mAR) application-based learning versus textbook-based learning were already shown in a previous study. However, it was unclear whether the augmented reality (AR) component was responsible for the success of the self-developed app or whether this was attributable to the novelty of using mobile technology for learning. The study's aim was to test the hypothesis that there is no difference in learning success between learners who employed the mobile AR component and those who learned without it, to determine possible effects of mAR. Also, we were interested in potential emotional effects of using this technology. Forty-four medical students (male: 25, female: 19, mean age: 22.25 years, standard deviation [SD]: 3.33 years) participated in this study. Baseline emotional status was evaluated using the Profile of Mood States (POMS) questionnaire. Dermatological knowledge was ascertained using a single choice (SC) test (10 questions). The students were randomly assigned to learn 45 min with either a mobile learning method with mAR (group A) or without AR (group B). Afterwards, both groups were again asked to complete the previous questionnaires. AttrakDiff 2 questionnaires were used to evaluate the perceived usability as well as pragmatic and hedonic qualities. For capturing longer term effects, after 14 days, all participants were again asked to complete the SC questionnaire. All evaluations were anonymous, and descriptive statistics were calculated. For hypothesis testing, an unpaired signed-rank test was applied. For the SC tests, there were only minor differences, with both groups gaining knowledge (average improvement group A: 3.59 [SD 1.48]; group B: 3.86 [SD 1.51]). Differences between both groups were statistically insignificant (exact Mann-Whitney U, U=173.5; P=.10; r=.247). However, in the follow-up SC test after 14 days, group A had retained more knowledge (average decrease of the number of correct answers group A: 0.33 [SD 1.62]; group B: 1.14 [SD 1.30]). For both groups, descriptively, there were only small variations regarding emotional involvement, and learning experiences also differed little, with both groups rating the app similarly for its stimulating effect. We were unable to show significant effects for mAR on the immediate learning success of the mobile learning setting. However, the similar level of stimulation noted for both groups is inconsistent with the previous assumption that the success of the mAR-based approach is solely attributable to the excitement of using mobile technology, independent of mAR; the mAR group showed some indications of better long-term retention of knowledge. Further studies are needed to examine this aspect. German Clinical Trials Register (DRKS): 00012980; http://www.drks.de/drks_web/navigate.do?navigationId=trial.HTML&TRIAL_ID=DRKS00012980 (Archived by WebCite at http://www.webcitation.org/6tCWoM2Jb). ©Christoph Noll, Ute von Jan, Ulrike Raap, Urs-Vito Albrecht. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 14.09.2017.
2017-01-01
Background Advantages of mobile Augmented Reality (mAR) application-based learning versus textbook-based learning were already shown in a previous study. However, it was unclear whether the augmented reality (AR) component was responsible for the success of the self-developed app or whether this was attributable to the novelty of using mobile technology for learning. Objective The study's aim was to test the hypothesis that there is no difference in learning success between learners who employed the mobile AR component and those who learned without it, to determine possible effects of mAR. Also, we were interested in potential emotional effects of using this technology. Methods Forty-four medical students (male: 25, female: 19, mean age: 22.25 years, standard deviation [SD]: 3.33 years) participated in this study. Baseline emotional status was evaluated using the Profile of Mood States (POMS) questionnaire. Dermatological knowledge was ascertained using a single choice (SC) test (10 questions). The students were randomly assigned to learn 45 min with either a mobile learning method with mAR (group A) or without AR (group B). Afterwards, both groups were again asked to complete the previous questionnaires. AttrakDiff 2 questionnaires were used to evaluate the perceived usability as well as pragmatic and hedonic qualities. For capturing longer term effects, after 14 days, all participants were again asked to complete the SC questionnaire. All evaluations were anonymous, and descriptive statistics were calculated. For hypothesis testing, an unpaired signed-rank test was applied. Results For the SC tests, there were only minor differences, with both groups gaining knowledge (average improvement group A: 3.59 [SD 1.48]; group B: 3.86 [SD 1.51]). Differences between both groups were statistically insignificant (exact Mann-Whitney U, U=173.5; P=.10; r=.247). However, in the follow-up SC test after 14 days, group A had retained more knowledge (average decrease of the number of correct answers group A: 0.33 [SD 1.62]; group B: 1.14 [SD 1.30]). For both groups, descriptively, there were only small variations regarding emotional involvement, and learning experiences also differed little, with both groups rating the app similarly for its stimulating effect. Conclusions We were unable to show significant effects for mAR on the immediate learning success of the mobile learning setting. However, the similar level of stimulation noted for both groups is inconsistent with the previous assumption that the success of the mAR-based approach is solely attributable to the excitement of using mobile technology, independent of mAR; the mAR group showed some indications of better long-term retention of knowledge. Further studies are needed to examine this aspect. Trial Registration German Clinical Trials Register (DRKS): 00012980; http://www.drks.de/drks_web/navigate.do?navigationId=trial.HTML&TRIAL_ID=DRKS00012980 (Archived by WebCite at http://www.webcitation.org/6tCWoM2Jb). PMID:28912113
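The group comparison reported above (an exact Mann-Whitney U test on the single-choice score improvements) can be reproduced with a few lines of SciPy; the score vectors below are placeholders rather than the study's raw data, and SciPy's exact method applies no tie correction.

```python
from scipy.stats import mannwhitneyu

# Placeholder per-student improvements on the 10-item single-choice test
# (group A learned with mAR, group B without); these are NOT the study data.
group_a = [4, 3, 5, 2, 4, 3, 4, 5, 3, 2, 4, 3, 5, 4, 3, 4, 2, 5, 3, 4, 3, 4]
group_b = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4, 3, 4, 5]

# Exact two-sided Mann-Whitney U test (note: not tie-corrected in SciPy).
u_stat, p_value = mannwhitneyu(group_a, group_b, alternative="two-sided", method="exact")
print(f"U = {u_stat}, P = {p_value:.3f}")
```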
Virtual Reconstruction of Lost Architectures: from the Tls Survey to AR Visualization
NASA Astrophysics Data System (ADS)
Quattrini, R.; Pierdicca, R.; Frontoni, E.; Barcaglioni, R.
2016-06-01
The exploitation of high-quality 3D models for the dissemination of archaeological heritage is a currently investigated topic, although Mobile Augmented Reality platforms for historical architecture that would allow low-cost pipelines for effective content are not yet available. The paper presents a virtual anastylosis, starting from historical sources and from a 3D model based on a TLS survey. Several efforts and outputs in augmented or immersive environments exploiting this reconstruction are discussed. The work demonstrates the feasibility of a 3D reconstruction approach for complex architectural shapes starting from point clouds, and its AR/VR exploitation, allowing superimposition with the archaeological evidence. The major contributions consist in the presentation and discussion of a pipeline going from the virtual model to its simplification, showing several outcomes and also comparing the supported data qualities and the advantages/disadvantages due to MAR and VR limitations.
Using an Augmented Reality Device as a Distance-based Vision Aid: Promise and Limitations.
Kinateder, Max; Gualtieri, Justin; Dunn, Matt J; Jarosz, Wojciech; Yang, Xing-Dong; Cooper, Emily A
2018-06-06
For people with limited vision, wearable displays hold the potential to digitally enhance visual function. As these display technologies advance, it is important to understand their promise and limitations as vision aids. The aim of this study was to test the potential of a consumer augmented reality (AR) device for improving the functional vision of people with near-complete vision loss. An AR application that translates spatial information into high-contrast visual patterns was developed. Two experiments assessed the efficacy of the application to improve vision: an exploratory study with four visually impaired participants and a main controlled study with participants with simulated vision loss (n = 48). In both studies, performance was tested on a range of visual tasks (identifying the location, pose and gesture of a person, identifying objects, and moving around in an unfamiliar space). Participants' accuracy and confidence were compared on these tasks with and without augmented vision, as well as their subjective responses about ease of mobility. In the main study, the AR application was associated with substantially improved accuracy and confidence in object recognition (all P < .001) and, to a lesser degree, in gesture recognition (P < .05). There was no significant change in performance on identifying body poses or in subjective assessments of mobility, as compared with a control group. Consumer AR devices may soon be able to support applications that improve the functional vision of users for some tasks. In our study, both artificially impaired participants and participants with near-complete vision loss performed tasks that they could not do without the AR system. Current limitations in system performance and form factor, as well as the risk of overconfidence, will need to be overcome. This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.
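The core idea of translating spatial (depth) information into high-contrast patterns can be sketched as a simple depth-to-overlay mapping. The study does not specify its encoding here, so the thresholds and banding below are purely illustrative assumptions.

```python
import numpy as np

def depth_to_high_contrast(depth_m, near=0.5, far=4.0, bands=4):
    """Map a metric depth image (in metres) to a high-contrast overlay.

    Nearer surfaces become brighter; depth is quantised into a few bands so
    that edges between bands stay crisp even at very low acuity.  Pixels with
    no depth (0 or beyond `far`) are left black.
    """
    d = np.asarray(depth_m, dtype=np.float32)
    valid = (d > near) & (d < far)
    # Normalise so that near = 1.0 (bright) and far = 0.0 (dark).
    norm = np.zeros_like(d)
    norm[valid] = 1.0 - (d[valid] - near) / (far - near)
    # Quantise into a small number of bands and stretch to 8-bit values.
    overlay = (np.floor(norm * bands) / bands * 255).astype(np.uint8)
    return overlay

# Example with a synthetic 4x4 depth patch (metres).
patch = np.array([[0.6, 1.0, 2.0, 5.0],
                  [0.7, 1.5, 2.5, 0.0],
                  [0.8, 1.8, 3.0, 3.9],
                  [0.9, 2.2, 3.5, 4.2]])
print(depth_to_high_contrast(patch))
```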
Augmented Reality at the Tactical and Operational Levels of War
2015-10-24
benefits and challenges their personnel will experience once AR systems are fully adopted. This paper will explain these benefits and challenges as ... develop, procure, and integrate systems it believes will benefit its tactical combat units and operational leaders. Ultimately, as the capabilities of ... friendly forces, can also help to prevent collateral damage and civilian casualties. Beyond the immediate life-and-death benefits at the tactical ...
Horizons in Learning Innovation through Technology: Prospects for Air Force Education Benefits
2010-06-10
prototyping, and implementation. Successfully implementing disruptive innovations requires change management to help steward the identification ... systems and environments for Air Force education benefits goes beyond the identification and analysis of emerging horizons. Processes and methods ... scene, a patrol area, or a suspect lineup ("Augmented-reality," 2010). Connection to Innovation Triangle. The concepts of LVC and AR are quickly ...
Applying AR technology with a projector-camera system in a history museum
NASA Astrophysics Data System (ADS)
Miyata, Kimiyoshi; Shiroishi, Rina; Inoue, Yuka
2011-01-01
In this research, an AR (augmented reality) technology with a projector-camera system is proposed for a history museum to provide a user-friendly interface and a pseudo hands-on exhibition. The proposed system is a desktop application designed around old Japanese coins to enhance visitors' interest and motivation to investigate them. The old coins are too small for their features to be recognized easily, and their surfaces have fine structures on both sides, so it is meaningful to show the reverse side and an enlarged image of the coins to visitors in order to enhance their interest and motivation. The image of the reverse side of a coin is displayed, based on AR technology, when the user flips the corresponding AR marker. The information augmenting the coins is projected with a data projector and placed next to the coins. The proposed system contributes to developing an exhibition method based on the combination of real artifacts and AR technology, and it demonstrated the flexibility and capability to offer background information relating to the old Japanese coins. However, improvements in the accuracy of marker detection and tracking, as well as a visitor evaluation survey, are required to improve the effectiveness of the system.
Mobile augmented reality for computer-assisted percutaneous nephrolithotomy.
Müller, Michael; Rassweiler, Marie-Claire; Klein, Jan; Seitel, Alexander; Gondan, Matthias; Baumhauer, Matthias; Teber, Dogu; Rassweiler, Jens J; Meinzer, Hans-Peter; Maier-Hein, Lena
2013-07-01
Percutaneous nephrolithotomy (PCNL) plays an integral role in the treatment of renal stones. Creating percutaneous renal access is the most important and challenging step in the procedure. To facilitate this step, we evaluated our novel mobile augmented reality (AR) system for its feasibility of use for PCNL. A tablet computer, such as an iPad, is positioned above the patient with its camera pointing toward the field of intervention. The images of the tablet camera are registered with the CT image by means of fiducial markers. Structures of interest can be superimposed semi-transparently on the video images. We present a systematic evaluation by means of a phantom study. A urological trainee and two experts conducted 53 punctures on kidney phantoms. The trainee performed best with the proposed AR system in terms of puncturing time (mean: 99 s), whereas the experts performed best with fluoroscopy (mean: 59 s). iPad assistance lowered radiation exposure by a factor of 3 for the inexperienced physician and by a factor of 1.8 for the experts in comparison with fluoroscopy usage. We achieve a mean visualization accuracy of 2.5 mm. The proposed tablet computer-based AR system has proven helpful in assisting percutaneous interventions such as PCNL and shows benefits compared to other state-of-the-art assistance systems. A drawback of the system in its current state is the lack of depth information. Despite that, the simple integration into the clinical workflow highlights the potential impact of this approach on such interventions.
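A minimal sketch of the fiducial-based registration described above, assuming the marker positions have been picked in the CT volume (in millimetres) and their centroids detected in the tablet camera image; the coordinates and camera intrinsics below are placeholders. solvePnP then yields the CT-to-camera pose used to superimpose structures of interest on the video.

```python
import cv2
import numpy as np

# Placeholder fiducial coordinates: 3D in the CT frame (mm), 2D in the tablet image (px).
ct_points = np.array([[0, 0, 0], [60, 0, 0], [60, 40, 0], [0, 40, 5], [30, 20, 10]],
                     dtype=np.float32)
img_points = np.array([[310, 240], [520, 236], [515, 380], [305, 388], [410, 310]],
                      dtype=np.float32)

# Placeholder pinhole intrinsics of the tablet camera (fx, fy, cx, cy in pixels).
K = np.array([[1400, 0, 640],
              [0, 1400, 360],
              [0,    0,   1]], dtype=np.float32)
dist = np.zeros(5)          # assume the image has already been undistorted

ok, rvec, tvec = cv2.solvePnP(ct_points, img_points, K, dist, flags=cv2.SOLVEPNP_ITERATIVE)

def project_ct_point(p_ct_mm):
    """Project a 3D point given in CT coordinates into the live video image."""
    pts, _ = cv2.projectPoints(np.asarray(p_ct_mm, dtype=np.float32).reshape(1, 3),
                               rvec, tvec, K, dist)
    return pts[0, 0]

# Example: overlay a planned puncture target (CT coordinates in mm).
print(project_ct_point([35.0, 22.0, -80.0]))
```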
NASA Astrophysics Data System (ADS)
Sudra, Gunther; Speidel, Stefanie; Fritz, Dominik; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger
2007-03-01
Minimally invasive surgery is a highly complex medical discipline with various risks for surgeon and patient, but it also has numerous advantages on the patient's side. The surgeon has to adopt special operating techniques and deal with difficulties such as complex hand-eye coordination, a limited field of view and restricted mobility. To alleviate these problems, we propose to support the surgeon's spatial cognition by using augmented reality (AR) techniques to visualize virtual objects directly in the surgical site. In order to generate intelligent support, an intraoperative assistance system is necessary that recognizes the surgical skills being exercised during the intervention and provides context-aware assistance to the surgeon using AR techniques. With MEDIASSIST we bundle our research activities in the field of intraoperative intelligent support and visualization. Our experimental setup consists of a stereo endoscope, an optical tracking system and a head-mounted display for 3D visualization. The framework will be used as a platform for the development and evaluation of our research in the fields of skill recognition and context-aware assistance generation. This includes methods for surgical skill analysis, skill classification and context interpretation, as well as assistive visualization and interaction techniques. In this paper we present the objectives of MEDIASSIST and first results in the fields of skill analysis, visualization and multi-modal interaction. In particular, we present markerless instrument tracking for surgical skill analysis as well as visualization techniques and the recognition of interaction gestures in an AR environment.
Tang, Rui; Ma, Longfei; Xiang, Canhong; Wang, Xuedong; Li, Ang; Liao, Hongen; Dong, Jiahong
2017-09-01
Patients who undergo hilar cholangiocarcinoma (HCAC) resection with concomitant hepatectomy have a high risk of postoperative morbidity and mortality due to surgical trauma to the hepatic and biliary vasculature. A 58-year-old Chinese man presented with yellowing skin and sclera, abdominal distension, pruritus, and anorexia of approximately 3 weeks' duration. Magnetic resonance cholangiopancreatography and enhanced computed tomography (CT) scanning revealed a mass over the biliary tree at the porta hepatis, which was diagnosed as a hilar cholangiocarcinoma. Three-dimensional (3D) images of the patient's hepatic and biliary structures were reconstructed preoperatively from CT data, and these images were used for preoperative planning and augmented reality (AR)-assisted intraoperative navigation during open HCAC resection with hemihepatectomy. A 3D-printed model of the patient's biliary structures was also used intraoperatively as a visual reference. No serious postoperative complications occurred, and the patient was tumor-free at the 9-month follow-up examination based on CT results. AR-assisted preoperative planning and intraoperative navigation might also benefit other patients with HCAC by reducing postoperative complications and promoting disease-free survival. In our postoperative analysis, we also found that, when the 3D images were superimposed on the 3D-printed model using a see-through integral videography display device, our sense of depth perception and motion parallax was improved compared with what we had experienced intraoperatively using the video-based AR display system.
NASA Astrophysics Data System (ADS)
Cieśliński, Wojciech B.; Sobecki, Janusz; Piepiora, Paweł A.; Piepiora, Zbigniew N.; Witkowski, Kazimierz
2016-04-01
Mental training (Galloway, 2011) is one of the measures of psychological preparation in sport. A discipline such as judo particularly requires mental training, because judo is a combat sport involving the direct physical confrontation of two opponents. Hence mental preparation should be an essential element of preparing for the sporting fight. The article describes the basics of AR systems and presents selected elements of such systems: Vuzix smart glasses, the Kinect sensor and the Multitap interactive floor. Next, scenarios for using AR in mental training are proposed, based on using both the Vuzix head-worn glasses and the Multitap interactive floor. All variants, except the last, involve the Kinect sensor. In addition, these variants differ with respect to the primary user of the system: it can be the competitor, the coach, or both at the same time. At the end of the article, methods are presented for examining the effectiveness, usefulness and/or User Experience of the proposed prototypes. Three educational training simulator prototype models in sport (judo) are presented, and their functionality is described based on the theory of sports training (the cyclical nature of sports training) and the theory of subtle interactions, enabling an explanation of the effects of sports training using augmented reality technology.
Jedi training: playful evaluation of head-mounted augmented reality display systems
NASA Astrophysics Data System (ADS)
Ozbek, Christopher S.; Giesler, Bjorn; Dillmann, Ruediger
2004-05-01
A fundamental decision in building augmented reality (AR) systems is how to accomplish the combining of the real and virtual worlds. Nowadays this key question boils down to two alternatives: video see-through (VST) vs. optical see-through (OST). Both systems have advantages and disadvantages in areas such as production simplicity, resolution, flexibility in composition strategies, field of view, etc. To provide additional decision criteria for high-dexterity, high-accuracy tasks and subjective user acceptance, a gaming environment inspired by the Star Wars movies was programmed that allowed good evaluation of hand-eye coordination. During an experimentation session with more than thirty participants, a preference for optical see-through glasses in conjunction with infrared tracking was found. In particular, the high computational demand of video capture and processing, and the resulting drop in frame rate, emerged as a key weakness of the VST system.
NASA Technical Reports Server (NTRS)
Schulte, Erin
2017-01-01
As augmented and virtual reality grows in popularity, and more researchers focus on its development, other fields of technology have grown in the hopes of integrating with the up-and-coming hardware currently on the market. Namely, there has been a focus on how to make an intuitive, hands-free human-computer interaction (HCI) utilizing AR and VR that allows users to control their technology with little to no physical interaction with hardware. Computer vision, which is utilized in devices such as the Microsoft Kinect, webcams and other similar hardware has shown potential in assisting with the development of a HCI system that requires next to no human interaction with computing hardware and software. Object and facial recognition are two subsets of computer vision, both of which can be applied to HCI systems in the fields of medicine, security, industrial development and other similar areas.
Zeng, Hong; Wang, Yanxin; Wu, Changcheng; Song, Aiguo; Liu, Jia; Ji, Peng; Xu, Baoguo; Zhu, Lifeng; Li, Huijun; Wen, Pengcheng
2017-01-01
A brain-machine interface (BMI) can be used to control a robotic arm to assist paralyzed people in performing activities of daily living. However, it is still a complex task for BMI users to control the process of grasping and lifting objects with the robotic arm, and it is hard to achieve high efficiency and accuracy even after extensive training. One important reason is the lack of sufficient feedback information for the user to perform closed-loop control. In this study, we propose an augmented reality (AR) guiding-assistance method that provides enhanced visual feedback to the user for closed-loop control with a hybrid Gaze-BMI, which combines an electroencephalography (EEG)-signal-based BMI and eye tracking for intuitive and effective control of the robotic arm. Experiments on object manipulation tasks, performed while avoiding an obstacle in the workspace, were designed to evaluate the performance of our method for controlling the robotic arm. According to the experimental results obtained from eight subjects, the advantages of the proposed closed-loop system (with AR feedback) over the open-loop system (with visual inspection only) were verified. The number of trigger commands used for controlling the robotic arm to grasp and lift the objects with AR feedback was reduced significantly, and the height gaps of the gripper in the lifting process decreased by more than 50% compared to trials with normal visual inspection only. The results reveal that the hybrid Gaze-BMI user can benefit from the information provided by the AR interface, improving efficiency and reducing cognitive load during the grasping and lifting processes. PMID:29163123
Wang, Huixiang; Wang, Fang; Leong, Anthony Peng Yew; Xu, Lu; Chen, Xiaojun; Wang, Qiugen
2016-09-01
Augmented reality (AR) enables superimposition of virtual images onto the real world. The aim of this study is to present a novel AR-based navigation system for sacroiliac screw insertion and to evaluate its feasibility and accuracy in cadaveric experiments. Six cadavers with intact pelvises were employed in our study. They were CT scanned and the pelvis and vessels were segmented into 3D models. The ideal trajectory of the sacroiliac screw was planned and represented visually as a cylinder. For the intervention, the head mounted display created a real-time AR environment by superimposing the virtual 3D models onto the surgeon's field of view. The screws were drilled into the pelvis as guided by the trajectory represented by the cylinder. Following the intervention, a repeat CT scan was performed to evaluate the accuracy of the system, by assessing the screw positions and the deviations between the planned trajectories and inserted screws. Post-operative CT images showed that all 12 screws were correctly placed with no perforation. The mean deviation between the planned trajectories and the inserted screws was 2.7 ± 1.2 mm at the bony entry point, 3.7 ± 1.1 mm at the screw tip, and the mean angular deviation between the two trajectories was 2.9° ± 1.1°. The mean deviation at the nerve root tunnels region on the sagittal plane was 3.6 ± 1.0 mm. This study suggests an intuitive approach for guiding screw placement by way of AR-based navigation. This approach was feasible and accurate. It may serve as a valuable tool for assisting percutaneous sacroiliac screw insertion in live surgery.
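The accuracy metrics reported above (entry-point deviation, tip deviation and angular deviation between planned and inserted screws) reduce to simple vector geometry once both trajectories are expressed in the same CT frame. The sketch below, with placeholder coordinates, shows one way to compute them.

```python
import numpy as np

def trajectory_deviations(planned_entry, planned_tip, actual_entry, actual_tip):
    """Entry/tip deviations (same units as the inputs, e.g. mm) and the
    angular deviation (degrees) between a planned and an inserted screw axis."""
    planned_entry, planned_tip = np.asarray(planned_entry, float), np.asarray(planned_tip, float)
    actual_entry, actual_tip = np.asarray(actual_entry, float), np.asarray(actual_tip, float)

    entry_dev = np.linalg.norm(actual_entry - planned_entry)
    tip_dev = np.linalg.norm(actual_tip - planned_tip)

    # Unit direction vectors of both screw axes.
    d1 = (planned_tip - planned_entry) / np.linalg.norm(planned_tip - planned_entry)
    d2 = (actual_tip - actual_entry) / np.linalg.norm(actual_tip - actual_entry)
    angle_deg = np.degrees(np.arccos(np.clip(np.dot(d1, d2), -1.0, 1.0)))
    return entry_dev, tip_dev, angle_deg

# Placeholder CT coordinates (mm) of one planned vs. inserted sacroiliac screw.
print(trajectory_deviations([0, 0, 0], [0, 80, 0], [2.1, -1.0, 0.5], [3.0, 79.0, 2.0]))
```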
NASA Astrophysics Data System (ADS)
Lee, Sam; Lucas, Nathan P.; Ellis, R. Darin; Pandya, Abhilash
2012-06-01
This paper presents a seamlessly controlled human multi-robot system comprised of ground and aerial robots of a semi-autonomous nature for source localization tasks. The system combines augmented reality interface capabilities with the human supervisor's ability to control multiple robots. The role of this human multi-robot interface is to allow an operator to control groups of heterogeneous robots in real time in a collaborative manner. It uses advanced path-planning algorithms to ensure that obstacles are avoided and that the operators are free for higher-level tasks. Each robot knows the environment and obstacles and can automatically generate a collision-free path to any user-selected target. Sensor information from each individual robot is displayed directly on the robot in the video view. In addition, a sensor-data-fused AR view is displayed, which helps users pinpoint source information and supports the operator with the goals of the mission. The paper presents a preliminary Human Factors evaluation of this system in which several interface conditions are tested for source detection tasks. Results show that the novel Augmented Reality multi-robot control (Point-and-Go and Path Planning) reduced mission completion times compared to traditional joystick control for target detection missions. Usability tests and operator workload analysis are also investigated.
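The collision-free path generation mentioned above can be illustrated with a generic grid-based A* planner; this is a stand-in sketch, not the authors' planner, and the occupancy grid and start/goal cells are placeholders.

```python
import heapq

def astar(grid, start, goal):
    """Shortest collision-free path on a 4-connected occupancy grid.

    `grid[r][c]` is True for free cells and False for obstacles; `start` and
    `goal` are (row, col) tuples.  Returns the path as a list of cells, or
    None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if cost > g[cur]:           # stale heap entry
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]]:
                ng = cost + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
    return None

# Tiny example map: True = free, False = obstacle.
world = [[True, True, True, False],
         [False, False, True, False],
         [True, True, True, True]]
print(astar(world, (0, 0), (2, 3)))
```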
Computer-Based Technologies in Dentistry: Types and Applications
Albuha Al-Mussawi, Raja’a M.; Farid, Farzaneh
2016-01-01
During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems has resulted in new modalities for instruction and practice in dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. PMID:28392819
Computer-Based Technologies in Dentistry: Types and Applications.
Albuha Al-Mussawi, Raja'a M; Farid, Farzaneh
2016-06-01
During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems has resulted in new modalities for instruction and practice in dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice.
ERIC Educational Resources Information Center
Folta, Elizabeth Eason
2010-01-01
In an effort to get children back outdoors and exploring the natural environment, a Modular Serious Educational Game (mSEG), Red Wolf Caper, was created as part of a design-based research study. Red Wolf Caper uses a combination of an augmented reality (AR) game and a serious educational game (SEG) to capture the students' interest in the natural…
Military Applications of Augmented Reality
2011-01-01
Perception and Occlusion Representation. Among the things our initial domain analysis [Gabbard et al. (2002)] indicated as a potential advantage for AR ... Steven Feiner, Blaine Bell, Deborah Hix, Joseph L. Gabbard, Tobias Höllerer, Blair MacIntyre, Enylton Coelho, Ulrich Neumann, Suya You, Reinhold ... Annual Meeting, Human Factors and Ergonomics Society, pp 48-52. [Gabbard et al. (2002)] Gabbard JL, Swan II JE, Hix D, Lanzagorta M, Livingston MA, Brown D.
Augmented Reality Based Navigation for Computer Assisted Hip Resurfacing: A Proof of Concept Study.
Liu, He; Auvinet, Edouard; Giles, Joshua; Rodriguez Y Baena, Ferdinando
2018-05-23
Implantation accuracy has a great impact on the outcomes of hip resurfacing such as recovery of hip function. Computer assisted orthopedic surgery has demonstrated clear advantages for the patients, with improved placement accuracy and fewer outliers, but the intrusiveness, cost, and added complexity have limited its widespread adoption. To provide seamless computer assistance with improved immersion and a more natural surgical workflow, we propose an augmented-reality (AR) based navigation system for hip resurfacing. The operative femur is registered by processing depth information from the surgical site with a commercial depth camera. By coupling depth data with robotic assistance, obstacles that may obstruct the femur can be tracked and avoided automatically to reduce the chance of disruption to the surgical workflow. Using the registration result and the pre-operative plan, intra-operative surgical guidance is provided through a commercial AR headset so that the user can perform the operation without additional physical guides. To assess the accuracy of the navigation system, experiments of guide hole drilling were performed on femur phantoms. The position and orientation of the drilled holes were compared with the pre-operative plan, and the mean errors were found to be approximately 2 mm and 2°, results which are in line with commercial computer assisted orthopedic systems today.
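Registering the operative femur from depth-camera data, as described above, is essentially a point-cloud-to-model registration problem. The sketch below uses Open3D's point-to-plane ICP as a stand-in; the file names, voxel size and the assumption of an existing coarse initial alignment are placeholders, not the authors' pipeline.

```python
import numpy as np
import open3d as o3d

# Placeholder inputs: the pre-operative femur model and a depth-camera scan of the surgical site.
model = o3d.io.read_point_cloud("femur_model.ply")        # from pre-operative imaging/segmentation
scan = o3d.io.read_point_cloud("depth_camera_scan.ply")   # from the commercial depth camera

# Downsample and estimate normals for point-to-plane ICP.
voxel = 2.0  # mm
model_d = model.voxel_down_sample(voxel)
scan_d = scan.voxel_down_sample(voxel)
for pc in (model_d, scan_d):
    pc.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=30))

init = np.eye(4)  # assumes a coarse initial alignment has already been applied
result = o3d.pipelines.registration.registration_icp(
    scan_d, model_d, max_correspondence_distance=3 * voxel, init=init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())

print("fitness:", result.fitness, "RMSE:", result.inlier_rmse)
print(result.transformation)   # scan -> model (i.e. camera -> femur) rigid transform
```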
Augmented reality in laparoscopic surgical oncology.
Nicolau, Stéphane; Soler, Luc; Mutter, Didier; Marescaux, Jacques
2011-09-01
Minimally invasive surgery represents one of the main evolutions of surgical techniques aimed at providing a greater benefit to the patient. However, minimally invasive surgery increases the operative difficulty, since depth perception is usually dramatically reduced, the field of view is limited and the sense of touch is transmitted by an instrument. These drawbacks can now be reduced by computer technology guiding the surgical gesture. Indeed, from a patient's medical image (US, CT or MRI), Augmented Reality (AR) can increase the surgeon's intra-operative vision by providing a virtual transparency of the patient. AR is based on two main processes: the 3D visualization of the anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. 3D visualization can be performed directly from the medical image, without the need for a pre-processing step, thanks to volume rendering. Better results are obtained, however, with surface rendering after organ and pathology delineation and 3D modelling. Registration can be performed interactively or automatically. Several interactive systems have been developed and applied to humans, demonstrating the benefit of AR in surgical oncology. These experiences also show that interactivity is currently limited by soft-organ movement and the interaction between surgical instruments and organs. Although current automatic AR systems demonstrate the feasibility of such an approach, they still rely on specific and expensive equipment that is not available in routine clinical practice. Moreover, they are not robust enough, due to the high complexity of developing real-time registration that takes organ deformation and human movement into account. However, the latest results of automatic AR systems are extremely encouraging and suggest that AR will become a standard requirement for future computer-assisted surgical oncology. In this article, we explain the concept of AR and its principles. We then review the existing interactive and automatic AR systems in digestive surgical oncology, highlighting their benefits and limitations. Finally, we discuss future evolutions and the issues that still have to be tackled so that this technology can be seamlessly integrated in the operating room. Copyright © 2011 Elsevier Ltd. All rights reserved.
Chen, Long; Tang, Wen; John, Nigel W; Wan, Tao Ruan; Zhang, Jian Jun
2018-05-01
While Minimally Invasive Surgery (MIS) offers considerable benefits to patients, it also imposes significant challenges on a surgeon's performance due to well-known issues and restrictions associated with the field of view (FOV), hand-eye misalignment and disorientation, as well as the lack of stereoscopic depth perception in monocular endoscopy. Augmented Reality (AR) technology can help to overcome these limitations by augmenting the real scene with annotations, labels, tumour measurements or even a 3D reconstruction of anatomy structures at the target surgical locations. However, previous research attempts at using AR technology in monocular MIS surgical scenes have mainly focused on the information overlay without addressing correct spatial calibration, which could lead to incorrect localization of annotations and labels, and inaccurate depth cues and tumour measurements. In this paper we present a novel intra-operative dense surface reconstruction framework that is capable of providing geometry information from only monocular MIS videos for geometry-aware AR applications such as site measurements and depth cues. We address a number of compelling issues in augmenting a scene for a monocular MIS environment, such as drifting and inaccurate planar mapping. A state-of-the-art Simultaneous Localization And Mapping (SLAM) algorithm used in robotics has been extended to deal with monocular MIS surgical scenes for reliable endoscopic camera tracking and salient point mapping. A robust global 3D surface reconstruction framework has been developed for building a dense surface using only unorganized sparse point clouds extracted from the SLAM. The 3D surface reconstruction framework employs the Moving Least Squares (MLS) smoothing algorithm and the Poisson surface reconstruction framework for real-time processing of the point cloud data set. Finally, the 3D geometric information of the surgical scene allows better understanding and accurate placement of AR augmentations based on a robust 3D calibration. We demonstrate the clinical relevance of our proposed system through two examples: (a) measurement of the surface; (b) depth cues in monocular endoscopy. The performance and accuracy evaluations of the proposed framework consist of two steps. First, we created a computer-generated endoscopy simulation video to quantify the accuracy of the camera tracking by comparing the results of the video camera tracking with the recorded ground-truth camera trajectories. The accuracy of the surface reconstruction is assessed by evaluating the Root Mean Square Distance (RMSD) of surface vertices of the reconstructed mesh with that of the ground-truth 3D models. An error of 1.24 mm for the camera trajectories has been obtained and the RMSD for surface reconstruction is 2.54 mm, which compare favourably with previous approaches. Second, in vivo laparoscopic videos are used to examine the quality of accurate AR-based annotation and measurement, and the creation of depth cues. These results show the potential promise of our geometry-aware AR technology to be used in MIS surgical scenes. The results show that the new framework is robust and accurate in dealing with challenging situations such as rapid endoscopy camera movements in monocular MIS scenes. Both camera tracking and surface reconstruction based on a sparse point cloud are effective and operate in real time.
This demonstrates the potential of our algorithm for accurate AR localization and depth augmentation with geometric cues and correct surface measurements in MIS with monocular endoscopes. Copyright © 2018 Elsevier B.V. All rights reserved.
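The surface-building stage described in the abstract (smoothing a sparse SLAM point cloud and fitting a dense surface) can be approximated with off-the-shelf tools. The sketch below uses Open3D's outlier removal, normal estimation and Poisson reconstruction as stand-ins for the paper's MLS-plus-Poisson pipeline; the file name and parameters are assumptions for illustration.

```python
import numpy as np
import open3d as o3d

# Placeholder: unorganized sparse map points exported from the SLAM front end.
pcd = o3d.io.read_point_cloud("slam_map_points.ply")

# Light outlier removal and normal estimation (normals are required by Poisson reconstruction).
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=5.0, max_nn=30))
pcd.orient_normals_consistent_tangent_plane(k=15)

# Poisson surface reconstruction; modest depth because the input cloud is sparse.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)

# Trim low-density vertices, which correspond to poorly supported surface areas.
d = np.asarray(densities)
keep = d > np.quantile(d, 0.05)
mesh.remove_vertices_by_mask(~keep)

o3d.io.write_triangle_mesh("reconstructed_surface.ply", mesh)
```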
Riva, Giuseppe; Baños, Rosa M.; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea
2016-01-01
During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual’s worldview. PMID:27746747
NASA Astrophysics Data System (ADS)
Lan, Lu; Liu, Kaiming; Xia, Yan; Wu, Jiayingzi; Li, Rui; Wang, Pu; Han, Linda K.; Cheng, Ji-Xin
2017-02-01
Breast-conserving surgery is a well-accepted breast cancer treatment. However, it is still challenging for the surgeon to accurately localize the tumor during the surgery. Moreover, the guidance provided by current methods is one-dimensional distance information, which is indirect and not intuitive. This contributes to a high re-excision rate and a prolonged surgical time. To solve these problems, we have developed a fiber-delivered optoacoustic guide (OG), which mimics the traditional localization guide wire and is preoperatively placed into the tumor mass, and an augmented reality (AR) system that provides real-time visualization of the location of the tumor with sub-millimeter variance. By means of a nano-composite light diffusion sphere and a light-absorbing layer formed on the tip of an optical fiber, the OG creates an omnidirectional acoustic source inside the tumor mass under pulsed laser excitation. The generated optoacoustic signal has a high dynamic range (58 dB) and spreads over a large apex angle of 320 degrees. An acoustic radar with three ultrasound transducers is then attached to the breast skin and triangulates the location of the OG tip. With an AR system sensing the location of the acoustic radar, the position of the OG tip inside the tumor relative to the AR display is calculated and rendered. This provides direct visual feedback of the tumor location to surgeons, which will greatly ease surgical planning during the operation and save surgical time. A proof-of-concept experiment using a tablet and a stereo-vision camera is demonstrated, and 0.25 mm tracking variance is achieved.
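The "acoustic radar" described above localizes the optoacoustic guide tip from its distances to three skin-mounted transducers. A minimal sketch of that triangulation step follows, assuming the transducer positions are known in a common frame and the distances have already been derived from the measured times of flight (distance = time of flight × speed of sound, roughly 1540 m/s in soft tissue); the function name and variables are illustrative, not from the paper.

```python
# Minimal trilateration sketch (illustrative, not the authors' implementation).
import numpy as np
from scipy.optimize import least_squares

def locate_tip(transducers_xyz, distances, x0=None):
    """Estimate the 3D position of the optoacoustic source from its
    distances (same units as the coordinates) to three or more transducers."""
    P = np.asarray(transducers_xyz, dtype=float)   # (n, 3) transducer positions
    d = np.asarray(distances, dtype=float)         # (n,) measured distances
    if x0 is None:
        x0 = P.mean(axis=0)                        # start at the transducer centroid

    def residuals(x):
        return np.linalg.norm(P - x, axis=1) - d

    return least_squares(residuals, x0).x          # (3,) estimated tip position
```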
Tang, Rui; Ma, Longfei; Xiang, Canhong; Wang, Xuedong; Li, Ang; Liao, Hongen; Dong, Jiahong
2017-01-01
Rationale: Patients who undergo hilar cholangiocarcinoma (HCAC) resection with concomitant hepatectomy have a high risk of postoperative morbidity and mortality due to surgical trauma to the hepatic and biliary vasculature. Patient concerns: A 58-year-old Chinese man presented with yellowing skin and sclera, abdominal distension, pruritus, and anorexia for approximately 3 weeks. Diagnoses: Magnetic resonance cholangiopancreatography and enhanced computed tomography (CT) scanning revealed a mass over the biliary tree at the porta hepatis, which was diagnosed as a hilar cholangiocarcinoma. Intervention: Three-dimensional (3D) images of the patient's hepatic and biliary structures were reconstructed preoperatively from CT data, and the 3D images were used for preoperative planning and augmented reality (AR)-assisted intraoperative navigation during open HCAC resection with hemihepatectomy. A 3D-printed model of the patient's biliary structures was also used intraoperatively as a visual reference. Outcomes: No serious postoperative complications occurred, and the patient was tumor-free at the 9-month follow-up examination based on CT results. Lessons: AR-assisted preoperative planning and intraoperative navigation might be beneficial in other patients with HCAC to reduce postoperative complications and ensure disease-free survival. In our postoperative analysis, we also found that, when the 3D images were superimposed on the 3D-printed model using a see-through integral videography display device, our sense of depth perception and motion parallax was improved compared with what we had experienced intraoperatively using the video-based AR display system. PMID:28906410
Generating classes of 3D virtual mandibles for AR-based medical simulation.
Hippalgaonkar, Neha R; Sider, Alexa D; Hamza-Lup, Felix G; Santhanam, Anand P; Jaganathan, Bala; Imielinska, Celina; Rolland, Jannick P
2008-01-01
Simulation and modeling represent promising tools for several application domains, from engineering to forensic science and medicine. Advances in 3D imaging technology bring paradigms such as augmented reality (AR) and mixed reality into promising simulation tools for the training industry. Motivated by the requirement to superimpose anatomically correct 3D models on a human patient simulator (HPS) and visualize them in an AR environment, the purpose of this research effort was to develop and validate a method for scaling a source human mandible to a target human mandible within a 2 mm root mean square (RMS) error. Results show that, given the distance between two corresponding landmarks on two different mandibles, a relative scaling factor can be computed. Using this scaling factor, results show that a 3D virtual mandible model can be made morphometrically equivalent to a real target-specific mandible within a 1.30 mm RMS error. The virtual mandible may be further used as a reference target for registering other anatomic models, such as the lungs, on the HPS. Such registration will be made possible by the physical constraints between the mandible and the spinal column in the horizontal normal rest position.
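The scaling procedure reported above reduces to computing a relative scale factor from the distance between two corresponding landmarks and checking the result against an RMS threshold. A minimal sketch under that reading follows; it assumes the source and target mandible meshes are already rigidly aligned and share vertex correspondence, which the abstract does not spell out.

```python
# Minimal sketch of landmark-based scaling and an RMS check (assumptions noted above).
import numpy as np

def scale_factor(src_lm_a, src_lm_b, tgt_lm_a, tgt_lm_b):
    """Relative scale from the distance between the same two landmarks."""
    return (np.linalg.norm(np.subtract(tgt_lm_b, tgt_lm_a))
            / np.linalg.norm(np.subtract(src_lm_b, src_lm_a)))

def rms_distance(verts_a, verts_b):
    """Root mean square distance between corresponding vertices (e.g. in mm)."""
    diff = np.asarray(verts_a, dtype=float) - np.asarray(verts_b, dtype=float)
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# Usage: scaled = source_vertices * scale_factor(a_src, b_src, a_tgt, b_tgt)
#        assert rms_distance(scaled, target_vertices) < 2.0   # 2 mm RMS goal
```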
An independent shopping experience for wheelchair users through augmented reality and RFID.
Rashid, Zulqarnain; Pous, Rafael; Norrie, Christopher S
2017-06-23
People with physical and mobility impairments continue to struggle to attain independence in the performance of routine activities and tasks. For example, browsing in a store and interacting with products located beyond an arm's length may be impossible without the enabling intervention of a human assistant. This research article describes a study undertaken to design, develop, and evaluate potential interaction methods for motor-impaired individuals, specifically those who use wheelchairs. Our study includes a user-centered approach, and a categorization of wheelchair users based upon the severity of their disability and their individual needs. We designed and developed access solutions that utilize radio frequency identification (RFID), augmented reality (AR), and touchscreen technologies in order to help people who use wheelchairs to carry out certain tasks autonomously. In this way, they have been empowered to go shopping independently, free from reliance upon the assistance of others. A total of 18 wheelchair users participated in the completed study.
On the use of virtual and augmented reality for upper limb prostheses training and simulation.
Lamounier, Edgard; Lopes, Kenedy; Cardoso, Alexandre; Andrade, Adriano; Soares, Alcimar
2010-01-01
Accidents happen, and unfortunately people may lose part of their limbs. Studies have shown that, in such cases, most individuals suffer physically and psychologically. For this reason, actions to restore the patient's freedom and mobility are imperative. Traditional solutions require ways to adapt the individual to prosthetic devices. The same applies to patients with congenital limitations. However, one of the major difficulties faced by those who are fitted with these devices is the great mental effort needed during the first stages of training. As a result, a meaningful number of patients give up using these devices very soon. Thus, this article reports on a solution designed by the authors to help patients during the learning phases without actually having to wear the prosthesis. The solution uses Virtual Reality (VR) and Augmented Reality (AR) techniques to mimic the prosthesis's natural counterpart. It is therefore expected that problems such as weight, heat and pain will not add to an already hard task.
Feasibility study on the readiness, suitability and acceptance of M-Learning AR in learning History
NASA Astrophysics Data System (ADS)
Taharim, Nurwahida Faradila; Lokman, Anitawati Mohd; Hanesh, Amjad; Aziz, Azhar Abd
2016-02-01
There is no doubt that globalization and innovation in technology have led to the widespread use of technology in almost all sectors, including education. In recent years, the use of technology in education has expanded more widely and rapidly worldwide. The integration of technology in education continually opens new opportunities, and past studies have shown that technology enhances the teaching and learning experience. Various technologies have been integrated into the various disciplines of education. Augmented Reality (AR) in mobile learning, which allows a combination of the real and virtual worlds on a mobile device, is one of the latest technologies with educational potential and has been applied in the field of education. The aim of this research work is to mitigate the challenges faced by end users, namely students and teachers of history classes, by creating an Augmented Reality mobile application to increase interest in both delivering and receiving the subject matter. The system consists of a mobile platform for students to view the content and a cloud-based engine that delivers the content based on a recognized marker. The impact of the system has been tested on both students and teachers: students showed interest in learning history, while teachers expressed interest in extending and adopting the system school-wide.
Current status of robotic simulators in acquisition of robotic surgical skills.
Kumar, Anup; Smith, Roger; Patel, Vipul R
2015-03-01
This article provides an overview of the current status of simulator systems in the robotic surgery training curriculum, covering the available training simulators and their comparison, new technologies introduced in simulation, concepts of training, and the existing challenges and future perspectives of simulator training in robotic surgery. The different virtual reality simulators available on the market, such as dVSS, dVT, RoSS, ProMIS and SEP, have shown face, content and construct validity in robotic skills training for novices outside the operating room. Recently, augmented reality simulators such as HoST, Maestro AR and RobotiX Mentor have been introduced into robotic training, providing a more realistic operating environment and placing more emphasis on procedure-specific robotic training. Further, the Xperience Team Trainer, which trains the console surgeon and the bed-side assistant simultaneously, has recently been introduced to emphasize the importance of teamwork and proper coordination. Simulator training holds an important place in the current robotic training curriculum of future robotic surgeons. There is a need for more procedure-specific augmented reality simulator training that utilizes advances in computing and graphical capabilities for new innovations in simulator technology. Further studies are required to establish its cost-benefit ratio along with concurrent and predictive validity.
Social Behaviors and Active Videogame Play in Children with Autism Spectrum Disorder.
Chung, Peter J; Vanderbilt, Douglas L; Soares, Neelkamal S
2015-06-01
Children with autism spectrum disorder (ASD) often display problematic and excessive videogame play. Using active videogames (AVGs) may have physical benefits, but its effects on socialization are unknown. We conducted an A-B-A' experiment comparing sedentary videogames and AVGs for three dyads of a child with ASD and his sibling. An augmented reality (AR) game was used to introduce AVGs. Sessions were coded for communication, positive affect, and aggression. One dyad had increases in positive affect with AVGs. Otherwise, social behaviors were unchanged or worse. The AR game demonstrated consistent elevations in social behaviors. Use of AVGs has inconsistent effects on social behavior for children with ASD. Further research is needed to understand mediators of response to AVGs. AR games should be evaluated for potential benefits on socialization and positive affect.
Real-time augmented reality overlay for an energy-efficient car study
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Javahiraly, Nicolas; Curticapean, Dan
2017-06-01
Our university carries out various research projects. Among others, the Schluckspecht project is interdisciplinary work on different ultra-efficient car concepts for international contests. Besides the engineering work, one part of the project deals with real-time data visualization. In order to increase the efficiency of the vehicle, online monitoring of the runtime parameters is necessary. The driving parameters of the vehicle are transmitted to a processing station via a wireless network connection. We plan to use an augmented reality (AR) application to visualize different data on top of the view of the real car. Using a mobile Android or iOS device, a user can interactively view various real-time and statistical data. The car and its components are meant to be augmented with various additional information, which should appear at the correct position on the corresponding components. An engine, for example, could show the current rpm and consumption values, while a battery could show the current charge level. The goal of this paper is to evaluate different possible approaches and their suitability, and to expand our application to other projects at our university.
Ahmadvand, Alireza; Drennan, Judy; Burgess, Jean; Clark, Michele; Kavanagh, David; Burns, Kara; Howard, Sarah; Kelly, Fleur; Campbell, Chris; Nissen, Lisa
2018-01-01
Introduction: Low health literacy is common in people with type 2 diabetes mellitus (T2DM) (up to 40%) and is associated with decreased self-efficacy in managing T2DM and its important complications, mainly hypertension. This study introduces, for the first time, an easy-to-use solution based on augmented reality (AR) on smartphones to enhance health literacy around antihypertensive medicines. It assesses the feasibility of the solution for improving health literacy, oriented specifically to angiotensin II receptor blockers; embedding the health literacy improvement into the use cycle of angiotensin II receptor blockers; and providing continuous access to information as a form of patient engagement. Methods and analysis: This is a technology evaluation study with one technology group (AR plus usual care) and one non-technology group (usual care). Both groups receive face-to-face communications with community pharmacists regarding angiotensin II receptor blockers; the technology group additionally receives AR-enhanced digital consumer medicine information throughout the use of their medications. The primary outcome is the change in health literacy, and the hypothesis is that the proportion of people who show high health literacy will be larger in the technology group. Mixed effects models will be used to analyse the effectiveness of the solution on outcomes. Multiple regression models will be used to find additional variables that might affect the relationship between health literacy and the AR solution. Ethics and dissemination: The Queensland University of Technology (QUT) Human Research Ethics Committee has approved the study as a low-risk technology evaluation study (approval number: 1700000275). Findings will be disseminated at scientific conferences and in peer-reviewed journals. Facilitated by QUT, two press releases have been published in public media and two presentations have been made in university classrooms. PMID:29705754
Improving Robotic Operator Performance Using Augmented Reality
NASA Technical Reports Server (NTRS)
Maida, James C.; Bowen, Charles K.; Pace, John W.
2007-01-01
The Special Purpose Dexterous Manipulator (SPDM) is a two-armed robot that functions as an extension to the end effector of the Space Station Robotics Manipulator System (SSRMS), currently in use on the International Space Station (ISS). Crew training for the SPDM is accomplished using a robotic hardware simulator, which performs most of SPDM functions under normal static Earth gravitational forces. Both the simulator and SPDM are controlled from a standard robotic workstation using a laptop for the user interface and three monitors for camera views. Most operations anticipated for the SPDM involve the manipulation, insertion, and removal of any of several types of Orbital Replaceable Unit (ORU), modules which control various ISS functions. Alignment tolerances for insertion of the ORU into its receptacle are 0.25 inch and 0.5 degree from nominal values. The pre-insertion alignment task must be performed within these tolerances by using available video camera views of the intrinsic features of the ORU and receptacle, without special registration markings. Since optimum camera views may not be available, and dynamic orbital lighting conditions may limit periods of viewing, a successful ORU insertion operation may require an extended period of time. This study explored the feasibility of using augmented reality (AR) to assist SPDM operations. Geometric graphical symbols were overlaid on one of the workstation monitors to afford cues to assist the operator in attaining adequate pre-insertion ORU alignment. Twelve skilled subjects performed eight ORU insertion tasks using the simulator with and without the AR symbols in a repeated measures experimental design. Results indicated that using the AR symbols reduced pre-insertion alignment error for all subjects and reduced the time to complete pre-insertion alignment for most subjects.
Registration using natural features for augmented reality systems.
Yuan, M L; Ong, S K; Nee, A Y C
2006-01-01
Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix is used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration for either indoor or outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method to estimate the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process. Some indoor and outdoor experiments have been conducted to validate the performance of this proposed method.
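The tracking-and-estimation loop described above (KLT feature tracking followed by robust estimation of a projective matrix that is then refined nonlinearly) maps closely onto standard OpenCV calls. The sketch below is an analogous pipeline, not the authors' code: cv2.findHomography with RANSAC plays the role of the robust projective-matrix estimate and internally applies a Levenberg-Marquardt refinement on the inliers, while the explicit point normalization the paper describes is left to OpenCV.

```python
# Analogous tracking/estimation sketch using OpenCV (not the authors' implementation).
import cv2
import numpy as np

def track_and_estimate(prev_gray, curr_gray, prev_pts):
    """KLT-track natural features and estimate the frame-to-frame homography."""
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
    ok = status.ravel() == 1
    src = prev_pts[ok].reshape(-1, 2)
    dst = curr_pts[ok].reshape(-1, 2)
    if len(src) < 6:                      # the paper requires at least six tracked features
        return None, dst.reshape(-1, 1, 2).astype(np.float32)
    H, _inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, dst.reshape(-1, 1, 2).astype(np.float32)

# Initial features, e.g.: prev_pts = cv2.goodFeaturesToTrack(prev_gray, 500, 0.01, 7)
```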
A Study on Software-based Sensing Technology for Multiple Object Control in AR Video
Jung, Sungmo; Song, Jae-gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo
2010-01-01
Research on Augmented Reality (AR) has recently received attention. Alongside this, the Machine-to-Machine (M2M) market has become active, and there are numerous efforts to apply the technology to real life in all sectors of society. To date, the M2M market has applied the existing marker-based AR technology in entertainment, business and other industries. With the existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. This creates a problem whereby the relevant marker should be extracted and printed on screen so that loading of multiple objects is enabled. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and thus the objects would not be augmented. To solve this problem, a circle having the longest radius needs to be created from the focal point of a marker to be copied, so that no object is copied within the confines of that circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed, and overlapping marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms. PMID:22163444
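The overlap rule described above (a circle of the longest radius drawn around each copied marker, inside which no further object may be placed) can be read as a simple greedy distance filter over detected marker centres. The sketch below follows that reading and is only illustrative; the paper's actual pipeline combines PPHT detection with Bresenham and Mean Shift steps that are not reproduced here.

```python
# Illustrative greedy filter for the marker-overlap rule (not the paper's code).
import math

def filter_marker_centres(centres, radius):
    """Keep a marker centre only if it lies outside the exclusion circle
    (of the given radius, in pixels) around every already accepted centre."""
    accepted = []
    for c in centres:
        if all(math.dist(c, a) > radius for a in accepted):
            accepted.append(c)
    return accepted

# Example: filter_marker_centres([(100, 120), (104, 118), (300, 240)], radius=50.0)
# -> [(100, 120), (300, 240)]
```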
A Survey of Mobile and Wireless Technologies for Augmented Reality Systems (Preprint)
2008-02-01
Windows XP. A number of researchers have started employing them in AR simulations such as Wagner et al [25], Newman et al [46] and specifically the Sony ...different music clubs and styles of music according to the selection and tastes of the listeners. In the intro sequence the user can select an animated...3-D character (avatar) as his or her virtual persona and visit the different music rooms in the virtual disco. Users can download or stream music in
An augmented reality tool for learning spatial anatomy on mobile devices.
Jain, Nishant; Youngblood, Patricia; Hasel, Matthew; Srivastava, Sakti
2017-09-01
Augmented Reality (AR) offers a novel method of blending virtual and real anatomy for intuitive spatial learning. Our first aim in the study was to create a prototype AR tool for mobile devices. Our second aim was to complete a technical evaluation of our prototype AR tool, focused on measuring the system's ability to accurately render digital content in the real world. We imported Computed Tomography (CT)-derived virtual surface models into a 3D Unity engine environment and implemented an AR algorithm to display these on mobile devices. We investigated the accuracy of the virtual renderings by comparing a physical cube with an identical virtual cube for dimensional accuracy. Our comparative study confirms that our AR tool renders 3D virtual objects with a high level of accuracy, as evidenced by the degree of similarity between measurements of the dimensions of a virtual object (a cube) and the corresponding physical object. We developed an inexpensive and user-friendly prototype AR tool for mobile devices that creates highly accurate renderings. This prototype demonstrates an intuitive, portable, and integrated interface for spatial interaction with virtual anatomical specimens. Integrating this AR tool with a library of CT-derived surface models provides a platform for spatial learning in the anatomy curriculum. The segmentation methodology implemented to optimize human CT data for mobile viewing can be extended to include anatomical variations and pathologies. The ability of this inexpensive educational platform to deliver a library of interactive 3D models to students worldwide demonstrates its utility as a supplemental teaching tool that could greatly benefit anatomical instruction. Clin. Anat. 30:736-741, 2017. © 2017 Wiley Periodicals, Inc.
LivePhantom: Retrieving Virtual World Light Data to Real Environments.
Kolivand, Hoshang; Billinghurst, Mark; Sunar, Mohd Shahrizal
2016-01-01
To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real-time depth detection to cast virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map of the physical scene, merged into a single real-time transparent tacit surface. Once this is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene, enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is demonstrated, and the findings are assessed using qualitative and quantitative methods, making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.
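A first step toward the depth-based phantoms described above is back-projecting the Kinect depth map into a 3D point cloud in the camera frame using the pinhole model. The sketch below shows only that step, with illustrative intrinsics (fx, fy, cx, cy); it is not the LivePhantom implementation, which goes on to reconstruct a surface and estimate lighting and shadows.

```python
# Minimal depth-map back-projection sketch (illustrative, not the LivePhantom code).
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (metres) into camera-frame 3D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.dstack((x, y, z)).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

# Example intrinsics in the ballpark of a Kinect v1 depth camera (assumed values):
# pts = depth_to_point_cloud(depth, fx=580.0, fy=580.0, cx=319.5, cy=239.5)
```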
Recent Development of Augmented Reality in Surgery: A Review.
Vávra, P; Roman, J; Zonča, P; Ihnát, P; Němec, M; Kumar, J; Habib, N; El-Gendi, A
2017-01-01
The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency, safety, and cost and to enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. We performed a review of the available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms "augmented reality" and "surgery." The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. By reading the abstracts, 91 relevant studies were chosen for inclusion; 11 further references were gathered by cross-referencing. A total of 102 studies were included in this review. The present literature suggests an increasing interest among surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality is implemented in routine practice.
Augmented Reality-Guided Lumbar Facet Joint Injections.
Agten, Christoph A; Dennler, Cyrill; Rosskopf, Andrea B; Jaberg, Laurenz; Pfirrmann, Christian W A; Farshad, Mazda
2018-05-08
The aim of this study was to assess feasibility and accuracy of augmented reality-guided lumbar facet joint injections. A spine phantom completely embedded in hardened opaque agar with 3 ring markers was built. A 3-dimensional model of the phantom was uploaded to an augmented reality headset (Microsoft HoloLens). Two radiologists independently performed 20 augmented reality-guided and 20 computed tomography (CT)-guided facet joint injections each: for each augmented reality-guided injection, the hologram was manually aligned with the phantom container using the ring markers. The radiologists targeted the virtual facet joint and tried to place the needle tip in the holographic joint space. Computed tomography was performed after each needle placement to document final needle tip position. Time needed from grabbing the needle to final needle placement was measured for each simulated injection. An independent radiologist rated images of all needle placements in a randomized order blinded to modality (augmented reality vs CT) and performer as perfect, acceptable, incorrect, or unsafe. Accuracy and time to place needles were compared between augmented reality-guided and CT-guided facet joint injections. In total, 39/40 (97.5%) of augmented reality-guided needle placements were either perfect or acceptable compared with 40/40 (100%) CT-guided needle placements (P = 0.5). One augmented reality-guided injection missed the facet joint space by 2 mm. No unsafe needle placements occurred. Time to final needle placement was substantially faster with augmented reality guidance (mean 14 ± 6 seconds vs 39 ± 15 seconds, P < 0.001 for both readers). Augmented reality-guided facet joint injections are feasible and accurate without potentially harmful needle placement in an experimental setting.
Learning Optimized Local Difference Binaries for Scalable Augmented Reality on Mobile Devices.
Xin Yang; Kwang-Ting Cheng
2014-06-01
The efficiency, robustness and distinctiveness of a feature descriptor are critical to the user experience and scalability of a mobile augmented reality (AR) system. However, existing descriptors are either too computationally expensive to achieve real-time performance on a mobile device such as a smartphone or tablet, or not sufficiently robust and distinctive to identify correct matches from a large database. As a result, current mobile AR systems still only have limited capabilities, which greatly restrict their deployment in practice. In this paper, we propose a highly efficient, robust and distinctive binary descriptor, called Learning-based Local Difference Binary (LLDB). LLDB directly computes a binary string for an image patch using simple intensity and gradient difference tests on pairwise grid cells within the patch. To select an optimized set of grid cell pairs, we densely sample grid cells from an image patch and then leverage a modified AdaBoost algorithm to automatically extract a small set of critical ones with the goal of maximizing the Hamming distance between mismatches while minimizing it between matches. Experimental results demonstrate that LLDB is extremely fast to compute and to match against a large database due to its high robustness and distinctiveness. Compared to the state-of-the-art binary descriptors, primarily designed for speed, LLDB has similar efficiency for descriptor construction, while achieving a greater accuracy and faster matching speed when matching over a large database with 2.3M descriptors on mobile devices.
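The descriptor construction sketched in the abstract (mean intensity and gradient values over grid cells, binary difference tests on cell pairs, Hamming-distance matching) can be illustrated with a few lines of NumPy. The version below is only a simplified stand-in: it picks the cell pairs at random, whereas LLDB learns an optimized pair set with a modified AdaBoost procedure, and it omits the sampling and scale details of the real descriptor.

```python
# Simplified LLDB-style descriptor (illustrative; real LLDB learns the pair set).
import numpy as np

def grid_cell_features(patch, grid=4):
    """Mean intensity and mean x/y gradients for each grid cell of a patch."""
    gy, gx = np.gradient(patch.astype(np.float32))
    h, w = patch.shape
    ch, cw = h // grid, w // grid
    feats = []
    for i in range(grid):
        for j in range(grid):
            sl = (slice(i * ch, (i + 1) * ch), slice(j * cw, (j + 1) * cw))
            feats.append([patch[sl].mean(), gx[sl].mean(), gy[sl].mean()])
    return np.asarray(feats)                          # shape (grid*grid, 3)

def binary_descriptor(patch, pairs, grid=4):
    """Binary tests (intensity, dx, dy) on the given grid-cell index pairs."""
    f = grid_cell_features(patch, grid)
    return (f[pairs[:, 0]] > f[pairs[:, 1]]).astype(np.uint8).ravel()

def hamming(d1, d2):
    return int(np.count_nonzero(d1 != d2))

# Random pair set as a stand-in for the AdaBoost-selected pairs (4x4 grid -> 16 cells):
rng = np.random.default_rng(0)
pairs = rng.choice(16, size=(128, 2))
pairs = pairs[pairs[:, 0] != pairs[:, 1]]
```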
Augmented reality guidance system for peripheral nerve blocks
NASA Astrophysics Data System (ADS)
Wedlake, Chris; Moore, John; Rachinsky, Maxim; Bainbridge, Daniel; Wiles, Andrew D.; Peters, Terry M.
2010-02-01
Peripheral nerve block treatments are ubiquitous in hospitals and pain clinics worldwide. State-of-the-art techniques use ultrasound (US) guidance and/or electrical stimulation to verify needle tip location. However, problems such as needle-US beam alignment, poor echogenicity of block needles and US beam thickness can make it difficult for the anesthetist to know the exact needle tip location. Inaccurate therapy delivery raises obvious safety and efficacy issues. We have developed and evaluated a needle guidance system that makes use of a magnetic tracking system (MTS) to provide an augmented reality (AR) guidance platform that accurately localizes the needle tip as well as its projected trajectory. Five anesthetists and five novices performed simulated nerve block deliveries in a polyvinyl alcohol phantom to compare needle guidance under US alone with US placed in our AR environment. Our phantom study demonstrated a decrease in targeting attempts, a decrease in contact with critical structures, and an improvement in accuracy to 0.68 mm compared to 1.34 mm RMS with US guidance alone. Currently, the MTS uses 18 and 21 gauge hypodermic needles with a 5 degree-of-freedom sensor located at the needle tip. These needles can only be sterilized using an ethylene oxide process. In the interest of providing clinicians with a simple and efficient guidance system, we also evaluated attaching the sensor at the needle hub as a simple clip-on device. To do this, we simultaneously performed a needle bending study to assess the reliability of a hub-based sensor.
ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy
NASA Astrophysics Data System (ADS)
Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.
2015-02-01
The evolution of technologies and the development of new tools for educational purposes are growing rapidly. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on CT and MRI images, dissections and drawings. For ARBOOK evaluation, a specific questionnaire of three blocks was prepared and validated according to the Delphi method. The questionnaire included motivation and attention tasks, autonomous work and three-dimensional interpretation tasks. A total of 211 students from 7 public and private Spanish universities were divided into two groups. The control group received standard teaching sessions supported by books and video. The ARBOOK group received the same standard sessions but additionally used the ARBOOK tool. At the end of the training, a written test on lower limb anatomy was completed by the students. Statistically significantly better scores for the ARBOOK group were found on attention-motivation, autonomous work and three-dimensional comprehension tasks. Additionally, significantly better scores were obtained by the ARBOOK group in the written test. The results strongly suggest that the use of AR is suitable for anatomical teaching. Concretely, the results indicate how this technology is helpful for student motivation, autonomous work and spatial interpretation. The use of this type of technology must be taken into account even more at the present moment, when new technologies are being naturally incorporated into our daily lives.
Takano, Kouji; Hata, Naoki; Kansaku, Kenji
2011-01-01
The brain–machine interface (BMI) or brain–computer interface is a new interface technology that uses neurophysiological signals from the brain to control external machines or computers. This technology is expected to support daily activities, especially for persons with disabilities. To expand the range of activities enabled by this type of interface, here, we added augmented reality (AR) to a P300-based BMI. In this new system, we used a see-through head-mount display (HMD) to create control panels with flicker visual stimuli to support the user in areas close to controllable devices. When the attached camera detects an AR marker, the position and orientation of the marker are calculated, and the control panel for the pre-assigned appliance is created by the AR system and superimposed on the HMD. The participants were required to control system-compatible devices, and they successfully operated them without significant training. Online performance with the HMD was not different from that using an LCD monitor. Posterior and lateral (right or left) channel selections contributed to operation of the AR–BMI with both the HMD and LCD monitor. Our results indicate that AR–BMI systems operated with a see-through HMD may be useful in building advanced intelligent environments. PMID:21541307
NASA Astrophysics Data System (ADS)
Halik, Łukasz
2012-11-01
The objective of the present deliberations was to systematise our knowledge of the static visual variables used to create cartographic symbols, and also to analyse the possibility of their utilisation in Augmented Reality (AR) applications on smartphone-type mobile devices. This was accomplished by combining the visual variables listed over the years by different researchers. The research approach was to determine the level of usefulness of particular characteristics of visual variables, namely selectivity, associativity, quantity and order. An attempt was made to provide an overview of static visual variables and to describe the AR system, which is a new paradigm of the user interface. The change in the approach to presenting point objects stems from the different perspective used for observing objects (egocentric view) compared with traditional analogue maps (geocentric view). The topics presented refer to the fast-developing field of mobile cartography. Particular emphasis is put on smartphone-type mobile devices and their applicability in the process of designing cartographic symbols. [Polish summary, translated:] The aim of the article was to systematise knowledge of the static visual variables that are the key components of cartographic symbols. An attempt was made to compile the visual variables identified by cartographers over the last fifty years, starting from the classification presented by J. Bertin. The degree of usefulness of the individual graphic variables was analysed with respect to their use in designing point symbols for mobile applications built with Augmented Reality technology. The variables were analysed in terms of four characteristics: selectivity, associativity, representation of quantity, and order. The article highlights the different use of perspective between traditional analogue maps (geocentric) and applications built with Augmented Reality technology (egocentric). The content presented concerns a rapidly developing branch of cartography, namely mobile cartography. Additional emphasis was placed on attempting to implement the principles of designing point cartographic symbols on smartphone-type mobile devices.
Invisible waves and hidden realms: augmented reality and experimental art
NASA Astrophysics Data System (ADS)
Ruzanka, Silvia
2012-03-01
Augmented reality is a way of both altering the visible and revealing the invisible. It offers new opportunities for artistic exploration through virtual interventions in real space. In this paper, the author describes the implementation of two art installations using different AR technologies: one using optical marker tracking on mobile devices and one integrating stereoscopic projections into the physical environment. The first artwork, De Ondas y Abejas (The Waves and the Bees), is based on the widely publicized (but unproven) hypothesis of a link between cellphone radiation and the phenomenon of bee colony collapse disorder. Using an Android tablet, viewers search out small fiducial markers in the shape of electromagnetic waves hidden throughout the gallery, which reveal swarms of bees scattered on the floor. The piece also creates a generative soundscape based on electromagnetic fields. The second artwork, Urban Fauna, is a series of animations in which features of the urban landscape become plants and animals. Surveillance cameras become flocks of birds, while miniature cellphone towers, lampposts, and telephone poles grow like small seedlings in time-lapse animation. The animations are presented as small stereoscopic projections integrated into the physical space of the gallery. These two pieces explore the relationship between nature and technology through the visualization of invisible forces and hidden alternate realities.
The development of AR book for computer learning
NASA Astrophysics Data System (ADS)
Phadung, Muneeroh; Wani, Najela; Tongmnee, Nur-aiynee
2017-08-01
Educators need to provide alternative educational tools to foster students' learning outcomes. By using AR technology to create exciting edutainment experiences, this paper presents how augmented reality (AR) can be applied in education. This study aims to develop an AR book for tenth grade students (age 15-16) and evaluate its quality. The AR book was developed following the ADDIE framework to provide computer learning on software knowledge. The content accorded with the current Thai education curriculum. The AR book had 10 pages covering three topics (the first was "Introduction," the second was "System Software" and the third was "Application Software"). Each page contained markers that placed virtual objects (2D animation and video clips). The obtained data were analyzed in terms of mean and standard deviation. The validity of the multimedia design of the AR book was assessed by three experts in multimedia design. A five-point Likert scale was used, and the values were x̄ = 4.84, S.D. = 1.27, which corresponds to a very high level. Moreover, three content experts, who specialize in computer teaching, evaluated the AR book's validity; the values determined by the experts were x̄ = 4.69, S.D. = 0.29, which also corresponds to a very high level. Implications for future study and education are discussed.
Augmented Reality Implementation in Watch Catalog as e-Marketing Based on Mobile Application
NASA Astrophysics Data System (ADS)
Adrianto, D.; Luwinda, F. A.; Yesmaya, V.
2017-01-01
Augmented Reality is one of the important methods for providing users with a better interactive user interface. In this research, Augmented Reality in a mobile application is applied to provide users with useful information related to a watch catalogue. The research focuses on the design and implementation of an application using Augmented Reality. The process model used in this research is Extreme Programming, which has several steps: planning, design, coding, and testing. The result of this research is an Augmented Reality application based on Android. The research concludes that implementing Android-based Augmented Reality in the watch catalogue helps customers collect useful information about a specific watch.
The Effect of an Augmented Reality Enhanced Mathematics Lesson on Student Achievement and Motivation
ERIC Educational Resources Information Center
Estapa, Anne; Nadolny, Larysa
2015-01-01
The purpose of the study was to assess student achievement and motivation during a high school augmented reality mathematics activity focused on dimensional analysis. Included in this article is a review of the literature on the use of augmented reality in mathematics and the combination of print with augmented reality, also known as interactive…
An Analysis of Engagement in a Combination Indoor/Outdoor Augmented Reality Educational Game
ERIC Educational Resources Information Center
Folkestad, James; O'Shea, Patrick
2011-01-01
This paper describes the results of a qualitative analysis of video captured during a dual indoor/outdoor Augmented Reality experience. Augmented Reality is the layering of virtual information on top of the physical world. This Augmented Reality experience asked students to interact with the San Diego Museum of Art and the Botanical Gardens in San…
Evaluating Augmented Reality to Complete a Chain Task for Elementary Students with Autism
ERIC Educational Resources Information Center
Cihak, David F.; Moore, Eric J.; Wright, Rachel E.; McMahon, Don D.; Gibbons, Melinda M.; Smith, Cate
2016-01-01
The purpose of this study was to examine the effects of augmented reality to teach a chain task to three elementary-age students with autism spectrum disorders (ASDs). Augmented reality blends digital information within the real world. This study used a marker-based augmented reality picture prompt to trigger a video model clip of a student…
Berryman, Donna R
2012-01-01
Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital or computer created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries.
Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A
2016-11-01
To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinarian students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined by hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessments in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion of each simulator were recorded. None of the motion metrics in a virtual reality simulator showed correlation with experience, or to the basic laparoscopic skills score. All metrics in augmented reality were significantly correlated with experience (time, instrument path, and economy of movement), except for the hand dominance metric. The basic laparoscopic skills score was correlated to all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct validity and concurrent validity for motion analysis metrics for an augmented reality system, whereas a virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
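Concurrent validity of the kind reported above is typically quantified by correlating each motion metric with prior experience (for example, the number of MIS procedures on record). A minimal, hedged sketch of that check follows; the variable names and the choice of Spearman's rank correlation are illustrative, since the abstract does not state which correlation statistic was used.

```python
# Illustrative concurrent-validity check (statistic choice is an assumption).
from scipy.stats import spearmanr

def metric_vs_experience(metric_values, n_procedures):
    """Rank correlation between a simulator motion metric and MIS experience."""
    rho, p_value = spearmanr(metric_values, n_procedures)
    return rho, p_value

# Example with made-up numbers: instrument path length for six participants
# versus the number of MIS procedures each has performed.
# rho, p = metric_vs_experience([812, 640, 955, 410, 388, 702], [2, 10, 0, 35, 40, 5])
```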
Functional Reflective Polarizer for Augmented Reality and Color Vision Deficiency
2016-03-03
Zhu, Ruidong; Tan, Guanjun; Yuan, Jiamin; Wu, Shin-Tson
…polarizer that can be incorporated into a compact augmented reality system. The design principle of the functional reflective polarizer is explained and…augmented reality system is relatively high as compared to a polarizing beam splitter or a conventional reflective polarizer. Such a functional reflective…
NASA Technical Reports Server (NTRS)
Gutensohn, Michael
2018-01-01
The task for this project was to design, develop, test, and deploy a facial recognition system for the Kennedy Space Center Augmented/Virtual Reality Lab. This system will serve as a means of user authentication as part of the NUI of the lab. The overarching goal is to create a seamless user interface that will allow the user to initiate and interact with AR and VR experiences without ever needing to use a mouse or keyboard at any step in the process.
Towards the Enhancement of "MINOR" Archaeological Heritage
NASA Astrophysics Data System (ADS)
Morandi, S.; Tremari, M.; Mandelli, A.
2017-02-01
The research is an analysis of the recording, reconstruction and visualisation of the 3D data of an eighteenth-century watermill, identified in an emergency archaeological excavation during the construction of a mini-hydroelectric plant on the bank of the Adda river in the municipality of Pizzighettone (Cremona, Lombardy, Italy). The work examines the use and potential of modern digital 3D modelling techniques applied to archaeological heritage, aimed at enhancing research, maintenance and presentation with interactive products. The use of three-dimensional models managed through AR (Augmented Reality) and VR (Virtual Reality) technologies on mobile devices offers several opportunities for study and communication. It also improves on-site exploration of the landscape, enhancing "minor" archaeological sites, which are daily subjected to numerous emergency works, and facilitating the understanding of heritage sites.
Quality metric for spherical panoramic video
NASA Astrophysics Data System (ADS)
Zakharchenko, Vladyslav; Choi, Kwang Pyo; Park, Jeong Hoon
2016-09-01
Virtual reality (VR) and augmented reality (AR) applications allow users to view artificial content of a surrounding space, simulating a presence effect with the help of special applications or devices. Synthetic content production is a well-known process from the computer graphics domain, and its pipeline is already established in the industry. However, emerging multimedia formats for immersive entertainment applications, such as free-viewpoint television (FTV) or spherical panoramic video, require different approaches to content management and quality assessment. The international standardization of FTV has been promoted by MPEG. This paper is dedicated to a discussion of the immersive media distribution format and the quality estimation process. The accuracy and reliability of the proposed objective quality estimation method have been verified with spherical panoramic images, demonstrating good correlation with subjective quality estimation performed by a group of experts.
Augmented Reality Imaging System: 3D Viewing of a Breast Cancer.
Douglas, David B; Boone, John M; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene
2016-01-01
To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. A case of breast cancer imaged using contrast-enhanced breast CT (Computed Tomography) was viewed with the augmented reality imaging system, which uses a head display unit (HDU) and a joystick control interface. The augmented reality system demonstrated 3D viewing of the breast mass with head-position tracking, stereoscopic depth perception, focal point convergence and a 3D cursor, and the joystick enabled a fly-through with visualization of the spiculations extending from the breast cancer. The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be further researched to determine its utility in clinical practice.
Confronting an Augmented Reality
ERIC Educational Resources Information Center
Munnerley, Danny; Bacon, Matt; Wilson, Anna; Steele, James; Hedberg, John; Fitzgerald, Robert
2012-01-01
How can educators make use of augmented reality technologies and practices to enhance learning, and why would we want to embrace such technologies anyway? How can an augmented reality help a learner confront, interpret and ultimately comprehend reality itself? In this article, we seek to initiate a discussion that focuses on these questions, and…
The status of augmented reality in laparoscopic surgery as of 2016.
Bernhardt, Sylvain; Nicolau, Stéphane A; Soler, Luc; Doignon, Christophe
2017-04-01
This article establishes a comprehensive review of all the different methods proposed by the literature concerning augmented reality in intra-abdominal minimally invasive surgery (also known as laparoscopic surgery). A solid background of surgical augmented reality is first provided in order to support the survey. Then, the various methods of laparoscopic augmented reality as well as their key tasks are categorized in order to better grasp the current landscape of the field. Finally, the various issues gathered from these reviewed approaches are organized in order to outline the remaining challenges of augmented reality in laparoscopic surgery. Copyright © 2017 Elsevier B.V. All rights reserved.
Recent Development of Augmented Reality in Surgery: A Review
Vávra, P.; Zonča, P.; Ihnát, P.; El-Gendi, A.
2017-01-01
Introduction: The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency, safety, and cost, and to enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. Methods: We performed a review of available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms “augmented reality” and “surgery.” Results: The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. After reading the abstracts, 91 relevant studies were chosen for inclusion, and 11 further references were gathered by cross-referencing. A total of 102 studies were included in this review. Conclusions: The present literature suggests an increasing interest of surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality is implemented into routine practice. PMID:29065604
Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room.
Tepper, Oren M; Rudy, Hayeem L; Lefkowitz, Aaron; Weimer, Katie A; Marks, Shelby M; Stern, Carrie S; Garfein, Evan S
2017-11-01
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented reality features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons with intraoperative, hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
Kenngott, Hannes G; Wagner, Martin; Gondan, Matthias; Nickel, Felix; Nolden, Marco; Fetzer, Andreas; Weitz, Jürgen; Fischer, Lars; Speidel, Stefanie; Meinzer, Hans-Peter; Böckler, Dittmar; Büchler, Markus W; Müller-Stich, Beat P
2014-03-01
Laparoscopic liver surgery is particularly challenging owing to restricted access, risk of bleeding, and lack of haptic feedback. Navigation systems have the potential to improve information on the exact position of intrahepatic tumors, and thus facilitate oncological resection. This study aims to evaluate the feasibility of a commercially available augmented reality (AR) guidance system employing intraoperative robotic C-arm cone-beam computed tomography (CBCT) for laparoscopic liver surgery. A human liver-like phantom with 16 target fiducials was used to evaluate the Syngo iPilot(®) AR system. Subsequently, the system was used for the laparoscopic resection of a hepatocellular carcinoma in segment 7 of a 50-year-old male patient. In the phantom experiment, the AR system showed a mean target registration error of 0.96 ± 0.52 mm, with a maximum error of 2.49 mm. The patient successfully underwent the operation and showed no postoperative complications. The use of intraoperative CBCT and AR for laparoscopic liver resection is feasible and could be considered an option for future liver surgery in complex cases.
Potential perils of peri-Pokémon perambulation: the dark reality of augmented reality?
Joseph, Bellal; Armstrong, David G.
2016-01-01
Recently, the layering of augmented reality information on top of smartphone applications has created unprecedented user engagement and popularity. One augmented reality-based entertainment application, Pokémon Go (Pokémon Company, Tokyo, Japan) has become the most rapidly downloaded in history. This technology holds tremendous promise to promote ambulatory activity. However, there exists the obvious potential for distraction-related morbidity. We report two cases, presenting simultaneously to our trauma center, with injuries sustained secondary to gameplay with this augmented reality-based application. PMID:27713831
Augmented Reality as a Countermeasure for Sleep Deprivation.
Baumeister, James; Dorrlan, Jillian; Banks, Siobhan; Chatburn, Alex; Smith, Ross T; Carskadon, Mary A; Lushington, Kurt; Thomas, Bruce H
2016-04-01
Sleep deprivation is known to have serious deleterious effects on executive functioning and job performance. Augmented reality can place pertinent information at the fore, guiding visual focus and reducing instructional complexity. This paper presents a study exploring how spatial augmented reality instructions affect procedural task performance in sleep-deprived users. The user study examined performance on a procedural task at six time points over the course of a night of total sleep deprivation. Task instructions were provided either by spatial augmented reality-based projections or on an adjacent monitor. The results indicate that participant errors increased significantly in the monitor condition when sleep deprived. The augmented reality condition exhibited a positive influence, with participant errors and completion time showing no significant increase when sleep deprived. The results of our study show that spatial augmented reality is an effective sleep deprivation countermeasure under laboratory conditions.
D3D augmented reality imaging system: proof of concept in mammography.
Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene
2016-01-01
The purpose of this article is to present images of simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called "depth 3-dimensional (D3D) augmented reality". A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data were used and no statistical analysis was performed. The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, a 3D cursor, and joystick fly-through. The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice.
Applying Augmented Reality in practical classes for engineering students
NASA Astrophysics Data System (ADS)
Bazarov, S. E.; Kholodilin, I. Yu; Nesterov, A. S.; Sokhina, A. V.
2017-10-01
This article introduces an Augmented Reality application for teaching engineering students in electrical and technological specialties. To increase students' motivation for learning and their independence, new practical guidelines on Augmented Reality were developed for use in practical classes. During application development, the authors used software such as Unity 3D and Vuforia. The Augmented Reality content consists of 3D models, images and animations, which are superimposed on real objects, helping students to study specific tasks. A user with a smartphone, a tablet PC, or Augmented Reality glasses can visualize on-screen virtual objects added to the real environment. To analyze the current situation in higher education, including learners' interest in studying, their satisfaction with the educational process, and the impact of the Augmented Reality application on students, a questionnaire was developed and offered to students; the study involved 24 learners.
ERIC Educational Resources Information Center
Martín Gutiérrez, Jorge; Meneses Fernández, María Dolores
2014-01-01
This paper explores educational and professional uses of augmented learning environment concerned with issues of training and entertainment. We analyze the state-of-art research of some scenarios based on augmented reality. Some examples for the purpose of education and simulation are described. These applications show that augmented reality can…
NASA Astrophysics Data System (ADS)
Kilgus, T.; Franz, A. M.; Seitel, A.; Marz, K.; Bartha, L.; Fangerau, M.; Mersmann, S.; Groch, A.; Meinzer, H.-P.; Maier-Hein, L.
2012-02-01
Visualization of anatomical data for disease diagnosis, surgical planning, or orientation during interventional therapy is an integral part of modern health care. However, as anatomical information is typically shown on monitors provided by a radiological work station, the physician has to mentally transfer internal structures shown on the screen to the patient. To address this issue, we recently presented a new approach to on-patient visualization of 3D medical images, which combines the concept of augmented reality (AR) with an intuitive interaction scheme. Our method requires mounting a range imaging device, such as a Time-of-Flight (ToF) camera, to a portable display (e.g. a tablet PC). During the visualization process, the pose of the camera and thus the viewing direction of the user is continuously determined with a surface matching algorithm. By moving the device along the body of the patient, the physician is given the impression of looking directly into the human body. In this paper, we present and evaluate a new method for camera pose estimation based on an anisotropic trimmed variant of the well-known iterative closest point (ICP) algorithm. According to in-silico and in-vivo experiments performed with computed tomography (CT) and ToF data of human faces, knees and abdomens, our new method is better suited for surface registration with ToF data than the established trimmed variant of the ICP, reducing the target registration error (TRE) by more than 60%. The TRE obtained (approx. 4-5 mm) is promising for AR visualization, but clinical applications require maximization of robustness and run-time.
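The abstract above describes an anisotropic trimmed variant of ICP for ToF-based camera pose estimation. As a rough illustration of the underlying idea only, the sketch below implements a plain (isotropic) trimmed ICP loop in Python with NumPy and SciPy; the authors' anisotropic noise weighting is deliberately omitted and the function name is hypothetical.

```python
# Minimal sketch of a trimmed ICP loop (isotropic), for illustration only.
# The anisotropic trimmed variant described above additionally models the
# direction-dependent noise of ToF cameras; that weighting is omitted here.
import numpy as np
from scipy.spatial import cKDTree

def trimmed_icp(source, target, iterations=30, keep_fraction=0.8):
    """Align 'source' (Nx3) to 'target' (Mx3); returns rotation R and translation t."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    for _ in range(iterations):
        moved = source @ R.T + t
        dists, idx = tree.query(moved)
        # Trim: keep only the closest fraction of correspondences (robust to outliers).
        keep = np.argsort(dists)[: int(keep_fraction * len(dists))]
        src_k, tgt_k = moved[keep], target[idx[keep]]
        # Closed-form rigid alignment of the kept pairs (Kabsch / SVD).
        src_c, tgt_c = src_k - src_k.mean(0), tgt_k - tgt_k.mean(0)
        U, _, Vt = np.linalg.svd(src_c.T @ tgt_c)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T
        t_step = tgt_k.mean(0) - R_step @ src_k.mean(0)
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```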
NASA Astrophysics Data System (ADS)
Kim, Seoksoo; Jung, Sungmo; Song, Jae-Gu; Kang, Byong-Ho
As augmented reality and gravity sensors attract growing interest, significant development is being made in related technology, allowing its application in a variety of areas with greater expectations. Applying context-awareness to augmented reality can produce useful programs. The training system suggested in this study helps a user understand an efficient training method using augmented reality and verify whether an exercise is being performed properly, based on the data collected by a gravity sensor. Therefore, this research aims to suggest an efficient training environment that enhances previous training methods by applying augmented reality and a gravity sensor.
Architecture and Key Techniques of Augmented Reality Maintenance Guiding System for Civil Aircrafts
NASA Astrophysics Data System (ADS)
Hong, Zhou; Wenhua, Lu
2017-01-01
Augmented reality technology is introduced into the maintenance field to enrich real-world scenarios with information, through the integration of virtual assistant maintenance information with the real scene. This can lower the difficulty of maintenance, reduce maintenance errors, and improve the maintenance efficiency and quality of civil aviation crews. An architecture for an augmented reality virtual maintenance guiding system is proposed, based on a definition of augmented reality and an analysis of the characteristics of augmented reality virtual maintenance. The key techniques involved, such as standardization and organization of maintenance data, 3D registration, modeling of maintenance guidance information, and virtual maintenance man-machine interaction, are elaborated, and solutions are given.
Augmented reality in neurosurgery.
Tagaytayan, Raniel; Kelemen, Arpad; Sik-Lanyi, Cecilia
2018-04-01
Neurosurgery is a medical specialty that relies heavily on imaging. The use of computed tomography and magnetic resonance images during preoperative planning and intraoperative surgical navigation is vital to the success of the surgery and positive patient outcome. Augmented reality application in neurosurgery has the potential to revolutionize and change the way neurosurgeons plan and perform surgical procedures in the future. Augmented reality technology is currently commercially available for neurosurgery for simulation and training. However, the use of augmented reality in the clinical setting is still in its infancy. Researchers are now testing augmented reality system prototypes to determine and address the barriers and limitations of the technology before it can be widely accepted and used in the clinical setting.
Augmented reality in dentistry: a current perspective.
Kwon, Ho-Beom; Park, Young-Seok; Han, Jung-Suk
2018-02-21
Augmented reality technology offers virtual information in addition to that of the real environment and thus opens new possibilities in various fields. The medical applications of augmented reality are generally concentrated on surgical specialties, including neurosurgery, laparoscopic surgery and plastic surgery. Augmented reality technology is also widely used in medical education and training. In dentistry, oral and maxillofacial surgery is the primary area of use, where dental implant placement and orthognathic surgery are the most frequent applications. Recent technological advancements are enabling new applications in restorative dentistry, orthodontics and endodontics. This review briefly summarizes the history, definitions, features, and components of augmented reality technology and discusses its applications and future perspectives in dentistry.
Reality Check: Basics of Augmented, Virtual, and Mixed Reality.
Brigham, Tara J
2017-01-01
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
Ortiz-Catalan, Max; Sander, Nichlas; Kristoffersen, Morten B.; Håkansson, Bo; Brånemark, Rickard
2014-01-01
A variety of treatments have been historically used to alleviate phantom limb pain (PLP) with varying efficacy. Recently, virtual reality (VR) has been employed as a more sophisticated mirror therapy. Despite the advantages of VR over a conventional mirror, this approach has retained the use of the contralateral limb and is therefore restricted to unilateral amputees. Moreover, this strategy disregards the actual effort made by the patient to produce phantom motions. In this work, we investigate a treatment in which the virtual limb responds directly to myoelectric activity at the stump, while the illusion of a restored limb is enhanced through augmented reality (AR). Further, phantom motions are facilitated and encouraged through gaming. The proposed set of technologies was administered to a chronic PLP patient who has shown resistance to a variety of treatments (including mirror therapy) for 48 years. Individual and simultaneous phantom movements were predicted using myoelectric pattern recognition and were then used as input for VR and AR environments, as well as for a racing game. The sustained level of pain reported by the patient was gradually reduced to complete pain-free periods. The phantom posture initially reported as a strongly closed fist was gradually relaxed, interestingly resembling the neutral posture displayed by the virtual limb. The patient acquired the ability to freely move his phantom limb, and a telescopic effect was observed where the position of the phantom hand was restored to the anatomically correct distance. More importantly, the effect of the interventions was positively and noticeably perceived by the patient and his relatives. Despite the limitation of a single case study, the successful results of the proposed system in a patient for whom other medical and non-medical treatments have been ineffective justifies and motivates further investigation in a wider study. PMID:24616655
SmartG: Spontaneous Malaysian Augmented Reality Tourist Guide
NASA Astrophysics Data System (ADS)
Kasinathan, Vinothini; Mustapha, Aida; Subramaniam, Tanabalan
2016-11-01
In an effort to attract higher tourist expenditure along with higher tourist arrivals, this paper proposes a travel application called SmartG, an acronym for Spontaneous Malaysian Augmented Reality Tourist Guide, which makes recommendations to the user based on the travel objective and individual budget constraints. The application relies on augmented reality technology, whereby a three-dimensional model is presented to the user based on input from the real-world environment. User testing returned favorable feedback on the concept of using augmented reality to promote Malaysian tourism.
Leveraging Emerging Technologies in Outreach for JWST
NASA Astrophysics Data System (ADS)
Meinke, Bonnie K.; Green, Joel D.; Smith, Louis Chad; Smith, Denise A.; Lawton, Brandon L.; Gough, Michael
2017-10-01
The James Webb Space Telescope (JWST) is NASA’s next great observatory, launching in October 2018. How will we maintain the prestige and cultural impact of the Hubble Space Telescope as the torch passes to Webb? Emerging technologies such as augmented (AR) and virtual reality (VR) bring the viewer into the data and introduce the telescope in previously unimaginable immersive detail. Adoption of mobile devices, many of which easily support AR and VR, has expanded access to information for wide swaths of the public. From software like Worldwide Telescope to hardware like the HTC Vive, immersive environments are providing new avenues for learning. If we develop materials properly tailored to these media, we can reach more diverse audiences than ever before. STScI is piloting tools related to JWST to showcase at DPS, and in local events, which I highlight here.
NASA Astrophysics Data System (ADS)
Kress, Bernard C.
2015-09-01
Three years ago, industry and consumers learned that there was more to Head Mounted Displays (HMDs) than the long-standing but steady market for defense or the market for gadget video player headsets: the first versions of Smart Glasses were introduced to the public. Since then, most major consumer electronics companies have unveiled their own versions of Connected Glasses, Smart Glasses or Smart Eyewear, AR (Augmented Reality) and VR (Virtual Reality) headsets. This rush has resulted in the build-up of a formidable zoo of optical technologies, each claiming to be best suited for the task at hand. Today, the question is not so much "will the Smart Glass market happen?" but rather "which optical technologies will be best fitted for the various declinations of the existing wearable display market," one of the main declinations being the Smart Glasses market.
Evaluating the iterative development of VR/AR human factors tools for manual work.
Liston, Paul M; Kay, Alison; Cromie, Sam; Leva, Chiara; D'Cruz, Mirabelle; Patel, Harshada; Langley, Alyson; Sharples, Sarah; Aromaa, Susanna
2012-01-01
This paper outlines the approach taken to iteratively evaluate a set of VR/AR (virtual reality / augmented reality) applications for five different manual-work domains - terrestrial spacecraft assembly, assembly-line design, remote maintenance of trains, maintenance of nuclear reactors, and large-machine assembly process design - and examines the evaluation data for evidence of the effectiveness of the evaluation framework as well as the benefits to the development process of feedback from iterative evaluation. ManuVAR is an EU-funded research project working to develop an innovative technology platform and a framework to support high-value, high-knowledge manual work throughout the product lifecycle. The results of this study demonstrate the iterative improvements achieved throughout the design cycles, observable through the trends in the quantitative results from three successive trials of the applications and the investigation of the qualitative interview findings. The paper discusses the limitations of evaluation in complex, multi-disciplinary development projects and finds evidence of the effectiveness of the particular set of complementary evaluation methods, incorporating a common inquiry structure, used for the evaluation - particularly in facilitating triangulation of the data.
Applied Augmented Reality for High Precision Maintenance
NASA Astrophysics Data System (ADS)
Dever, Clark
Augmented Reality had a major consumer breakthrough this year with Pokemon Go. The underlying technologies that made that app a success with gamers can be applied to improve the efficiency and efficacy of workers. This session will explore some of the use cases for augmented reality in an industrial environment. In doing so, the environmental impacts and human factors that must be considered will be explored. Additionally, the sensors, algorithms, and visualization techniques used to realize augmented reality will be discussed. The benefits of augmented reality solutions in industrial environments include automated data recording, improved quality assurance, reduction in training costs and improved mean-time-to-resolution. As technology continues to follow Moore's law, more applications will become feasible as performance-per-dollar increases across all system components.
[Display technologies for augmented reality in medical applications].
Eck, Ulrich; Winkler, Alexander
2018-04-01
One of the main challenges for modern surgery is the effective use of the many available imaging modalities and diagnostic methods. Augmented reality systems can be used in the future to blend patient and planning information into the view of surgeons, which can improve the efficiency and safety of interventions. In this article we present five visualization methods to integrate augmented reality displays into medical procedures and the advantages and disadvantages are explained. Based on an extensive literature review the various existing approaches for integration of augmented reality displays into medical procedures are divided into five categories and the most important research results for each approach are presented. A large number of mixed and augmented reality solutions for medical interventions have been developed as research prototypes; however, only very few systems have been tested on patients. In order to integrate mixed and augmented reality displays into medical practice, highly specialized solutions need to be developed. Such systems must comply with the requirements with respect to accuracy, fidelity, ergonomics and seamless integration into the surgical workflow.
Augmented Visual Experience of Simulated Solar Phenomena
NASA Astrophysics Data System (ADS)
Tucker, A. O., IV; Berardino, R. A.; Hahne, D.; Schreurs, B.; Fox, N. J.; Raouafi, N.
2017-12-01
The Parker Solar Probe (PSP) mission will explore the Sun's corona, studying solar wind, flares and coronal mass ejections. The effects of these phenomena can impact the technology that we use in ways that are not readily apparent, including affecting satellite communications and power grids. Determining the structure and dynamics of coronal magnetic fields, tracing the flow of energy that heats the corona, and exploring dusty plasma near the Sun to understand its influence on solar wind and energetic particle formation requires a suite of sensors on board the PSP spacecraft that are engineered to observe specific phenomena. Using models of these sensors and simulated observational data, we can visualize what the PSP spacecraft will "see" during its multiple passes around the Sun. Augmented reality (AR) technologies enable convenient user access to massive data sets. We are developing an application that allows users to experience environmental data from the point of view of the PSP spacecraft in AR using the Microsoft HoloLens. Observational data, including imagery, magnetism, temperature, and density are visualized in 4D within the user's immediate environment. Our application provides an educational tool for comprehending the complex relationships of observational data, which aids in our understanding of the Sun.
Camera pose estimation for augmented reality in a small indoor dynamic scene
NASA Astrophysics Data System (ADS)
Frikha, Rawia; Ejbali, Ridha; Zaied, Mourad
2017-09-01
Camera pose estimation remains a challenging task for augmented reality (AR) applications. Simultaneous localization and mapping (SLAM)-based methods are able to estimate the six-degree-of-freedom camera motion while constructing a map of an unknown environment. However, these methods do not provide any reference for where to insert virtual objects, since they have no information about scene structure, and they may fail in cases of occlusion of three-dimensional (3-D) map points or dynamic objects. This paper presents a real-time monocular piecewise-planar SLAM method based on the planar scene assumption. Using planar structures in the mapping process allows virtual objects to be rendered in a meaningful way on the one hand, and improves the precision of the camera pose and the quality of the 3-D reconstruction of the environment by adding constraints on 3-D points and poses in the optimization process on the other. We propose to exploit the rigid motion of 3-D planes in the tracking process to enhance the system's robustness in the case of dynamic scenes. Experimental results show that using a constrained planar scene improves our system's accuracy and robustness compared with classical SLAM systems.
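The abstract does not detail the pose computation itself. As a minimal, hedged illustration of pose estimation from a planar scene (not the authors' SLAM pipeline), the sketch below uses OpenCV's homography estimation and decomposition; the camera intrinsics K, the matched points, and the function name are assumptions for the example.

```python
# Illustrative sketch (not the authors' SLAM system): recovering camera pose
# hypotheses relative to a planar scene region from point matches, via
# homography estimation and decomposition with known intrinsics K.
import numpy as np
import cv2

def pose_from_planar_matches(pts_ref, pts_cur, K):
    """pts_ref, pts_cur: Nx2 float arrays of matched pixel coordinates on a plane."""
    H, inlier_mask = cv2.findHomography(pts_ref, pts_cur, cv2.RANSAC, 3.0)
    # Decompose H into up to four (R, t, n) hypotheses; a real system disambiguates
    # them with visibility / cheirality checks against reconstructed points.
    num, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    return H, rotations, translations, normals, inlier_mask

if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    pts_ref = (np.random.rand(30, 2) * 400 + 100).astype(np.float32)
    # Simulate a second view of the same plane by applying a known homography.
    H_true = np.array([[1.05, 0.02, 8.0], [-0.01, 0.98, -5.0], [1e-5, 2e-5, 1.0]])
    homog = np.hstack([pts_ref, np.ones((30, 1), dtype=np.float32)]) @ H_true.T
    pts_cur = (homog[:, :2] / homog[:, 2:]).astype(np.float32)
    H, Rs, ts, ns, mask = pose_from_planar_matches(pts_ref, pts_cur, K)
    print("Estimated homography:\n", H, "\nPose hypotheses:", len(Rs))
```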
Large-scale Exploration of Neuronal Morphologies Using Deep Learning and Augmented Reality.
Li, Zhongyu; Butler, Erik; Li, Kang; Lu, Aidong; Ji, Shuiwang; Zhang, Shaoting
2018-02-12
Recently released large-scale neuron morphological data have greatly facilitated research in neuroinformatics. However, the sheer volume and complexity of these data pose significant challenges for efficient and accurate neuron exploration. In this paper, we propose an effective retrieval framework to address these problems, based on frontier techniques of deep learning and binary coding. For the first time, we develop a deep learning based feature representation method for neuron morphological data, where the 3D neurons are first projected into binary images and features are then learned using an unsupervised deep neural network, i.e., stacked convolutional autoencoders (SCAEs). The deep features are subsequently fused with hand-crafted features for a more accurate representation. Because exhaustive search is usually very time-consuming in large-scale databases, we employ a novel binary coding method to compress feature vectors into short binary codes. Our framework is validated on a public data set including 58,000 neurons, showing promising retrieval precision and efficiency compared with state-of-the-art methods. In addition, we develop a novel neuron visualization program based on augmented reality (AR) techniques, which helps users explore neuron morphologies in an interactive and immersive manner.
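As a loose, assumption-laden sketch of the general pipeline (not the paper's exact SCAE architecture, feature fusion, or binary coding scheme), the PyTorch code below learns features from projected 2D images with a small convolutional autoencoder and thresholds them into binary codes suitable for fast Hamming-distance retrieval; all layer sizes and names are illustrative.

```python
# Rough sketch of the general idea (not the paper's exact architecture): learn
# features from projected 2D neuron images with a convolutional autoencoder,
# then threshold them into binary codes for fast Hamming-distance retrieval.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self, code_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, code_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

def to_binary_code(features: torch.Tensor) -> torch.Tensor:
    """Naive binary coding: threshold each feature at its median over the database."""
    thresholds = features.median(dim=0).values
    return (features > thresholds).to(torch.uint8)

if __name__ == "__main__":
    model = ConvAutoencoder()
    images = torch.rand(8, 1, 64, 64)                         # stand-in for projected neuron images
    reconstruction, codes = model(images)
    loss = nn.functional.mse_loss(reconstruction, images)     # unsupervised training objective
    binary = to_binary_code(codes.detach())
    print(reconstruction.shape, binary.shape)
```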
ERIC Educational Resources Information Center
Montoya, Mauricio Hincapié; Díaz, Christian Andrés; Moreno, Gustavo Adolfo
2017-01-01
Nowadays, the use of technology to improve teaching and learning experiences in the classroom has been promoted. One of these technologies is augmented reality, which allows overlaying layers of virtual information on a real scene with the aim of increasing the user's perception of reality. Augmented reality has proved to offer several…
The Local Games Lab ABQ: Homegrown Augmented Reality
ERIC Educational Resources Information Center
Holden, Christopher
2014-01-01
Experiments in the use of augmented reality games formerly required extensive material resources and expertise to implement above and beyond what might be possible within the usual educational contexts. Currently, the more common availability of hardware in these contexts and the existence of easy-to-use, general purpose augmented reality design…
Hybrid Reality Lab Capabilities - Video 2
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2016-01-01
Our Hybrid Reality and Advanced Operations Lab is developing incredibly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video outlined with this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
On Location Learning: Authentic Applied Science with Networked Augmented Realities
ERIC Educational Resources Information Center
Rosenbaum, Eric; Klopfer, Eric; Perry, Judy
2007-01-01
The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is…
Augmenting a Child's Reality: Using Educational Tablet Technology
ERIC Educational Resources Information Center
Tanner, Patricia; Karas, Carly; Schofield, Damian
2014-01-01
This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…
Enhancing Education through Mobile Augmented Reality
ERIC Educational Resources Information Center
Joan, D. R. Robert
2015-01-01
In this article, the author discusses Mobile Augmented Reality and how it can enhance education. The aim of the present study was to give some general information about mobile augmented reality, which helps to boost education. The purpose of the current study reveals the mobile networks which are used on the institution campus as well…
ERIC Educational Resources Information Center
Lan, Chung-Hsien; Chao, Stefan; Kinshuk; Chao, Kuo-Hung
2013-01-01
This study presents a conceptual framework for supporting mobile peer assessment by incorporating augmented reality technology to eliminate the limitations of reviewing and assessing. Based on the characteristics of mobile technology and augmented reality, students' work can be shown in various ways by considering locations and situations. This…
ERIC Educational Resources Information Center
Rattanarungrot, Sasithorn; White, Martin; Newbury, Paul
2014-01-01
This paper describes the design of our service-oriented architecture to support mobile multiple object tracking augmented reality applications applied to education and learning scenarios. The architecture is composed of a mobile multiple object tracking augmented reality client, a web service framework, and dynamic content providers. Tracking of…
Yoo, Ha-Na; Chung, Eunjung; Lee, Byoung-Hee
2013-07-01
[Purpose] The purpose of this study was to determine the effects of augmented reality-based Otago exercise on balance, gait, and falls efficacy of elderly women. [Subjects] The subjects were 21 elderly women, who were randomly divided into two groups: an augmented reality-based Otago exercise group of 10 subjects and an Otago exercise group of 11 subjects. [Methods] All subjects were evaluated for balance (Berg Balance Scale, BBS), gait parameters (velocity, cadence, step length, and stride length), and falls efficacy. Within 12 weeks, Otago exercise for muscle strengthening and balance training was conducted three times, for a period of 60 minutes each, and subjects in the experimental group performed augmented reality-based Otago exercise. [Results] Following intervention, the augmented reality-based Otago exercise group showed significant increases in BBS, velocity, cadence, step length (right side), stride length (right side and left side) and falls efficacy. [Conclusion] The results of this study suggest the feasibility and suitability of this augmented reality-based Otago exercise for elderly women.
AR.Drone: security threat analysis and exemplary attack to track persons
NASA Astrophysics Data System (ADS)
Samland, Fred; Fruth, Jana; Hildebrandt, Mario; Hoppe, Tobias; Dittmann, Jana
2012-01-01
In this article we illustrate an approach to a security threat analysis of the quadrocopter AR.Drone, a toy for augmented reality (AR) games. The technical properties of the drone can be misused for attacks, which may affect security and/or privacy. Our aim is to raise awareness of possible misuses and to motivate the implementation of improved security mechanisms for the quadrocopter. We focus primarily on obvious security vulnerabilities (e.g. communication over unencrypted WLAN, usage of UDP, live video streaming via unencrypted WLAN to the control device) of this quadrocopter. We could practically verify in three exemplary scenarios that these can be misused by unauthorized persons for several attacks: hijacking of the drone, eavesdropping on the AR.Drone's unprotected video streams, and the tracking of persons. Amongst other aspects, our current research focuses on the realization of the attack of tracking persons and objects with the drone. Besides the realization of attacks, we want to evaluate the potential of this particular drone for a "safe-landing" function, as well as potential security enhancements. Additionally, in the future we plan to investigate automatic tracking of persons or objects without the need for human interaction.
3D augmented reality with integral imaging display
NASA Astrophysics Data System (ADS)
Shen, Xin; Hua, Hong; Javidi, Bahram
2016-06-01
In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be converted into an identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for the augmented reality display is generated. The newly generated elemental images contain both the virtual objects and the real-world scene with the desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.
Augmented Reality in Architecture: Rebuilding Archeological Heritage
NASA Astrophysics Data System (ADS)
de la Fuente Prieto, J.; Castaño Perea, E.; Labrador Arroyo, F.
2017-02-01
With the development in recent years of augmented reality and the appearance of new mobile terminals and online storage, we have at our disposal a powerful tool for communicating architecture. This paper analyzes the relationship between Augmented Reality and Architecture. First, it connects the theoretical frameworks of both disciplines through the concept of Representation. Second, it describes the milestones and possibilities of Augmented Reality in the particular field of archaeological reconstruction. Lastly, having reviewed the technology developed, we approach the same analysis from a critical point of view, assessing its suitability to the discipline that concerns us: architecture, and within it, archaeology.
An Augmented Reality magic mirror as additive teaching device for gross anatomy.
Kugelmann, Daniela; Stratmann, Leonard; Nühlen, Nils; Bork, Felix; Hoffmann, Saskia; Samarbarksh, Golbarg; Pferschy, Anna; von der Heide, Anna Maria; Eimannsberger, Andreas; Fallavollita, Pascal; Navab, Nassir; Waschke, Jens
2018-01-01
When preparing young medical students for clinical activity, it is indispensable to acquaint them with anatomical section images which enable them to use the clinical application of imaging methods. A new Augmented Reality Magic Mirror (AR MM) system, which provides the advantage of a novel, interactive learning tool in addition to a regular dissection course, was therefore tested and evaluated by 880 first-year medical students as part of the macroscopic anatomy course in 2015/16 at Ludwig-Maximilians-Universität (LMU) in Munich. The system consists of an RGB-D sensor as a real-time tracking device, which enables the system to link a deposited section image to the projection of the user's body, as well as a large display mimicking a real-world physical mirror. Using gesture input, the users have the ability to interactively explore radiological images in different anatomical intersection planes. We designed a tutorial during which students worked with the system in groups of about 12 and evaluated the results. Subsequently, each participant was asked to assess the system's value by filling out a Likert-scale questionnaire. The respondents approved all statements which stressed the potential of the system to serve as an additional learning resource for anatomical education. In this case, emphasis was put on active learning, 3-dimensional understanding, and a better comprehension of the course of structures. We are convinced that such an AR MM system can be beneficially installed into anatomical education in order to prepare medical students more effectively for the clinical standards and for more interactive, student-centered learning. Copyright © 2017. Published by Elsevier GmbH.
Greco, Francesco; Cadeddu, Jeffrey A; Gill, Inderbir S; Kaouk, Jihad H; Remzi, Mesut; Thompson, R Houston; van Leeuwen, Fijs W B; van der Poel, Henk G; Fornara, Paolo; Rassweiler, Jens
2014-05-01
Molecular imaging (MI) entails the visualisation, characterisation, and measurement of biologic processes at the molecular and cellular levels in humans and other living systems. Translating this technology to interventions in real-time enables interventional MI/image-guided surgery, for example, by providing better detection of tumours and their dimensions. To summarise and critically analyse the available evidence on image-guided surgery for genitourinary (GU) oncologic diseases. A comprehensive literature review was performed using PubMed and the Thomson Reuters Web of Science. In the free-text protocol, the following terms were applied: molecular imaging, genitourinary oncologic surgery, surgical navigation, image-guided surgery, and augmented reality. Review articles, editorials, commentaries, and letters to the editor were included if deemed to contain relevant information. We selected 79 articles according to the search strategy based on the Preferred Reporting Items for Systematic Reviews and Meta-analysis criteria and the IDEAL method. MI techniques included optical imaging and fluorescent techniques, the augmented reality (AR) navigation system, magnetic resonance imaging spectroscopy, positron emission tomography, and single-photon emission computed tomography. Experimental studies on the AR navigation system were restricted to the detection and therapy of adrenal and renal malignancies and in the relatively infrequent cases of prostate cancer, whereas fluorescence techniques and optical imaging presented a wide application of intraoperative GU oncologic surgery. In most cases, image-guided surgery was shown to improve the surgical resectability of tumours. Based on the evidence to date, image-guided surgery has promise in the near future for multiple GU malignancies. Further optimisation of targeted imaging agents, along with the integration of imaging modalities, is necessary to further enhance intraoperative GU oncologic surgery. Copyright © 2013 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Transient Go: A Mobile App for Transient Astronomy Outreach
NASA Astrophysics Data System (ADS)
Crichton, D.; Mahabal, A.; Djorgovski, S. G.; Drake, A.; Early, J.; Ivezic, Z.; Jacoby, S.; Kanbur, S.
2016-12-01
Augmented Reality (AR) is set to revolutionize human interaction with the real world as demonstrated by the phenomenal success of `Pokemon Go'. That very technology can be used to rekindle the interest in science at the school level. We are in the process of developing a prototype app based on sky maps that will use AR to introduce different classes of astronomical transients to students as they are discovered i.e. in real-time. This will involve transient streams from surveys such as the Catalina Real-time Transient Survey (CRTS) today and the Large Synoptic Survey Telescope (LSST) in the near future. The transient streams will be combined with archival and latest image cut-outs and other auxiliary data as well as historical and statistical perspectives on each of the transient types being served. Such an app could easily be adapted to work with various NASA missions and NSF projects to enrich the student experience.
ERIC Educational Resources Information Center
Önal, Nezih; Ibili, Emin; Çaliskan, Erkan
2017-01-01
The purpose of this research is to determine the impact of augmented reality technology and geometry teaching on elementary school mathematics teacher candidates' technology acceptance and to examine participants' views on augmented reality. The sample of the research was composed of 40 elementary school mathematics teacher candidates who were…
Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality
2009-03-01
Machine learning-based augmented reality for improved surgical scene understanding.
Pauly, Olivier; Diotte, Benoit; Fallavollita, Pascal; Weidert, Simon; Euler, Ekkehard; Navab, Nassir
2015-04-01
In orthopedic and trauma surgery, AR technology can support surgeons in the challenging task of understanding the spatial relationships between the anatomy, the implants, and their tools. In this context, we propose a novel augmented visualization of the surgical scene that intelligently mixes the different sources of information provided by a mobile C-arm combined with a Kinect RGB-Depth sensor. We therefore introduce a learning-based paradigm that aims at (1) identifying the relevant objects or anatomy in both Kinect and X-ray data, and (2) creating an object-specific pixel-wise alpha map that permits relevance-based fusion of the video and the X-ray images within one single view. In 12 simulated surgeries, we show very promising results, aiming to provide surgeons with a better understanding of the surgical scene as well as improved depth perception. Copyright © 2014 Elsevier Ltd. All rights reserved.
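As a toy illustration of relevance-based fusion only (the work above learns its object-specific alpha map, which is not reproduced here), the NumPy sketch below composites an X-ray image over a video frame using a hand-made per-pixel alpha map; all names and values are placeholders.

```python
# Toy illustration of relevance-based fusion: composite an X-ray image over a
# video frame with a per-pixel alpha map. In the work above the alpha map is
# learned per object class; here it is just a hand-made placeholder.
import numpy as np

def fuse_views(video_rgb: np.ndarray, xray_gray: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """video_rgb: HxWx3 uint8, xray_gray: HxW uint8, alpha: HxW floats in [0, 1]."""
    xray_rgb = np.repeat(xray_gray[..., None], 3, axis=2).astype(np.float64)
    fused = alpha[..., None] * xray_rgb + (1.0 - alpha[..., None]) * video_rgb.astype(np.float64)
    return np.clip(fused, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    h, w = 240, 320
    video = np.full((h, w, 3), 90, dtype=np.uint8)
    xray = np.zeros((h, w), dtype=np.uint8)
    xray[80:160, 120:200] = 220                      # pretend anatomy of interest
    alpha = (xray > 0).astype(np.float64) * 0.7      # show X-ray only where relevant
    result = fuse_views(video, xray, alpha)
    print(result.shape, result.dtype)
```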
Helios: a tangible and augmented environment to learn optical phenomena in astronomy
NASA Astrophysics Data System (ADS)
Fleck, Stéphanie; Hachet, Martin
2015-10-01
France is among the few countries that have integrated astronomy at the primary school level. However, for fifteen years, many studies have shown that children have difficulties in understanding elementary astronomical phenomena such as the day/night alternation, the seasons, or the evolution of the Moon's phases. To understand these phenomena, learners have to mentally construct 3D representations of celestial body motions and to understand how light propagates from an allocentric point of view. Therefore, children in grades 4-5 (8 to 11 years old), who are still developing their spatial cognition, have great difficulty assimilating the geometric optics problems that underlie astronomy. To make astronomical learning more efficient for young pupils, we have designed an Augmented Inquiry-Based Learning Environment (AIBLE): HELIOS. Because direct manipulations in astronomy are intrinsically impossible, we propose to manipulate the underlying model. With HELIOS, virtual replicas of the Sun, Moon and Earth are directly controlled through tangible manipulations. This digital support combines the possibilities of Augmented Reality (AR) while maintaining intuitive interactions, following the principles of science didactics. Light properties are taken into account, and the shadows of the Earth and Moon are directly produced by an omnidirectional light source associated with the virtual Sun. This AR environment provides users with experiences they would otherwise not be able to have in the physical world. Our main goal is that students can take active control of their learning, express and support their ideas, make predictions and hypotheses, and test them by conducting investigations.
Lee, Byoung-Hee
2016-04-01
[Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials.
Cabrilo, Ivan; Bijlenga, Philippe; Schaller, Karl
2014-09-01
Augmented reality technology has been used for intraoperative image guidance through the overlay of virtual images, from preoperative imaging studies, onto the real-world surgical field. Although setups based on augmented reality have been used for various neurosurgical pathologies, very few cases have been reported for the surgery of arteriovenous malformations (AVM). We present our experience with AVM surgery using a system designed for image injection of virtual images into the operating microscope's eyepiece, and discuss why augmented reality may be less appealing in this form of surgery. N = 5 patients underwent AVM resection assisted by augmented reality. Virtual three-dimensional models of patients' heads, skulls, AVM nidi, and feeder and drainage vessels were selectively segmented and injected into the microscope's eyepiece for intraoperative image guidance, and their usefulness was assessed in each case. Although the setup helped in performing tailored craniotomies, in guiding dissection and in localizing drainage veins, it did not provide the surgeon with useful information concerning feeder arteries, due to the complexity of AVM angioarchitecture. The difficulty in intraoperatively conveying useful information on feeder vessels may make augmented reality a less engaging tool in this form of surgery, and might explain its underrepresentation in the literature. Integrating an AVM's hemodynamic characteristics into the augmented rendering could make it more suited to AVM surgery.
Siebert, Johan N; Ehrler, Frederic; Gervaix, Alain; Haddad, Kevin; Lacroix, Laurence; Schrurs, Philippe; Sahin, Ayhan; Lovis, Christian; Manzano, Sergio
2017-05-29
The American Heart Association (AHA) guidelines for cardiopulmonary resuscitation (CPR) are nowadays recognized as the world's most authoritative resuscitation guidelines. Adherence to these guidelines optimizes the management of critically ill patients and increases their chances of survival after cardiac arrest. Despite their availability, suboptimal quality of CPR is still common. Currently, the median hospital survival rate after pediatric in-hospital cardiac arrest is 36%, whereas it falls below 10% for out-of-hospital cardiac arrest. Among emerging information technologies and devices able to support caregivers during resuscitation and increase adherence to AHA guidelines, augmented reality (AR) glasses have not yet been assessed. In order to assess their potential, we adapted AHA Pediatric Advanced Life Support (PALS) guidelines for AR glasses. The study aimed to determine whether adapting AHA guidelines for AR glasses increased adherence by reducing deviation and time to initiation of critical life-saving maneuvers during pediatric CPR when compared with the use of PALS pocket reference cards. We conducted a randomized controlled trial with two parallel groups of voluntary pediatric residents, comparing AR glasses to PALS pocket reference cards during a simulation-based pediatric cardiac arrest scenario-pulseless ventricular tachycardia (pVT). The primary outcome was the elapsed time in seconds in each allocation group, from onset of pVT to the first defibrillation attempt. Secondary outcomes were time elapsed to (1) initiation of chest compression, (2) subsequent defibrillation attempts, and (3) administration of drugs, as well as the time intervals between defibrillation attempts and drug doses, shock doses, and number of shocks. All these outcomes were assessed for deviation from AHA guidelines. Twenty residents were randomized into 2 groups. Time to first defibrillation attempt (mean: 146 s) and adherence to AHA guidelines in terms of time to other critical resuscitation endpoints and drug dose delivery were not improved using AR glasses. However, errors and deviations were significantly reduced in terms of defibrillation doses when compared with the use of the PALS pocket reference cards. In a total of 40 defibrillation attempts, residents not wearing AR glasses used wrong doses in 65% (26/40) of cases, including 21 shock overdoses >100 J, for a cumulative defibrillation dose of 18.7 Joules per kg. These errors were reduced by 53% (21/40, P<.001) and cumulative defibrillation dose by 37% (5.14/14, P=.001) with AR glasses. AR glasses did not decrease time to first defibrillation attempt and other critical resuscitation endpoints when compared with PALS pocket cards. However, they improved adherence and performance among residents in terms of administering the defibrillation doses set by AHA. ©Johan N Siebert, Frederic Ehrler, Alain Gervaix, Kevin Haddad, Laurence Lacroix, Philippe Schrurs, Ayhan Sahin, Christian Lovis, Sergio Manzano. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.05.2017.
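For context, the "correct" energies against which the shock errors above were judged follow weight-based PALS dosing. The sketch below (Python) encodes that scheme as it is commonly summarized (2 J/kg for the first shock, 4 J/kg for subsequent shocks, capped at 10 J/kg); the helper name and the cap are assumptions for illustration, not taken from the study's software:

def pals_defibrillation_dose_joules(weight_kg, attempt_number):
    # First shock 2 J/kg, later shocks 4 J/kg, not exceeding 10 J/kg.
    dose = 2.0 * weight_kg if attempt_number == 1 else 4.0 * weight_kg
    return min(dose, 10.0 * weight_kg)

# Example: a 20-kg child -> 40 J for the first shock, 80 J thereafter.
print(pals_defibrillation_dose_joules(20.0, 1), pals_defibrillation_dose_joules(20.0, 2))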
Computer-generated holographic near-eye display system based on LCoS phase only modulator
NASA Astrophysics Data System (ADS)
Sun, Peng; Chang, Shengqian; Zhang, Siman; Xie, Ting; Li, Huaye; Liu, Siqi; Wang, Chang; Tao, Xiao; Zheng, Zhenrong
2017-09-01
Augmented reality (AR) technology has been applied in various areas, such as large-scale manufacturing, national defense, healthcare, film, and mass media. An important way to realize AR display is computer-generated holography (CGH), which, however, suffers from low image quality and a heavy computational burden. Meanwhile, diffraction from the Liquid Crystal on Silicon (LCoS) modulator further degrades image quality. In this paper, a modified algorithm based on the traditional Gerchberg-Saxton (GS) algorithm is proposed to improve image quality, and a new method of building the experimental system is used to broaden the field of view (FOV). In the experiment, undesired zero-order diffracted light was eliminated and a high-definition 2D image was acquired with the FOV broadened to 36.1 degrees. We have also done pilot research on 3D reconstruction using a tomography algorithm based on Fresnel diffraction. With the same experimental system, experimental results demonstrate the feasibility of 3D reconstruction. These modifications are effective and efficient, and may provide a better solution for AR realization.
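For readers unfamiliar with CGH synthesis, the classical Gerchberg-Saxton loop that the paper's modified algorithm builds on can be sketched as follows (Python/NumPy; this is the textbook iteration, not the authors' modified version):

import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50):
    # Find a phase-only hologram whose Fourier-plane intensity approximates the target image.
    phase = 2 * np.pi * np.random.rand(*target_amplitude.shape)  # random initial phase
    for _ in range(iterations):
        image_field = np.fft.fft2(np.exp(1j * phase))                          # propagate to image plane
        constrained = target_amplitude * np.exp(1j * np.angle(image_field))    # impose target amplitude
        slm_field = np.fft.ifft2(constrained)                                  # propagate back to the LCoS plane
        phase = np.angle(slm_field)                                            # keep phase only (phase-only modulator)
    return phase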
Augmented reality-assisted skull base surgery.
Cabrilo, I; Sarrafzadeh, A; Bijlenga, P; Landis, B N; Schaller, K
2014-12-01
Neuronavigation is widely considered as a valuable tool during skull base surgery. Advances in neuronavigation technology, with the integration of augmented reality, present advantages over traditional point-based neuronavigation. However, this development has not yet made its way into routine surgical practice, possibly due to a lack of acquaintance with these systems. In this report, we illustrate the usefulness and easy application of augmented reality-based neuronavigation through a case example of a patient with a clivus chordoma. We also demonstrate how augmented reality can help throughout all phases of a skull base procedure, from the verification of neuronavigation accuracy to intraoperative image-guidance. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
CityGuideTour Toruń - tourist application using augmented reality
NASA Astrophysics Data System (ADS)
Węgrzyn, Magdalena; Mościcka, Albina
2017-12-01
The aim of the article is to show the possibilities of augmented reality in the field of geodesy and cartography. It discusses the concept of augmented reality, its origins and development, as well as areas of existing applications. The practical functioning of augmented reality in geodesy and cartography is presented through the example of an application developed for tourists in the city of Toruń, created with the CityGuideTour software. The principles of developing the application and the way it operates are also discussed. As a result, a fully operational bilingual application is available free of charge on the Web.
A see-through holographic head-mounted display with the large viewing angle
NASA Astrophysics Data System (ADS)
Chen, Zhidong; Sang, Xinzhu; Lin, Qiaojun; Li, Jin; Yu, Xunbo; Gao, Xin; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu; Xie, Songlin
2017-02-01
A novel solution for a large-view-angle holographic head-mounted display (HHMD) is presented. Divergent light is used for hologram illumination to construct a large three-dimensional object outside the display at a short distance. A specially designed projection-type lens with a large numerical aperture projects the object reconstructed by the hologram to its real location. The presented solution can realize a compact HHMD system with a large field of view. The basic principle and the structure of the system are described. An augmented reality (AR) prototype with a size of 50 mm × 40 mm and a view angle above 60° is demonstrated.
Handheld pose tracking using vision-inertial sensors with occlusion handling
NASA Astrophysics Data System (ADS)
Li, Juan; Slembrouck, Maarten; Deboeverie, Francis; Bernardos, Ana M.; Besada, Juan A.; Veelaert, Peter; Aghajan, Hamid; Casar, José R.; Philips, Wilfried
2016-07-01
Tracking of a handheld device's three-dimensional (3-D) position and orientation is fundamental to various application domains, including augmented reality (AR), virtual reality, and interaction in smart spaces. Existing systems still offer limited performance in terms of accuracy, robustness, computational cost, and ease of deployment. We present a low-cost, accurate, and robust system for handheld pose tracking using fused vision and inertial data. The integration of measurements from embedded accelerometers reduces the number of unknown parameters in the six-degree-of-freedom pose calculation. The proposed system requires two light-emitting diode (LED) markers to be attached to the device, which are tracked by external cameras using an algorithm robust to illumination changes. Three data fusion methods are proposed: a triangulation-based stereo-vision system, a constraint-based stereo-vision system with occlusion handling, and a triangulation-based multi-vision system. Real-time demonstrations of the proposed system applied to AR and 3-D gaming are also included. The accuracy of the proposed system is assessed by comparison with data generated by the state-of-the-art commercial motion tracking system OptiTrack. Experimental results show that the proposed system achieves accuracy of a few centimeters in position estimation and a few degrees in orientation estimation.
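The triangulation-based methods mentioned above reduce, for each LED marker, to intersecting two viewing rays from calibrated cameras. A minimal linear (DLT) triangulation sketch in Python/NumPy follows; names are illustrative and the inertial fusion step is omitted:

import numpy as np

def triangulate_point(P1, P2, x1, x2):
    # P1, P2: 3x4 projection matrices of the two cameras; x1, x2: 2-D pixel coordinates of the marker.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # least-squares solution of A X = 0
    X = Vt[-1]
    return X[:3] / X[3]           # inhomogeneous 3-D marker position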
Evaluation of Augmented REality Sandtable (ARES) during Sand Table Construction
2018-01-01
ARL-TR-8278, January 2018, US Army Research Laboratory. Evaluation of Augmented REality Sandtable (ARES) during Sand Table Construction, by Kelly S Hale and Jennifer M Riley, Design Interactive. Disclaimer: The findings in this report are not to be construed as an official Department of the Army position unless so designated…
González, Fernando Cornelio Jiménez; Villegas, Osslan Osiris Vergara; Ramírez, Dulce Esperanza Torres; Sánchez, Vianey Guadalupe Cruz; Domínguez, Humberto Ochoa
2014-01-01
Technological innovations in the field of disease prevention and maintenance of patient health have enabled the evolution of fields such as monitoring systems. One of the main advances is the development of real-time monitors that use intelligent and wireless communication technology. In this paper, a system is presented for the remote monitoring of the body temperature and heart rate of a patient by means of a wireless sensor network (WSN) and mobile augmented reality (MAR). The combination of a WSN and MAR provides a novel alternative to remotely measure body temperature and heart rate in real time during patient care. The system is composed of (1) hardware such as Arduino microcontrollers (in the patient nodes), personal computers (for the nurse server), smartphones (for the mobile nurse monitor and the virtual patient file) and sensors (to measure body temperature and heart rate), (2) a network layer using WiFly technology, and (3) software such as LabView, Android SDK, and DroidAR. The results obtained from tests show that the system can perform effectively within a range of 20 m and requires ten minutes to stabilize the temperature sensor to detect hyperthermia, hypothermia or normal body temperature conditions. Additionally, the heart rate sensor can detect conditions of tachycardia and bradycardia. PMID:25230306
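The monitoring logic described above amounts to mapping the two sensor readings onto the conditions the system reports. A minimal Python sketch is given below; the cut-off values are common textbook thresholds assumed for illustration, since the abstract does not state the exact values used:

def classify_vitals(temp_c, heart_rate_bpm):
    # Temperature state from the body-temperature sensor reading.
    if temp_c > 38.0:
        temp_state = "hyperthermia"
    elif temp_c < 35.0:
        temp_state = "hypothermia"
    else:
        temp_state = "normal"
    # Heart-rate state from the pulse sensor reading.
    if heart_rate_bpm > 100:
        hr_state = "tachycardia"
    elif heart_rate_bpm < 60:
        hr_state = "bradycardia"
    else:
        hr_state = "normal"
    return temp_state, hr_state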
Transforming Polar Research with Google Glass Augmented Reality (Invited)
NASA Astrophysics Data System (ADS)
Ruthkoski, T.
2013-12-01
Augmented reality is a new technology with the potential to accelerate the advancement of science, particularly in geophysical research. Augmented reality is defined as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. When paired with advanced computing techniques on cloud resources, augmented reality has the potential to improve data collection techniques, visualizations, as well as in-situ analysis for many areas of research. Google is currently a pioneer of augmented reality technology and has released beta versions of their wearable computing device, Google Glass, to a select number of developers and beta testers. This community of 'Glass Explorers' is the vehicle from which Google shapes the future of their augmented reality device. Example applications of Google Glass in geophysical research range from use as a data gathering interface in harsh climates to an on-site visualization and analysis tool. Early participation in the shaping of the Google Glass device is an opportunity for researchers to tailor this new technology to their specific needs. The purpose of this presentation is to provide geophysical researchers with a hands-on first look at Google Glass and its potential as a scientific tool. Attendees will be given an overview of the technical specifications as well as a live demonstration of the device. Potential applications to geophysical research in polar regions will be the primary focus. The presentation will conclude with an open call to participate, during which attendees may indicate interest in developing projects that integrate Google Glass into their research. Application Mockup: Penguin Counter Google Glass Augmented Reality Device
Kong, Seong-Ho; Haouchine, Nazim; Soares, Renato; Klymchenko, Andrey; Andreiuk, Bohdan; Marques, Bruno; Shabat, Galyna; Piechaud, Thierry; Diana, Michele; Cotin, Stéphane; Marescaux, Jacques
2017-07-01
Augmented reality (AR) is the fusion of computer-generated and real-time images. AR can be used in surgery as a navigation tool by creating a patient-specific virtual model through 3D software manipulation of DICOM imaging (e.g., a CT scan). The virtual model can be superimposed on real-time images, enabling transparent visualization of internal anatomy and accurate localization of tumors. However, the 3D model is rigid and does not take into account deformations of inner structures. We present a concept of automated AR registration, while the organs undergo deformation during surgical manipulation, based on finite element modeling (FEM) coupled with optical imaging of fluorescent surface fiducials. Two 10 × 1 mm wires (pseudo-tumors) and six 10 × 0.9 mm fluorescent fiducials were placed in ex vivo porcine kidneys (n = 10). Biomechanical FEM-based models were generated from CT scans. Kidneys were deformed, and the shape changes were identified by tracking the fiducials using a near-infrared optical system. The changes were registered automatically with the virtual model, which was deformed accordingly. Accuracy of the predicted pseudo-tumor location was evaluated with a CT scan in the deformed state (ground truth). In vivo, fluorescent fiducials were inserted under ultrasound guidance into the kidney of one pig, followed by a CT scan. The FEM-based virtual model was superimposed on laparoscopic images by automatic registration of the fiducials. Biomechanical models were successfully generated and accurately superimposed on optical images. The mean measured distance between the tumor estimated by biomechanical propagation and the scanned tumor (ground truth) was 0.84 ± 0.42 mm. All fiducials were successfully placed in the in vivo kidney and were well visualized in near-infrared mode, enabling accurate automatic registration of the virtual model on the laparoscopic images. Our preliminary experiments showed the potential of a biomechanical model with fluorescent fiducials to propagate the deformation of a solid organ's surface to its inner structures, including tumors, with good accuracy and automated, robust tracking.
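The registration step that aligns the tracked fiducials with their counterparts in the preoperative model can be illustrated with a rigid (Kabsch) alignment, sketched below in Python/NumPy; this covers only the rigid part, not the FEM-based deformation propagation, and the function name is illustrative:

import numpy as np

def register_fiducials(model_pts, observed_pts):
    # model_pts, observed_pts: N x 3 arrays of corresponding fiducial positions.
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)     # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T          # rotation mapping model -> observed
    t = oc - R @ mc                                  # translation
    return R, t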
NASA Astrophysics Data System (ADS)
McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.
2017-12-01
Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible with traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application a virtual object is projected into the real world, with which researchers can interact. There are several limitations to purely VR or AR applications in the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g., rocks, terrain, or other features) must be created offline from a multitude of images using image-processing techniques to generate the 3D mesh data that populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames lack 3D visual information, i.e., depth. In this paper, we present a technique that uses a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and the visual information from the real world, while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video presented in real time into the virtual environment; note the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purpose of taking screenshots.
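Recovering depth from the robot's rectified stereo pair follows the standard pinhole relation Z = f·B/d (focal length times baseline divided by disparity). A minimal sketch, with illustrative parameter names since the paper does not detail its stereo pipeline:

import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    # Returns a depth map in metres; invalid (non-positive) disparities map to infinity.
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full(disparity_px.shape, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth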
Chowriappa, Ashirwad; Raza, Syed Johar; Fazili, Anees; Field, Erinn; Malito, Chelsea; Samarasekera, Dinesh; Shi, Yi; Ahmed, Kamran; Wilding, Gregory; Kaouk, Jihad; Eun, Daniel D; Ghazi, Ahmed; Peabody, James O; Kesavadas, Thenkurussi; Mohler, James L; Guru, Khurshid A
2015-02-01
To validate robot-assisted surgery skills acquisition using an augmented reality (AR)-based module for urethrovesical anastomosis (UVA). Participants at three institutions were randomised to a Hands-on Surgical Training (HoST) technology group or a control group. The HoST group was given procedure-based training for UVA within the haptic-enabled AR-based HoST environment. The control group did not receive any training. After completing the task, the control group was offered the opportunity to cross over to the HoST group (cross-over group). A questionnaire administered after HoST determined the feasibility and acceptability of the technology. Performance of UVA using an inanimate model on the daVinci Surgical System (Intuitive Surgical Inc., Sunnyvale, CA, USA) was assessed using a UVA evaluation score and a Global Evaluative Assessment of Robotic Skills (GEARS) score. Participants completed the National Aeronautics and Space Administration Task Load Index (NASA TLX) questionnaire for cognitive assessment, as outcome measures. A Wilcoxon rank-sum test was used to compare outcomes among the groups (HoST group vs control group and control group vs cross-over group). A total of 52 individuals participated in the study. UVA evaluation scores showed significant differences in needle driving (3.0 vs 2.3; P = 0.042), needle positioning (3.0 vs 2.4; P = 0.033) and suture placement (3.4 vs 2.6; P = 0.014) in the HoST vs the control group. The HoST group obtained significantly higher scores (14.4 vs 11.9; P = 0.012) on the GEARS. The NASA TLX indicated lower temporal demand and effort in the HoST group (5.9 vs 9.3; P = 0.001 and 5.8 vs 11.9; P = 0.035, respectively). In all, 70% of participants found that HoST was similar to the real surgical procedure, and 75% believed that HoST could improve confidence for carrying out the real intervention. Training in UVA in an AR environment improves technical skill acquisition with minimal cognitive demand. © 2014 The Authors. BJU International © 2014 BJU International.
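The group comparison described above can be reproduced in outline with SciPy's rank-sum test; the score lists below are placeholders, not data from the study:

from scipy.stats import ranksums

host_scores = [14.0, 15.5, 13.0, 16.0, 14.5]      # e.g., GEARS scores, HoST group (placeholder values)
control_scores = [11.0, 12.5, 12.0, 10.5, 13.0]   # control group (placeholder values)
stat, p_value = ranksums(host_scores, control_scores)
print(f"rank-sum statistic = {stat:.2f}, p = {p_value:.3f}")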
My thoughts through a robot's eyes: an augmented reality-brain-machine interface.
Kansaku, Kenji; Hata, Naoki; Takano, Kouji
2010-02-01
A brain-machine interface (BMI) uses neurophysiological signals from the brain to control external devices, such as robot arms or computer cursors. Combining augmented reality with a BMI, we show that the user's brain signals successfully controlled an agent robot and operated devices in the robot's environment. The user's thoughts became reality through the robot's eyes, enabling the augmentation of real environments outside the anatomy of the human body.
Shen, Xin; Javidi, Bahram
2018-03-01
We have developed a three-dimensional (3D) dynamic integral-imaging (InIm) system-based optical see-through augmented reality display with an enhanced depth range for the 3D augmented image. A focus-tunable lens is adopted in the 3D display unit to relay elemental images at various positions to the microlens array. Based on resolution-priority integral imaging, multiple lenslet image planes are generated to enhance the depth range of the 3D image. The depth range is further increased by utilizing both the real and virtual 3D imaging fields. The 3D reconstructed image and the real-world scene are overlaid using an optical see-through display for augmented reality. The proposed system can significantly enhance the depth range of a 3D reconstructed image with high image quality in the micro InIm unit. This approach provides enhanced functionality for augmented information and alleviates the vergence-accommodation conflict of a traditional augmented reality display.
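How retuning the focus-tunable lens shifts the relayed elemental-image plane can be seen from the thin-lens relation 1/f = 1/d_obj + 1/d_img; the sketch below uses that simple model for illustration only and is not the paper's full optical design:

def relayed_image_distance(d_obj_mm, f_mm):
    # Image distance for an object d_obj_mm in front of a thin lens of focal length f_mm.
    if abs(d_obj_mm - f_mm) < 1e-12:
        return float("inf")  # object at the focal plane -> image at infinity
    return f_mm * d_obj_mm / (d_obj_mm - f_mm)

# Example: an elemental-image plane 20 mm from the lens, relayed with two focal settings.
for f in (15.0, 18.0):
    print(f, relayed_image_distance(20.0, f))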
Botella, Cristina; Pérez-Ara, M Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María
2016-01-01
Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. "One-session treatment" guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants' expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants in the in vivo exposure condition considered the treatment more useful for their problem, whereas participants in the Augmented Reality condition considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and is well accepted by the participants.
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
Basic Perception in Head-worn Augmented Reality Displays
2012-01-01
Livingston, Mark A.; Gabbard, Joseph L.; Swan, J. Edward, II; Sibley, Ciara M.; Jane H… — Naval Research Laboratory (mark.livingston@nrl.navy.mil); Virginia Polytechnic Institute and State University, Blacksburg, VA 24061, USA (jgabbard@vt.edu). Introduction: For many first-time users of augmented reality…
Wilson, Kenneth L; Doswell, Jayfus T; Fashola, Olatokunbo S; Debeatham, Wayne; Darko, Nii; Walker, Travelyan M; Danner, Omar K; Matthews, Leslie R; Weaver, William L
2013-09-01
This study aimed to extrapolate potential roles of augmented reality goggles as a clinical support tool assisting in the reduction of preventable causes of death on the battlefield. Our pilot study was designed to improve medic performance in accurately placing a large-bore catheter to relieve tension pneumothorax (prehospital setting) while using augmented reality goggles. Thirty-four preclinical medical students recruited from Morehouse School of Medicine performed needle decompressions on human cadaver models after hearing a brief training lecture on tension pneumothorax management. Clinical vignettes identifying cadavers as having life-threatening tension pneumothoraces as a consequence of improvised explosive device attacks were used. The study group (n = 13) performed needle decompression using augmented reality goggles, whereas the control group (n = 21) relied solely on memory from the lecture. The two groups were compared according to their ability to accurately complete the steps required to decompress a tension pneumothorax. The medical students using augmented reality goggle support were able to treat the tension pneumothorax on the human cadaver models more accurately than the students relying on their memory (p < 0.008). Although the augmented reality group required more time to complete the needle decompression intervention (p = 0.0684), this did not reach statistical significance. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review.
Detmer, Felicitas J; Hettig, Julian; Schindele, Daniel; Schostak, Martin; Hansen, Christian
2017-01-01
Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems solely developed or evaluated for training purposes. In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. Main challenges remaining for future research include the consideration of organ movement and deformation, human-factors issues, and the conduct of large clinical studies. Augmented and virtual reality systems have the potential to improve safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems, which are already applied in clinical practice. Further research is required to cope with current limitations of virtual and augmented reality assistance in clinical environments.
On the use of Augmented Reality techniques in learning and interpretation of cardiologic data.
Lamounier, Edgard; Bucioli, Arthur; Cardoso, Alexandre; Andrade, Adriano; Soares, Alcimar
2010-01-01
Augmented Reality is a technology that provides people with more intuitive ways of interaction and visualization, close to those in the real world. The number of applications using Augmented Reality is growing every day, and results can already be seen in several fields such as Education, Training, Entertainment, and Medicine. The system proposed in this article provides a friendly and intuitive Augmented Reality-based interface for evaluating and visualizing the heartbeat. Cardiologic data are loaded from several distinct sources: standard heart-rate patterns (for example, situations like running or sleeping), files of heartbeat signals, scanned electrocardiographs, and real-time acquisition of a patient's heartbeat. All these data are processed to produce visualizations within Augmented Reality environments. The results obtained in this research show that the developed system is able to simplify the understanding of how the heart beats and functions. Furthermore, the system can help health professionals retrieve, process, and convert data from all the sources handled by the system, with the support of an editing and visualization mode.
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.
2006-01-01
The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14°, 28°, and 47°) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47° are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.