ERIC Educational Resources Information Center
Taçgin, Zeynep; Arslan, Ahmet
2017-01-01
The purpose of this study is to determine the perceptions of postgraduate Computer Education and Instructional Technologies (CEIT) students regarding the concepts of Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), Augmented Virtuality (AV) and Mirror Reality, and to offer a table that includes differences and similarities between…
Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A
2016-11-01
To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinary students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined from hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessment in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion of each simulator were recorded. None of the motion metrics in the virtual reality simulator correlated with experience or with the basic laparoscopic skills score. All metrics in augmented reality (time, instrument path, and economy of movement) were significantly correlated with experience, except for the hand dominance metric. The basic laparoscopic skills score was correlated with all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas the basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct validity and concurrent validity for motion analysis metrics in an augmented reality system, whereas the virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
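As a brief illustration of the motion metrics named in the abstract above, the following sketch computes an instrument path length and an economy-of-motion ratio from tracked tip positions. The metric definitions and the synthetic track are assumptions for illustration, not the simulators' proprietary formulas.

```python
# Hypothetical sketch: computing instrument path length and an economy-of-motion
# ratio from sampled 3-D instrument-tip positions (mm). The simulators' exact
# metric definitions are not given in the abstract.
import numpy as np

def path_length(positions: np.ndarray) -> float:
    """Total distance travelled by the instrument tip; positions is (N, 3)."""
    steps = np.diff(positions, axis=0)                 # per-sample displacement
    return float(np.linalg.norm(steps, axis=1).sum())

def economy_of_motion(positions: np.ndarray) -> float:
    """Straight-line start-to-end distance divided by the actual path length
    (1.0 = perfectly direct movement); other conventions exist."""
    direct = float(np.linalg.norm(positions[-1] - positions[0]))
    total = path_length(positions)
    return direct / total if total > 0 else 0.0

# Synthetic example: a slightly wandering 100 mm reach sampled 200 times.
rng = np.random.default_rng(0)
track = np.linspace([0.0, 0.0, 0.0], [100.0, 0.0, 0.0], 200) + rng.normal(0, 0.5, (200, 3))
print(path_length(track), economy_of_motion(track))
```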
Transduction between worlds: using virtual and mixed reality for earth and planetary science
NASA Astrophysics Data System (ADS)
Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.
2017-12-01
Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.
Architecture and Key Techniques of Augmented Reality Maintenance Guiding System for Civil Aircrafts
NASA Astrophysics Data System (ADS)
Hong, Zhou; Wenhua, Lu
2017-01-01
Augmented reality technology is introduced into the maintenance field to strengthen the information available in real-world scenarios by integrating virtual assistant maintenance information with the real scene. This can lower the difficulty of maintenance, reduce maintenance errors, and improve the maintenance efficiency and quality of civil aviation crews. An architecture for an augmented reality virtual maintenance guiding system is proposed on the basis of introducing the definition of augmented reality and analyzing the characteristics of augmented reality virtual maintenance. The key techniques involved, such as the standardization and organization of maintenance data, 3D registration, the modeling of maintenance guidance information, and virtual maintenance man-machine interaction, are elaborated, and solutions are given.
Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room.
Tepper, Oren M; Rudy, Hayeem L; Lefkowitz, Aaron; Weimer, Katie A; Marks, Shelby M; Stern, Carrie S; Garfein, Evan S
2017-11-01
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented reality features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons with intraoperative hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
Reality Check: Basics of Augmented, Virtual, and Mixed Reality.
Brigham, Tara J
2017-01-01
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
Hybrid Reality Lab Capabilities - Video 2
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2016-01-01
Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or see-through window) and places digitally created information into the scene so that it matches the video/glass information. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
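The phrase "places them in the same coordinate system" can be made concrete with a standard rigid-registration step: estimating the rotation and translation that map reference points measured on the physical prop (tracker frame) onto the matching points of its virtual twin (scene frame). The sketch below uses the generic Kabsch/orthogonal-Procrustes solution with invented point values; it is not the lab's actual implementation.

```python
# Generic sketch (assumed, not the lab's code): estimate the rigid transform
# that maps points measured in the physical tracker frame onto the matching
# points of the virtual model, so both representations share one frame.
import numpy as np

def rigid_transform(tracker_pts: np.ndarray, virtual_pts: np.ndarray):
    """Least-squares rotation R and translation t with virtual ~= R @ tracker + t."""
    ct, cv = tracker_pts.mean(axis=0), virtual_pts.mean(axis=0)
    H = (tracker_pts - ct).T @ (virtual_pts - cv)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cv - R @ ct
    return R, t

# Example: corresponding points on the physical mockup (tracker frame, metres)
# and on its virtual twin (scene frame); values are made up.
tracker = np.array([[0.0, 0, 0], [0.5, 0, 0], [0, 0.3, 0], [0, 0, 0.2]])
scene   = np.array([[1.0, 2, 0], [1.0, 2.5, 0], [0.7, 2, 0], [1.0, 2, 0.2]])
R, t = rigid_transform(tracker, scene)
print(np.allclose(tracker @ R.T + t, scene, atol=1e-6))   # True: frames aligned
```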
NASA Astrophysics Data System (ADS)
De Mauro, Alessandro; Ardanza, Aitor; Monge, Esther; Molina Rueda, Francisco
2013-03-01
Several studies have shown that both virtual and augmented reality are technologies suitable for rehabilitation therapy due to their inherent ability to simulate real daily-life activities while improving patient motivation. In this paper we will first present the state of the art in the use of virtual and augmented reality applications for the rehabilitation of motor disorders, and second we will focus on the analysis of the results of our project. In particular, the requirements of patients with cerebrovascular accidents, spinal cord injuries and cerebral palsy for the use of virtual and augmented reality systems will be detailed.
Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J.; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng
2017-01-01
Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the "integrated image" on semi-transparent glass. Via the registration process, the integrated image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications. PMID:28198442
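The abstract does not detail how the fiducial marker on the occlusal splint is turned into an overlay pose. As a hedged illustration, a common approach is to estimate the marker's pose from its known 3D corner geometry and the detected 2D image corners (for example with OpenCV's solvePnP) and then render the CT-derived model with that pose; all numeric values below are placeholders.

```python
# Illustrative only: not the authors' pipeline. Marker-based registration is
# commonly done by solving the perspective-n-point problem for the fiducial.
import numpy as np
import cv2

# Known marker corner geometry (mm) in the marker's own frame - made-up values.
marker_3d = np.array([[0, 0, 0], [40, 0, 0], [40, 40, 0], [0, 40, 0]], dtype=np.float64)
# Detected 2-D corner positions in the camera image (pixels) - placeholders.
corners_2d = np.array([[312, 220], [401, 224], [398, 310], [309, 306]], dtype=np.float64)
# Camera intrinsics from a prior calibration - placeholders.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(marker_3d, corners_2d, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # marker orientation relative to the camera
    # The virtual (CT-derived) model would be rendered with pose [R | tvec]
    # so it appears superimposed on the real scene in the see-through display.
```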
Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review.
Detmer, Felicitas J; Hettig, Julian; Schindele, Daniel; Schostak, Martin; Hansen, Christian
2017-01-01
Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems solely developed or evaluated for training purposes. In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. Main challenges remaining for future research include the consideration of organ movement and deformation, human factors issues, and the conduct of large clinical studies. Augmented and virtual reality systems have the potential to improve safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems, which are already applied in clinical practice. Further research is required to cope with current limitations of virtual and augmented reality assistance in clinical environments.
Augmenting the Thermal Flux Experiment: A Mixed Reality Approach with the HoloLens
ERIC Educational Resources Information Center
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-01-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress during the last years and also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted…
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of 3D virtual reality through interactive programs. Based on the Virtual Reality Modeling Language, building a virtual museum and achieving effective interaction with the offline museum depend on making full use of the 3D panorama, virtual reality and augmented reality techniques, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration and other key virtual reality techniques to ensure the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is based on static images of reality. The virtual reality technique is a computer simulation system that creates an interactive 3D dynamic visual world that users can experience. Augmented reality, also known as mixed reality, is a technique which simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience in reality. These technologies make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
The Influences of the 2D Image-Based Augmented Reality and Virtual Reality on Student Learning
ERIC Educational Resources Information Center
Liou, Hsin-Hun; Yang, Stephen J. H.; Chen, Sherry Y.; Tarng, Wernhuar
2017-01-01
Virtual reality (VR) learning environments can provide students with concepts of the simulated phenomena, but users are not allowed to interact with real elements. Conversely, augmented reality (AR) learning environments blend real-world environments so AR could enhance the effects of computer simulation and promote students' realistic experience.…
ERIC Educational Resources Information Center
Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco
2015-01-01
The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…
A novel augmented reality system of image projection for image-guided neurosurgery.
Mahvash, Mehran; Besharati Tabrizi, Leila
2013-05-01
Augmented reality systems combine virtual images with a real environment. The aim was to design and develop an augmented reality system for image-guided surgery of brain tumors using image projection. A virtual image was created in two ways: (1) an MRI-based 3D model of the head matched with the segmented lesion of a patient using MRIcro software (version 1.4, freeware, Chris Rorden) and (2) a digital photograph-based model in which the tumor region was drawn using image-editing software. The real environment was simulated with a head phantom. For direct projection of the virtual image onto the head phantom, a commercially available video projector (PicoPix 1020, Philips) was used. The position and size of the virtual image were adjusted manually for registration, which was performed using anatomical landmarks and fiducial marker positions. An augmented reality system for image-guided neurosurgery using direct image projection has been designed successfully and implemented in a first evaluation with promising results. The virtual image could be projected onto the head phantom and was registered manually. Accurate registration (mean projection error: 0.3 mm) was achieved using anatomical landmarks and fiducial marker positions. The direct projection of a virtual image onto the patient's head, skull, or brain surface in real time is an augmented reality approach that can be used for image-guided neurosurgery. In this paper, the first evaluation of the system is presented. The encouraging first visualization results indicate that the presented augmented reality system might be an important enhancement of image-guided neurosurgery.
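A mean projection error such as the 0.3 mm reported above is typically obtained by comparing projected landmark positions against their measured positions on the phantom. The short sketch below shows that calculation with invented coordinates.

```python
# Minimal sketch of a mean projection-error calculation; the landmark
# coordinates are hypothetical, not taken from the paper.
import numpy as np

projected = np.array([[10.2, 5.1], [22.9, 14.8], [35.3, 8.0]])   # mm, projected landmarks
measured  = np.array([[10.0, 5.0], [23.0, 15.0], [35.0, 8.2]])   # mm, measured on phantom
errors = np.linalg.norm(projected - measured, axis=1)            # per-landmark distance
print(f"mean projection error: {errors.mean():.2f} mm")
```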
Virtual reality, augmented reality…I call it i-Reality.
Grossmann, Rafael J
2015-01-01
The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients.
AR Feels "Softer" than VR: Haptic Perception of Stiffness in Augmented versus Virtual Reality.
Gaffary, Yoren; Le Gouis, Benoit; Marchal, Maud; Argelaguet, Ferran; Arnaldi, Bruno; Lecuyer, Anatole
2017-11-01
Does it feel the same when you touch an object in Augmented Reality (AR) or in Virtual Reality (VR)? In this paper we study and compare the haptic perception of stiffness of a virtual object in two situations: (1) a purely virtual environment versus (2) a real and augmented environment. We designed an experimental setup based on a Microsoft HoloLens and a haptic force-feedback device, enabling participants to press a virtual piston and compare its stiffness successively in either Augmented Reality (the virtual piston is surrounded by several real objects, all located inside a cardboard box) or Virtual Reality (the same virtual piston is displayed in a fully virtual scene composed of the same other objects). We conducted a psychophysical experiment with 12 participants. Our results show a surprising bias in perception between the two conditions. The virtual piston is on average perceived as stiffer in the VR condition than in the AR condition. For instance, when the piston had the same stiffness in AR and VR, participants selected the VR piston as the stiffer one in 60% of cases. This suggests a psychological effect, as if objects in AR felt "softer" than in pure VR. Taken together, our results open new perspectives on perception in AR versus VR and pave the way for future studies aiming at characterizing potential perceptual biases.
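One simple way to ask whether the reported 60% "VR feels stiffer" rate differs from the 50% chance level is an exact binomial test; the trial counts below are assumed for illustration, and the paper's own psychophysical analysis may differ.

```python
# Hedged illustration, not the authors' analysis: exact binomial test of the
# equal-stiffness comparisons against the 50 % chance level.
from scipy.stats import binomtest

n_trials = 120        # assumed number of pooled equal-stiffness comparisons
n_vr_stiffer = 72     # 60 % of them judged the VR piston stiffer
result = binomtest(n_vr_stiffer, n_trials, p=0.5, alternative="two-sided")
print(result.pvalue)
```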
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao
2013-01-01
virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.
2006-01-01
The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14 deg, 28 deg, and 47 deg) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47 deg are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.
Encarnação, L Miguel; Bimber, Oliver
2002-01-01
Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches have promised to provide valuable means for the interactive data analysis involved, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical deployment. This paper addresses two of the shortcomings of such technology: intuitive interaction with multi-dimensional data in immersive and semi-immersive environments, and stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.
Hung, Andrew J; Shah, Swar H; Dalag, Leonard; Shin, Daniel; Gill, Inderbir S
2015-08-01
We developed a novel procedure specific simulation platform for robotic partial nephrectomy. In this study we prospectively evaluate its face, content, construct and concurrent validity. This hybrid platform features augmented reality and virtual reality. Augmented reality involves 3-dimensional robotic partial nephrectomy surgical videos overlaid with virtual instruments to teach surgical anatomy, technical skills and operative steps. Advanced technical skills are assessed with an embedded full virtual reality renorrhaphy task. Participants were classified as novice (no surgical training, 15), intermediate (less than 100 robotic cases, 13) or expert (100 or more robotic cases, 14) and prospectively assessed. Cohort performance was compared with the Kruskal-Wallis test (construct validity). Post-study questionnaire was used to assess the realism of simulation (face validity) and usefulness for training (content validity). Concurrent validity evaluated correlation between virtual reality renorrhaphy task and a live porcine robotic partial nephrectomy performance (Spearman's analysis). Experts rated the augmented reality content as realistic (median 8/10) and helpful for resident/fellow training (8.0-8.2/10). Experts rated the platform highly for teaching anatomy (9/10) and operative steps (8.5/10) but moderately for technical skills (7.5/10). Experts and intermediates outperformed novices (construct validity) in efficiency (p=0.0002) and accuracy (p=0.002). For virtual reality renorrhaphy, experts outperformed intermediates on GEARS metrics (p=0.002). Virtual reality renorrhaphy and in vivo porcine robotic partial nephrectomy performance correlated significantly (r=0.8, p <0.0001) (concurrent validity). This augmented reality simulation platform displayed face, content and construct validity. Performance in the procedure specific virtual reality task correlated highly with a porcine model (concurrent validity). Future efforts will integrate procedure specific virtual reality tasks and their global assessment. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
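The two statistical tools named in this abstract, a Kruskal-Wallis test across cohorts (construct validity) and a Spearman correlation between simulator and live porcine performance (concurrent validity), can be sketched as follows; the scores are fabricated placeholders, not the study's data.

```python
# Placeholder illustration of the two tests cited in the abstract.
from scipy.stats import kruskal, spearmanr

novice       = [42, 55, 48, 60, 51]      # fabricated performance scores
intermediate = [68, 72, 65, 70, 74]
expert       = [85, 88, 90, 82, 87]
H, p_construct = kruskal(novice, intermediate, expert)      # construct validity

sim_scores  = [70, 75, 82, 88, 91, 65]   # virtual reality renorrhaphy task
live_scores = [68, 77, 80, 90, 93, 62]   # live porcine partial nephrectomy
rho, p_concurrent = spearmanr(sim_scores, live_scores)      # concurrent validity
print(H, p_construct, rho, p_concurrent)
```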
Cabrilo, Ivan; Bijlenga, Philippe; Schaller, Karl
2014-09-01
Augmented reality technology has been used for intraoperative image guidance through the overlay of virtual images, from preoperative imaging studies, onto the real-world surgical field. Although setups based on augmented reality have been used for various neurosurgical pathologies, very few cases have been reported for the surgery of arteriovenous malformations (AVM). We present our experience with AVM surgery using a system designed for image injection of virtual images into the operating microscope's eyepiece, and discuss why augmented reality may be less appealing in this form of surgery. N = 5 patients underwent AVM resection assisted by augmented reality. Virtual three-dimensional models of patients' heads, skulls, AVM nidi, and feeder and drainage vessels were selectively segmented and injected into the microscope's eyepiece for intraoperative image guidance, and their usefulness was assessed in each case. Although the setup helped in performing tailored craniotomies, in guiding dissection and in localizing drainage veins, it did not provide the surgeon with useful information concerning feeder arteries, due to the complexity of AVM angioarchitecture. The difficulty in intraoperatively conveying useful information on feeder vessels may make augmented reality a less engaging tool in this form of surgery, and might explain its underrepresentation in the literature. Integrating an AVM's hemodynamic characteristics into the augmented rendering could make it more suited to AVM surgery.
An Analysis of Engagement in a Combination Indoor/Outdoor Augmented Reality Educational Game
ERIC Educational Resources Information Center
Folkestad, James; O'Shea, Patrick
2011-01-01
This paper describes the results of a qualitative analysis of video captured during a dual indoor/outdoor Augmented Reality experience. Augmented Reality is the layering of virtual information on top of the physical world. This Augmented Reality experience asked students to interact with the San Diego Museum of Art and the Botanical Gardens in San…
ARSC: Augmented Reality Student Card--An Augmented Reality Solution for the Education Field
ERIC Educational Resources Information Center
El Sayed, Neven A. M.; Zayed, Hala H.; Sharawy, Mohamed I.
2011-01-01
Augmented Reality (AR) is the technology of adding virtual objects to real scenes by enabling the addition of missing information to real life. As the lack of resources is a problem that can be solved through AR, this paper presents and explains the usage of AR technology; we introduce the Augmented Reality Student Card (ARSC) as an application of…
Immersive Education, an Annotated Webliography
ERIC Educational Resources Information Center
Pricer, Wayne F.
2011-01-01
In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Mandal, Avikarsha; Javahiraly, Nicolas; Curticapean, Dan
2016-09-01
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware. To carry out, e.g., a lens experiment, the students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let the students prepare with a simulated experimental setup. Within the context of our intended blended learning concept we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility of interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that tracks the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through the position, movement and gestures of their own hands. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and point out useful and less useful application scenarios.
Nifakos, Sokratis; Zary, Nabil
2014-01-01
The research community has called for the development of effective educational interventions addressing prescription behaviour, since antimicrobial resistance remains a global health issue. Examining the potential to move the educational process from personal computers to mobile devices, in this paper we investigated a new method of integrating Virtual Patients into mobile devices with augmented reality technology, enriching the practitioner's education in prescription behaviour. We also explored which information is critical during prescription-behaviour education and visualized this information in a real context with augmented reality technology, simultaneously with a running Virtual Patient scenario. Following this process, we set the educational frame of experiential knowledge in a mixed (virtual and real) environment.
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface"; it is "a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined, and examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
Augmented Virtual Reality: How to Improve Education Systems
ERIC Educational Resources Information Center
Fernandez, Manuel
2017-01-01
This essay presents and discusses the developing role of virtual and augmented reality technologies in education. Addressing the challenges in adapting such technologies to focus on improving students' learning outcomes, the author discusses the inclusion of experiential modes as a vehicle for improving students' knowledge acquisition.…
Markerless client-server augmented reality system with natural features
NASA Astrophysics Data System (ADS)
Ning, Shuangning; Sang, Xinzhu; Chen, Duo
2017-10-01
A markerless client-server augmented reality system is presented. In this research, the more extensive and mature virtual reality head-mounted display is adopted to assist the implementation of augmented reality. The viewer is presented an image in front of their eyes with the head-mounted display. The front-facing camera is used to capture video signals into the workstation. The generated virtual scene is merged with the outside-world information received from the camera, and the integrated video is sent to the helmet display system. The distinguishing feature and novelty is that the augmented reality is realized with natural features instead of a marker, which addresses the limitations of markers: they are restricted to black and white, are unsuitable for varying environmental conditions, and in particular fail when the marker is partially occluded. Further, 3D stereoscopic perception of the virtual animation model is achieved. A high-speed and stable native socket communication method is adopted for transmission of the key video stream data, which reduces the computational burden of the system.
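The socket-based transmission of the key video stream described above can be illustrated with a minimal length-prefixed TCP framing scheme; this is a conceptual sketch under assumed framing, since the paper does not specify its protocol.

```python
# Conceptual sketch only: stream encoded video frames over a TCP socket by
# prefixing each frame with a 4-byte length so the receiver can delimit frames.
import socket
import struct

def send_frame(sock: socket.socket, frame_bytes: bytes) -> None:
    """Send one encoded frame (e.g. a JPEG buffer) with a length prefix."""
    sock.sendall(struct.pack("!I", len(frame_bytes)) + frame_bytes)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the TCP stream."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed while receiving")
        buf += chunk
    return buf

def recv_frame(sock: socket.socket) -> bytes:
    """Receive one length-prefixed frame sent by send_frame()."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)
```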
Virtual interactive presence and augmented reality (VIPAR) for remote surgical assistance.
Shenai, Mahesh B; Dillavou, Marcus; Shum, Corey; Ross, Douglas; Tubbs, Richard S; Shih, Alan; Guthrie, Barton L
2011-03-01
Surgery is a highly technical field that combines continuous decision-making with the coordination of spatiovisual tasks. We designed a virtual interactive presence and augmented reality (VIPAR) platform that allows a remote surgeon to deliver real-time virtual assistance to a local surgeon, over a standard Internet connection. The VIPAR system consisted of a "local" and a "remote" station, each situated over a surgical field and a blue screen, respectively. Each station was equipped with a digital viewpiece, composed of 2 cameras for stereoscopic capture, and a high-definition viewer displaying a virtual field. The virtual field was created by digitally compositing selected elements within the remote field into the local field. The viewpieces were controlled by workstations mutually connected by the Internet, allowing virtual remote interaction in real time. Digital renderings derived from volumetric MRI were added to the virtual field to augment the surgeon's reality. For demonstration, a fixed-formalin cadaver head and neck were obtained, and a carotid endarterectomy (CEA) and pterional craniotomy were performed under the VIPAR system. The VIPAR system allowed for real-time, virtual interaction between a local (resident) and remote (attending) surgeon. In both carotid and pterional dissections, major anatomic structures were visualized and identified. Virtual interaction permitted remote instruction for the local surgeon, and MRI augmentation provided spatial guidance to both surgeons. Camera resolution, color contrast, time lag, and depth perception were identified as technical issues requiring further optimization. Virtual interactive presence and augmented reality provide a novel platform for remote surgical assistance, with multiple applications in surgical training and remote expert assistance.
NASA Astrophysics Data System (ADS)
Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.
2013-07-01
Terrestrial Laser Scanning (TLS), 3-D modeling and Web visualization are the three key steps needed to store cultural heritage and to grant free and wide access to it, as highlighted in many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey with both a rich history and delicate architecture. The virtuality is considered in two ways: flowing navigation in a virtual reality environment around the abbey, and a game activity using augmented reality. First of all, the data acquisition consists of a GPS and tacheometry survey, terrestrial laser scanning and photography. After data pre-processing, the meshed and textured 3-D model is generated using the 3D Reshaper commercial software. The virtual reality visit and augmented reality animation are then created using the Unity software. This work shows the interest of such tools in bringing out the regional cultural heritage and making it attractive to the public.
Baus, Oliver; Bouchard, Stéphane
2014-01-01
This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements to then merge them into the view of the physical world. Although, the general public may only have become aware of AR in the last few years, AR type applications have been around since beginning of the twentieth century. Since, then, technological developments have enabled an ever increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows the exposure to stimuli which, due to various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed “safely” to the object(s) of their fear, without the costs associated with programing complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user’s experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, as well as the potential use of ARET to treat non-small animal phobias, such as social phobia. PMID:24624073
What is going on in augmented reality simulation in laparoscopic surgery?
Botden, Sanne M B I; Jakimowicz, Jack J
2009-08-01
To prevent unnecessary errors and adverse results of laparoscopic surgery, proper training is of paramount importance. A safe way to train surgeons in laparoscopic skills is simulation. For this purpose traditional box trainers are often used; however, they lack objective assessment of performance. Virtual reality laparoscopic simulators assess performance, but lack realistic haptic feedback. Augmented reality (AR) combines a virtual reality (VR) setting with real physical materials, instruments, and feedback. This article presents the current developments in augmented reality laparoscopic simulation. PubMed searches were performed to identify articles regarding surgical simulation and augmented reality. Identified companies manufacturing an AR laparoscopic simulator received the same questionnaire referring to the features of the simulator. Seven simulators that fitted the definition of augmented reality were identified during the literature search. Five of the approached manufacturers returned a completed questionnaire, of which one simulator appeared to be VR and was therefore not applicable for this review. Several augmented reality simulators have been developed over the past few years and they are improving rapidly. We recommend the development of AR laparoscopic simulators for component tasks of procedural training. AR simulators should be implemented in current laparoscopic training curricula, in particular for laparoscopic suturing training.
Mobile devices, Virtual Reality, Augmented Reality, and Digital Geoscience Education.
NASA Astrophysics Data System (ADS)
Crompton, H.; De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.
2016-12-01
Mobile devices are playing an increasing role in geoscience education. Affordances include instructor-student communication and class management in large classrooms, virtual and augmented reality applications, digital mapping, and crowd-sourcing. Mobile technologies have spawned the subfield of mobile learning, or m-learning, which is defined as learning across multiple contexts through social and content interactions. Geoscientists have traditionally engaged in non-digital mobile learning via fieldwork, but digital devices are greatly extending the possibilities, especially for non-traditional students. Smartphones and tablets are the most common devices, but smart glasses such as Pivothead enable live streaming of a first-person view (see, for example, https://youtu.be/gWrDaYP5w58). Virtual reality headsets such as Google Cardboard create an immersive virtual field experience, and digital imagery such as GigaPan and Structure from Motion enables instructors and/or students to create virtual specimens and outcrops that are sharable across the globe. Whereas virtual reality (VR) replaces the real world with a virtual representation, augmented reality (AR) overlays digital data on the live scene visible to the user in real time. We have previously reported on our use of the AR application called FreshAiR for geoscientific "egg hunts." The popularity of Pokémon Go demonstrates the potential of AR for mobile learning in the geosciences.
ERIC Educational Resources Information Center
Orman, Evelyn K.; Price, Harry E.; Russell, Christine R.
2017-01-01
Acquiring nonverbal skills necessary to appropriately communicate and educate members of performing ensembles is essential for wind band conductors. Virtual reality learning environments (VRLEs) provide a unique setting for developing these proficiencies. For this feasibility study, we used an augmented immersive VRLE to enhance eye contact, torso…
Berryman, Donna R
2012-01-01
Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital or computer created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries.
Cranial implant design using augmented reality immersive system.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2007-01-01
Software tools that utilize haptics for sculpting precisely fitting cranial implants are employed in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant-modeling tele-immersive augmented reality system in which the modeler can build a patient-specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant-modeling augmented reality system includes stereo vision, viewer-centered perspective, a sense of touch, and collaboration. To achieve optimized performance, the system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, a fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time-consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small-group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with a precise fit.
ERIC Educational Resources Information Center
Montoya, Mauricio Hincapié; Díaz, Christian Andrés; Moreno, Gustavo Adolfo
2017-01-01
Nowadays, the use of technology to improve teaching and learning experiences in the classroom has been promoted. One of these technologies is augmented reality, which allows overlaying layers of virtual information on real scene with the aim of increasing the perception that user has of reality. Augmented reality has proved to offer several…
Kiryu, Tohru; So, Richard H Y
2007-09-25
Around three years ago, in the special issue on augmented and virtual reality in rehabilitation, the topic of simulator sickness was briefly discussed in relation to vestibular rehabilitation. Simulator sickness with virtual reality applications has also been referred to as visually induced motion sickness or cybersickness. Recently, studies on cybersickness have been reported in entertainment, training, gaming, and medical environments in several journals. Virtual stimuli can enlarge the sensation of presence, but they sometimes also evoke unpleasant sensations. In order to safely apply augmented and virtual reality in long-term rehabilitation treatment, the sensation of presence and cybersickness should be appropriately controlled. This issue presents the results of five studies conducted to evaluate visually induced effects and to speculate on the influences of virtual rehabilitation. In particular, the influence of visual and vestibular stimuli on cardiovascular responses is reported in terms of academic contribution.
NASA Astrophysics Data System (ADS)
Ribeiro, Allan; Santos, Helen
With the advent of new information and communication technologies (ICTs), communicative interaction changes the way people are and act, and at the same time changes the way work activities related to education are carried out. Within the range of possibilities provided by the advancement of computational resources are virtual reality (VR) and augmented reality (AR), which stand out as new forms of information visualization in computer applications. While VR allows user interaction with a virtual environment that is entirely computer generated, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as systems at the nanoscale and of low dimensionality, and it is imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications developed with the use of modeling and simulation in computational approaches to topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.
Kamel Boulos, Maged N; Lu, Zhihan; Guerrero, Paul; Jennett, Charlene; Steed, Anthony
2017-02-20
The latest generation of virtual and mixed reality hardware has rekindled interest in virtual reality GIS (VRGIS) and augmented reality GIS (ARGIS) applications in health, and opened up new and exciting opportunities and possibilities for using these technologies in the personal and public health arenas. From smart urban planning and emergency training to Pokémon Go, this article offers a snapshot of some of the most remarkable VRGIS and ARGIS solutions for tackling public and environmental health problems, and bringing about safer and healthier living options to individuals and communities. The article also covers the main technical foundations and issues underpinning these solutions.
ERIC Educational Resources Information Center
Yang, Mau-Tsuen; Liao, Wan-Che
2014-01-01
The physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersions. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…
Augmented Reality for Close Quarters Combat
None
2018-01-16
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). The system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.
Augmenting Your Own Reality: Student Authoring of Science-Based Augmented Reality Games
ERIC Educational Resources Information Center
Klopfer, Eric; Sheldon, Josh
2010-01-01
Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent…
Learning Anatomy via Mobile Augmented Reality: Effects on Achievement and Cognitive Load
ERIC Educational Resources Information Center
Küçük, Sevda; Kapakin, Samet; Göktas, Yüksel
2016-01-01
Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allow users to interact with the…
NASA Astrophysics Data System (ADS)
Barrile, V.; Bilotta, G.; Meduri, G. M.; De Carlo, D.; Nunnari, A.
2017-11-01
In this study, technologies such as laser scanning and ground-penetrating radar (GPR) were used to assess their potential for cultural heritage. With regard to processing, we compared the results obtained with various commercial software packages and with algorithms developed and implemented in Matlab. Moreover, Virtual Reality and Augmented Reality allow integrating the real world with historical-artistic information, laser scanner and georadar (GPR) data, and virtual objects, virtually enriching it with multimedia elements and graphic and textual information accessible through smartphones and tablets.
Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory
ERIC Educational Resources Information Center
Andujar, J. M.; Mejias, A.; Marquez, M. A.
2011-01-01
Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and unsuited to completely nonclassroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…
Nomura, Tsutomu; Mamada, Yasuhiro; Nakamura, Yoshiharu; Matsutani, Takeshi; Hagiwara, Nobutoshi; Fujita, Isturo; Mizuguchi, Yoshiaki; Fujikura, Terumichi; Miyashita, Masao; Uchida, Eiji
2015-11-01
Definitive assessment of laparoscopic skill improvement after virtual reality simulator training is best obtained during an actual operation. However, this is impossible in medical students. Therefore, we developed an alternative assessment technique using an augmented reality simulator. Nineteen medical students completed a 6-week training program using a virtual reality simulator (LapSim). The pretest and post-test were performed using an object-positioning module and cholecystectomy on an augmented reality simulator (ProMIS). The mean performance measures between pre- and post-training on the LapSim were compared with a paired t-test. In the object-positioning module, the execution time of the task (P < 0.001), left and right instrument path length (P = 0.001), and left and right instrument economy of movement (P < 0.001) were significantly shorter after than before the LapSim training. With respect to improvement in laparoscopic cholecystectomy using a gallbladder model, the execution time to identify, clip, and cut the cystic duct and cystic artery as well as the execution time to dissect the gallbladder away from the liver bed were both significantly shorter after than before the LapSim training (P = 0.01). Our training curriculum using a virtual reality simulator improved the operative skills of medical students as objectively evaluated by assessment using an augmented reality simulator instead of an actual operation. We hope that these findings help to establish an effective training program for medical students. © 2015 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and Wiley Publishing Asia Pty Ltd.
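The paired pre/post comparison used in this abstract can be sketched with a paired t-test on execution times; the numbers below are invented for illustration, not the study's measurements.

```python
# Minimal sketch of the paired comparison: pre- vs post-training execution
# times for the same students, compared with a paired t-test.
from scipy.stats import ttest_rel

pre_time  = [182, 240, 205, 198, 221, 176, 230]   # seconds, before training (invented)
post_time = [121, 160, 140, 135, 150, 118, 155]   # seconds, after training (invented)
t, p = ttest_rel(pre_time, post_time)
print(f"t = {t:.2f}, p = {p:.4f}")
```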
Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery.
Pelargos, Panayiotis E; Nagasawa, Daniel T; Lagman, Carlito; Tenn, Stephen; Demos, Joanna V; Lee, Seung J; Bui, Timothy T; Barnette, Natalie E; Bhatt, Nikhilesh S; Ung, Nolan; Bari, Ausaf; Martin, Neil A; Yang, Isaac
2017-01-01
Neurosurgery has undergone a technological revolution over the past several decades, from trephination to image-guided navigation. Advancements in virtual reality (VR) and augmented reality (AR) represent some of the newest modalities being integrated into neurosurgical practice and resident education. In this review, we present a historical perspective of the development of VR and AR technologies, analyze their current uses, and discuss their emerging applications in the field of neurosurgery. Copyright © 2016 Elsevier Ltd. All rights reserved.
The Reality of Virtual Reality Product Development
NASA Astrophysics Data System (ADS)
Dever, Clark
Virtual Reality and Augmented Reality are emerging areas of research and product development in enterprise companies. This talk will discuss industry-standard tools and current areas of application in the commercial market. Attendees will gain insights into how to research, design, and (most importantly) ship world-class products. The presentation will recount the lessons learned to date developing a Virtual Reality tool to solve physics problems resulting from trying to perform aircraft maintenance on ships at sea.
ERIC Educational Resources Information Center
Chen, Yu-Hsuan; Wang, Chang-Hwa
2018-01-01
Although research has indicated that augmented reality (AR)-facilitated instruction improves learning performance, further investigation of the usefulness of AR from a psychological perspective has been recommended. Researchers consider presence a major psychological effect when users are immersed in virtual reality environments. However, most…
Rothbaum, Barbara Olasov; Price, Matthew; Jovanovic, Tanja; Norrholm, Seth D; Gerardi, Maryrose; Dunlop, Boadie; Davis, Michael; Bradley, Bekh; Duncan, Erica J; Rizzo, Albert; Ressler, Kerry J
2014-06-01
The authors examined the effectiveness of virtual reality exposure augmented with D-cycloserine or alprazolam, compared with placebo, in reducing posttraumatic stress disorder (PTSD) due to military trauma. After an introductory session, five sessions of virtual reality exposure were augmented with D-cycloserine (50 mg) or alprazolam (0.25 mg) in a double-blind, placebo-controlled randomized clinical trial for 156 Iraq and Afghanistan war veterans with PTSD. PTSD symptoms significantly improved from pre- to posttreatment across all conditions and were maintained at 3, 6, and 12 months. There were no overall differences in symptoms between D-cycloserine and placebo at any time. Alprazolam and placebo differed significantly on the Clinician-Administered PTSD Scale score at posttreatment and PTSD diagnosis at 3 months posttreatment; the alprazolam group showed a higher rate of PTSD (82.8%) than the placebo group (47.8%). Between-session extinction learning was a treatment-specific enhancer of outcome for the D-cycloserine group only. At posttreatment, the D-cycloserine group had the lowest cortisol reactivity and smallest startle response during virtual reality scenes. A six-session virtual reality treatment was associated with reduction in PTSD diagnoses and symptoms in Iraq and Afghanistan veterans, although there was no control condition for the virtual reality exposure. There was no advantage of D-cycloserine for PTSD symptoms in primary analyses. In secondary analyses, alprazolam impaired recovery and D-cycloserine enhanced virtual reality outcome in patients who demonstrated within-session learning. D-cycloserine augmentation reduced cortisol and startle reactivity more than did alprazolam or placebo, findings that are consistent with those in the animal literature.
The NASA Augmented/Virtual Reality Lab: The State of the Art at KSC
NASA Technical Reports Server (NTRS)
Little, William
2017-01-01
The NASA Augmented Virtual Reality (AVR) Lab at Kennedy Space Center is dedicated to the investigation of Augmented Reality (AR) and Virtual Reality (VR) technologies, with the goal of determining potential uses of these technologies as human-computer interaction (HCI) devices in an aerospace engineering context. Begun in 2012, the AVR Lab has concentrated on commercially available AR and VR devices that are gaining in popularity and use in a number of fields such as gaming, training, and telepresence. We are working with such devices as the Microsoft Kinect, the Oculus Rift, the Leap Motion, the HTC Vive, motion capture systems, and the Microsoft Hololens. The focus of our work has been on human interaction with the virtual environment, which in turn acts as a communications bridge to remote physical devices and environments which the operator cannot or should not control or experience directly. Particularly in reference to dealing with spacecraft and the oftentimes hazardous environments they inhabit, it is our hope that AR and VR technologies can be utilized to increase human safety and mission success by physically removing humans from those hazardous environments while virtually putting them right in the middle of those environments.
Virtual Technologies Trends in Education
ERIC Educational Resources Information Center
Martín-Gutiérrez, Jorge; Mora, Carlos Efrén; Añorbe-Díaz, Beatriz; González-Marrero, Antonio
2017-01-01
Virtual reality captures people's attention. This technology has been applied in many sectors such as medicine, industry, education, video games, or tourism. Perhaps its biggest area of interest has been leisure and entertainment. Regardless of the sector, the introduction of virtual or augmented reality had several constraints: it was expensive, it…
Augmented reality in dentistry: a current perspective.
Kwon, Ho-Beom; Park, Young-Seok; Han, Jung-Suk
2018-02-21
Augmented reality technology offers virtual information in addition to that of the real environment and thus opens new possibilities in various fields. The medical applications of augmented reality are generally concentrated on surgical specialties, including neurosurgery, laparoscopic surgery and plastic surgery. Augmented reality technology is also widely used in medical education and training. In dentistry, oral and maxillofacial surgery is the primary area of use, where dental implant placement and orthognathic surgery are the most frequent applications. Recent technological advancements are enabling new applications in restorative dentistry, orthodontics and endodontics. This review briefly summarizes the history, definitions, features, and components of augmented reality technology and discusses its applications and future perspectives in dentistry.
Augmented reality (AR) and virtual reality (VR) applied in dentistry.
Huang, Ta-Ko; Yang, Chi-Hsun; Hsieh, Yu-Hsin; Wang, Jen-Chyan; Hung, Chun-Cheng
2018-04-01
The OSCE is a reliable method for evaluating the preclinical examination of dental students, and the augmented reality simulator is well suited to this kind of assessment. This literature review traces recent developments in virtual reality (VR) and augmented reality (AR), from the history of dentistry to the progress of dental skills training. Where technology alone is lacking, clinicians must rely on additional devices to increase the success rate and decrease the risk of surgery. The development of tracking units has changed both surgical and educational practice. Clinical surgery is built on mature education, and VR and AR simultaneously affect skills-training lessons and navigation systems. More broadly, VR and AR are applied not only in dental training and surgery but are also improving many other fields of everyday life. Copyright © 2018. Published by Elsevier Taiwan.
Virtual Reality and Augmented Reality in Plastic Surgery: A Review.
Kim, Youngjun; Kim, Hannah; Kim, Yong Oock
2017-05-01
Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.
3D augmented reality with integral imaging display
NASA Astrophysics Data System (ADS)
Shen, Xin; Hua, Hong; Javidi, Bahram
2016-06-01
In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.
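As a rough illustration of the merging step described above (not the authors' pseudoscopic-to-orthoscopic pipeline), the sketch below alpha-composites a virtual elemental image array over a real-world elemental image array using a per-pixel transparency map. The array names, shapes, and blending rule are assumptions.

```python
# Hedged sketch: blend virtual and real elemental image arrays for an
# integral-imaging AR display. Inputs are assumed to be HxWx3 float arrays
# in [0, 1] that share the same elemental-image layout.
import numpy as np

def merge_elemental_images(real, virtual, alpha):
    """alpha: HxW map in [0, 1]; 1 = fully virtual, 0 = fully real."""
    alpha = alpha[..., np.newaxis]          # broadcast over RGB channels
    return (1.0 - alpha) * real + alpha * virtual

# Example with random placeholder data.
h, w = 480, 640
real = np.random.rand(h, w, 3)
virtual = np.random.rand(h, w, 3)
alpha = np.where(virtual.mean(axis=2) > 0.1, 0.7, 0.0)  # crude transparency map
merged = merge_elemental_images(real, virtual, alpha)
```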
Three-Dimensional Sensor Common Operating Picture (3-D Sensor COP)
2017-01-01
…created. Additionally, a 3-D model of the sensor itself can be created. Using these 3-D models, along with emerging virtual and augmented reality tools… (Report sections: Introduction; The 3-D Sensor COP; Virtual Sensor Placement; Conclusions; References.)
NASA Astrophysics Data System (ADS)
Gonizzi Barsanti, S.; Malatesta, S. G.; Lella, F.; Fanini, B.; Sala, F.; Dodero, E.; Petacco, L.
2018-05-01
Nowadays, one of the best ways to disseminate culture is the creation of virtual and augmented reality scenarios that supply museum visitors with a powerful, interactive tool, allowing them to learn sometimes difficult concepts in an easy, entertaining way. 3D models derived from reality-based techniques are nowadays used to preserve, document and restore historical artefacts. These digital contents are also powerful instruments for interactively communicating their significance to non-specialists, making it easier to understand concepts that are sometimes complicated or unclear. Virtual and Augmented Reality are surely valid tools for interacting with 3D models and a fundamental help in making culture more accessible to the wider public. These technologies can help museum curators adapt the cultural proposal and the information about the artefacts to different categories of visitors. They allow visitors to travel through space and time and have a great educational function, permitting information and concepts that could otherwise prove complicated to be explained in an easy and attractive way. The aim of this paper is to create a virtual scenario and an augmented reality app to recreate specific spaces in the Capitoline Museum in Rome as they were during Winckelmann's time, placing specific statues in their original 18th-century positions.
ERIC Educational Resources Information Center
Menorath, Darren; Antonczak, Laurent
2017-01-01
This paper examines the state of the art of mobile Augmented Reality (AR) and mobile Virtual Reality (VR) in relation to collaboration and professional practices in a creative digital environment and higher education. To support their discussion, the authors use a recent design-based research project named "Juxtapose," which explores…
Visual Environment for Designing Interactive Learning Scenarios with Augmented Reality
ERIC Educational Resources Information Center
Mota, José Miguel; Ruiz-Rube, Iván; Dodero, Juan Manuel; Figueiredo, Mauro
2016-01-01
Augmented Reality (AR) technology allows the inclusion of virtual elements on a vision of the actual physical environment for the creation of a mixed reality in real time. This kind of technology can be used in educational settings. However, current AR authoring tools present several drawbacks, such as the lack of a mechanism for tracking the…
Integrating Augmented Reality in Higher Education: A Multidisciplinary Study of Student Perceptions
ERIC Educational Resources Information Center
Delello, Julie A.; McWhorter, Rochell R.; Camp, Kerri M.
2015-01-01
Augmented reality (AR) is an emerging technology that blends physical objects with virtual reality. Through the integration of digital and print media, the gap between the "on and offline" worlds is bridged, radically shifting student-computer interaction in the classroom. This research examined the results of a multiple case study on the…
Pursuit of X-ray Vision for Augmented Reality
2012-01-01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez Anez, Francisco
This paper presents two development projects (STARMATE and VIRMAN) focused on supporting maintenance training. Both projects aim at specifying, designing, developing, and demonstrating prototypes that allow computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. Its objective is to create a computer tool for elaborating maintenance training courses and delivering training based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention while trying to follow the procedure, and they can be evaluated on the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Societies Technologies'. A consortium of five companies and one research institute shares their expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area of Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene generated by a computer, augmenting the reality with additional information. The user interface consists of see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers. The user has his hands free for performing the maintenance intervention and can navigate in the virtual world thanks to a voice recognition system and a virtual pointing device. The maintenance work is guided with audio instructions, and 2D and 3D information is displayed directly in the user's goggles. A position-tracking system allows 3D virtual models to be displayed in the positions of their real counterparts independently of the user's location. The user can create his own virtual environment, placing the required information wherever he wants. The STARMATE system is applicable to a large variety of real work situations.
Augmented reality for breast imaging.
Rancati, Alberto; Angrigiani, Claudio; Nava, Maurizio B; Catanuto, Giuseppe; Rocco, Nicola; Ventrice, Fernando; Dorr, Julio
2018-06-01
Augmented reality (AR) enables the superimposition of virtual reality reconstructions onto clinical images of a real patient, in real time. This allows visualization of internal structures through overlying tissues, thereby providing a virtual transparency view of surgical anatomy. AR has been applied to neurosurgery, which utilizes a relatively fixed space, frames, and bony references; these facilitate the relationship between virtual and real data. Augmented breast imaging (ABI) is described. Breast MRI studies for breast implant patients with seroma were performed using a Siemens 3T system with a body coil and a four-channel bilateral phased-array breast coil as the transmitter and receiver, respectively. Gadolinium was injected as a contrast agent (0.1 mmol/kg at 2 mL/s) using a programmable power injector. DICOM-formatted image data from 10 MRI cases of breast implant seroma and 10 MRI cases with T1-2 N0 M0 breast cancer were imported and transformed into augmented reality images. ABI demonstrated stereoscopic depth perception, focal point convergence, 3D cursor use, and joystick fly-through. ABI can improve clinical outcomes, providing an enhanced view of the structures to work on. It should be further studied to determine its utility in clinical practice.
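A minimal sketch of the import step mentioned above, assuming a directory of single-frame DICOM slices readable with pydicom. The directory path, sorting key, and normalization are assumptions; the study's actual MRI-to-AR conversion pipeline is not described in the abstract.

```python
# Hedged sketch: load a stack of DICOM MRI slices into a 3D volume that a
# downstream AR viewer could render. Paths and preprocessing are illustrative.
import glob
import numpy as np
import pydicom

def load_dicom_volume(directory):
    slices = [pydicom.dcmread(p) for p in glob.glob(f"{directory}/*.dcm")]
    # Sort by through-plane position (assumes ImagePositionPatient is present).
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array.astype(np.float32) for s in slices])
    # Normalize to [0, 1] for texture upload in an AR engine.
    volume -= volume.min()
    volume /= max(volume.max(), 1e-6)
    return volume

# volume = load_dicom_volume("breast_mri_case_01")  # hypothetical path
```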
See-through 3D technology for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Lee, Seungjae; Li, Gang; Jang, Changwon; Hong, Jong-Young
2017-06-01
Augmented reality has recently been attracting a lot of attention as one of the most spotlighted next-generation technologies. To move toward the realization of ideal augmented reality, we need to integrate 3D virtual information into the real world. This integration should not be noticeable to users, blurring the boundary between the virtual and real worlds. Thus, the ultimate device for augmented reality should reconstruct and superimpose 3D virtual information on the real world so that the two are not distinguishable, which is referred to as see-through 3D technology. Here, we introduce our previous research on combining see-through displays and 3D technologies using emerging optical combiners: holographic optical elements and index-matched optical elements. Holographic optical elements are volume gratings that have angular and wavelength selectivity. Index-matched optical elements are partially reflective elements that use a compensation element for index matching. Using these optical combiners, we could implement see-through 3D displays based on typical methodologies including integral imaging, digital holographic displays, multi-layer displays, and retinal projection. Some of these methods are expected to be optimized and customized for head-mounted or wearable displays. We conclude with demonstration and analysis of fundamental research toward head-mounted see-through 3D displays.
Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos
2014-06-05
Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling, specifically recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system was successful in demonstrating that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.
Vision-based augmented reality system
NASA Astrophysics Data System (ADS)
Chen, Jing; Wang, Yongtian; Shi, Qi; Yan, Dayuan
2003-04-01
The most promising aspect of augmented reality lies in its ability to integrate the virtual world of the computer with the real world of the user. Namely, users can interact with real-world subjects and objects directly. This paper presents an experimental augmented reality system with a video see-through head-mounted device to display visual objects as if they were lying on the table together with real objects. In order to overlay virtual objects on the real world at the correct position and orientation, accurate calibration and registration are most important. A vision-based method is used to estimate the CCD camera's external parameters by tracking 4 known points with different colors. It achieves sufficient accuracy for non-critical applications such as gaming, annotation and so on.
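The abstract describes estimating the camera's external parameters from four known, color-coded points. A standard way to do this today is a perspective-n-point solve, sketched below with OpenCV as a hedged stand-in for the authors' method, assuming the four points are coplanar (e.g., on the table) and that their pixel positions have already been found by color segmentation. All coordinates and intrinsics are placeholder values.

```python
# Hedged sketch: camera pose from 4 known coplanar reference points, in the
# spirit of the vision-based registration described above (not the authors'
# exact algorithm). Point coordinates and intrinsics are illustrative.
import numpy as np
import cv2

# 3D positions of the 4 colored markers on the table, in millimetres.
object_points = np.array([[0, 0, 0], [200, 0, 0], [200, 150, 0], [0, 150, 0]],
                         dtype=np.float64)
# Their detected pixel coordinates (assumed output of a color tracker).
image_points = np.array([[320, 240], [520, 245], [515, 400], [318, 395]],
                        dtype=np.float64)
# Assumed pinhole intrinsics (fx, fy, cx, cy) with no lens distortion.
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)   # rotation of the camera with respect to the table
print("camera translation (mm):", tvec.ravel())
```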
Performance Of The IEEE 802.15.4 Protocol As The Marker Of Augmented Reality In Museum
NASA Astrophysics Data System (ADS)
Kurniawan Saputro, Adi; Sumpeno, Surya; Hariadi, Mochamad
2018-04-01
A museum is a place that keeps historic objects and serves as a historical education center to introduce the nation's culture. Utilizing technology in a museum as part of a smart city is a challenge. The Internet of Things (IoT) is a technological advance in information and communication technology (ICT) that can be applied in the museum. Current ICT development is not only a transmission medium; Augmented Reality technology is also being developed. Currently, Augmented Reality technology creates virtual objects in the real world using markers or images. In this study, the researchers used signals to make virtual objects appear in the real world, using the IEEE 802.15.4 protocol in place of the Augmented Reality marker. RSSI and triangulation are used for microlocation of the AR objects as a substitute for markers. The result is that the performance of the Wireless Sensor Network can support data transmission in the museum. Line-of-sight (LOS) tests at a distance of 15 meters with a 1000 ms delay found a 1.4% error rate, and non-line-of-sight (NLOS) tests a 2.3% error rate. It can therefore be concluded that IoT technology using wireless sensor network signals as a replacement for Augmented Reality markers can be used in the museum.
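To make the RSSI-based localization idea concrete, here is a hedged sketch of the two usual steps: a log-distance path-loss model to turn RSSI into range, then least-squares trilateration against three anchor nodes. The path-loss exponent, reference RSSI, anchor positions, and sample readings are assumptions, not values from the study.

```python
# Hedged sketch: estimate a visitor's position from IEEE 802.15.4 RSSI
# readings so that an AR object can be anchored without a printed marker.
# Model parameters and anchor layout are illustrative assumptions.
import numpy as np

RSSI_AT_1M = -45.0   # assumed reference RSSI at 1 m (dBm)
PATH_LOSS_N = 2.5    # assumed indoor path-loss exponent

def rssi_to_distance(rssi_dbm):
    """Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 n))."""
    return 10 ** ((RSSI_AT_1M - rssi_dbm) / (10 * PATH_LOSS_N))

def trilaterate(anchors, distances):
    """Linear least-squares position fix from >= 3 anchors (x, y) and ranges."""
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = [(0.0, 0.0), (15.0, 0.0), (0.0, 15.0)]   # sensor node positions (m)
rssi = [-62.0, -70.0, -58.0]                       # sample readings (dBm)
print(trilaterate(anchors, [rssi_to_distance(r) for r in rssi]))
```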
Applying Augmented Reality in practical classes for engineering students
NASA Astrophysics Data System (ADS)
Bazarov, S. E.; Kholodilin, I. Yu; Nesterov, A. S.; Sokhina, A. V.
2017-10-01
In this article, an Augmented Reality application for teaching engineering students of electrical and technological specialties is introduced. In order to increase students' motivation for learning and their independence, new Augmented Reality practical guidelines were developed for use in practical classes. During the application development, the authors used software such as Unity 3D and Vuforia. The Augmented Reality content consists of 3D models, images and animations, which are superimposed on real objects, helping students to study specific tasks. A user who has a smartphone, a tablet PC, or Augmented Reality glasses can visualize on-screen virtual objects added to a real environment. To analyze the current situation in higher education (the learners' interest in studying, their satisfaction with the educational process, and the impact of the Augmented Reality application on students), a questionnaire was developed and offered to students; the study involved 24 learners.
Augmenting breath regulation using a mobile driven virtual reality therapy framework.
Abushakra, Ahmad; Faezipour, Miad
2014-05-01
This paper presents a conceptual framework of a virtual reality therapy to assist individuals, especially lung cancer patients or those with breathing disorders, in regulating their breathing through real-time analysis of respiration movements using a smartphone. Virtual reality technology is an attractive means for medical simulations and treatment, particularly for patients with cancer. The theories, methodologies and approaches, and real-world dynamic contents for all the components of this virtual reality therapy (VRT), delivered via a conceptual framework using the smartphone, will be discussed. The architecture and technical aspects of the offshore platform of the virtual environment will also be presented.
Application of Virtual, Augmented, and Mixed Reality to Urology.
Hamacher, Alaric; Kim, Su Jin; Cho, Sung Tae; Pardeshi, Sunil; Lee, Seung Hyun; Eun, Sung-Jong; Whangbo, Taeg Keun
2016-09-01
Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected.
ERIC Educational Resources Information Center
Ong, Alex
2010-01-01
The use of augmented reality (AR) tools, where virtual objects such as tables and graphs can be displayed and be interacted with in real scenes created from imaging devices, in mainstream school curriculum is uncommon, as they are potentially costly and sometimes bulky. Thus, such learning tools are mainly applied in tertiary institutions, such as…
NASA Astrophysics Data System (ADS)
Soler, Luc; Marescaux, Jacques
2006-04-01
Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics belong to the most revolutionary ones. Our work aims at setting up new techniques for detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems making tumor resection or treatment easier with the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on the same patient, virtually before the intervention and in reality during the surgical procedure thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide an efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to a better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal procedure without errors, which will then be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens
NASA Astrophysics Data System (ADS)
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-09-01
In the fields of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress in recent years and have also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted displays, which allow one to embed virtual objects into the real surroundings, leading to a Mixed Reality (MR) experience. In such an environment, digital and real objects do not only coexist, but are moreover able to interact with each other in real time. These concepts can be used to merge human perception of reality with digitally visualized sensor data, thereby making the invisible visible. As a first example, in this paper we introduce, alongside the basic idea of this column, an MR experiment in thermodynamics for a laboratory course for freshman students in physics or other science and engineering subjects that uses physical data from mobile devices for analyzing and displaying physical phenomena to students.
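As a small, generic illustration of "making the invisible visible" with sensor data (not the authors' HoloLens implementation), the sketch below maps an array of temperature readings to RGBA colors that an MR display could overlay on the apparatus. The temperature range, colormap, and rod profile are assumptions.

```python
# Hedged sketch: turn thermocouple/IR readings into a false-color overlay
# texture for a mixed-reality display. Range, colormap, and data are assumed.
import numpy as np
from matplotlib import cm, colors

def temperature_to_rgba(temps_c, t_min=20.0, t_max=80.0):
    """Map temperatures (deg C) to RGBA values in [0, 1] using a fixed colormap."""
    norm = colors.Normalize(vmin=t_min, vmax=t_max, clip=True)
    return cm.inferno(norm(np.asarray(temps_c)))

# Hypothetical 1D temperature profile along a metal rod (deg C).
rod_profile = np.linspace(75.0, 22.0, num=64)
overlay = temperature_to_rgba(rod_profile)   # shape (64, 4), one texture row
```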
Augmented kinematic feedback from haptic virtual reality for dental skill acquisition.
Suebnukarn, Siriwan; Haddawy, Peter; Rhienmora, Phattanapon; Jittimanee, Pannapa; Viratket, Piyanuch
2010-12-01
We have developed a haptic virtual reality system for dental skill training. In this study we examined whether several kinds of kinematic information about the movement provided by the system could supplement knowledge of results (KR) in dental skill acquisition. The kinematic variables examined involved force utilization (F) and mirror view (M). This created three experimental conditions that received augmented kinematic feedback (F, M, FM) and one control condition that did not (KR-only). Thirty-two dental students were randomly assigned to four groups. Their task was to perform access opening on the upper first molar with the haptic virtual reality system. An acquisition session consisted of two days of ten trials of practice in which augmented kinematic feedback was provided for the appropriate experimental conditions after each trial. One week later, a retention test consisting of two trials without augmented feedback was completed. The results showed that the augmented kinematic feedback groups had larger mean performance scores than the KR-only group on Day 1 of the acquisition session and in the retention session (ANOVA, p<0.05). The apparent differences among feedback groups were not significant on Day 2 of the acquisition session (ANOVA, p>0.05). The trends in the acquisition and retention sessions suggest that augmented kinematic feedback can enhance performance earlier in skill acquisition and in retention.
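A hedged sketch of the group comparison reported above, a one-way ANOVA across the F, M, FM, and KR-only conditions using scipy.stats.f_oneway. The score arrays are invented placeholders, not study data, and a real analysis would add post-hoc tests.

```python
# Hedged sketch: one-way ANOVA comparing performance scores of the three
# augmented-feedback groups (F, M, FM) and the KR-only control, mirroring
# the analysis named in the abstract. Scores are placeholders.
import numpy as np
from scipy import stats

scores = {
    "F":       np.array([72, 75, 70, 78, 74, 71, 76, 73]),
    "M":       np.array([70, 74, 69, 75, 72, 71, 73, 70]),
    "FM":      np.array([78, 80, 76, 82, 79, 77, 81, 78]),
    "KR-only": np.array([64, 66, 62, 68, 65, 63, 67, 64]),
}

f_stat, p_value = stats.f_oneway(*scores.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests a group difference
```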
Frames of Reference in Mobile Augmented Reality Displays
ERIC Educational Resources Information Center
Mou, Weimin; Biocca, Frank; Owen, Charles B.; Tang, Arthur; Xiao, Fan; Lim, Lynette
2004-01-01
In 3 experiments, the authors investigated spatial updating in augmented reality environments. Participants learned locations of virtual objects on the physical floor. They were turned to appropriate facing directions while blindfolded before making pointing judgments (e.g., "Imagine you are facing X. Point to Y"). Experiments manipulated the…
Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders
Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe
2015-01-01
Augmented Reality is a new technological system that allows virtual contents to be introduced into the real world so that they run within the same representation and, in real time, enhance the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to the physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has given preliminary evidence of being a useful tool due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that make Augmented Reality a new technique useful for psychology. PMID:26339283
Khor, Wee Sim; Baker, Benjamin; Amin, Kavit; Chan, Adrian; Patel, Ketan; Wong, Jason
2016-12-01
The continuing enhancement of the surgical environment in the digital age has led to a number of innovations being highlighted as potential disruptive technologies in the surgical workplace. Augmented reality (AR) and virtual reality (VR) are rapidly becoming increasingly available, accessible and, importantly, affordable, hence their application into healthcare to enhance the medical use of data is certain. Whether it relates to anatomy, intraoperative surgery, or post-operative rehabilitation, applications are already being investigated for their role in the surgeon's armamentarium. Here we provide an introduction to the technology and the potential areas of development in the surgical arena.
A telescope with augmented reality functions
NASA Astrophysics Data System (ADS)
Hou, Qichao; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian
2016-10-01
This study introduces a telescope with virtual reality (VR) and augmented reality (AR) functions. In this telescope, information on the micro-display screen is integrated into the reticle of the telescope through a beam splitter and is then received by the observer. The design and analysis of the telescope optical system with AR and VR capability are accomplished, and the opto-mechanical structure is designed. Finally, a proof-of-concept prototype is fabricated and demonstrated. The telescope has an exit pupil diameter of 6 mm at an eye relief of 19 mm, a 6° field of view, 5 to 8 times visual magnification, and a 30° field of view of the virtual image.
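Two textbook paraxial relations connect specifications like those quoted above: the apparent (virtual-image) field is roughly the visual magnification times the true field, which at 5x and 6° gives about 30°, consistent with the quoted virtual-image field at the low end of the 5-8x range, and the exit pupil equals the objective aperture divided by the magnification. The sketch below only evaluates these standard approximations; the implied aperture is an inference, not a figure from the paper.

```python
# Hedged sketch: simple paraxial relations for a magnifying AR telescope.
# These are standard small-angle approximations, not the paper's design equations.
def apparent_fov_deg(true_fov_deg, magnification):
    """Apparent (virtual-image) field ~ magnification * true field (small angles)."""
    return magnification * true_fov_deg

def objective_aperture_mm(exit_pupil_mm, magnification):
    """Exit pupil = objective aperture / magnification, rearranged."""
    return exit_pupil_mm * magnification

print(apparent_fov_deg(6.0, 5))        # ~30 deg, matching the quoted virtual-image FOV
print(objective_aperture_mm(6.0, 5))   # ~30 mm implied aperture at 5x (an inference)
```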
NASA Astrophysics Data System (ADS)
Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang
Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on Computer Graphics methodologies, Virtual Reality and Augmented Reality benefited indirectly from this; however, there is no large scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent a widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.
On Location Learning: Authentic Applied Science with Networked Augmented Realities
NASA Astrophysics Data System (ADS)
Rosenbaum, Eric; Klopfer, Eric; Perry, Judy
2007-02-01
The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is played across a university campus where players take on the roles of doctors, medical technicians, and public health experts to contain a disease outbreak. Players can interact with virtual characters and employ virtual diagnostic tests and medicines. They are challenged to identify the source and prevent the spread of an infectious disease that can spread among real and/or virtual characters according to an underlying model. In this paper, we report on data from three high school classes who played the game. We investigate students' perception of the authenticity of the game in terms of their personal embodiment in the game, their experience playing different roles, and their understanding of the dynamic model underlying the game.
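The game spreads disease "according to an underlying model," but the abstract does not specify which one. As a plausible stand-in, the sketch below runs a discrete-time compartmental SIR simulation of the kind often used in participatory simulations; the population size, rates, and time step are all invented.

```python
# Hedged sketch: a discrete-time SIR model of the kind an outbreak game could
# use to drive infections among real and virtual characters. The published
# game's actual model and parameters are not specified in the abstract.
def simulate_sir(population, initial_infected, beta, gamma, steps):
    s = float(population - initial_infected)   # susceptible
    i = float(initial_infected)                # infected
    r = 0.0                                    # recovered
    history = []
    for _ in range(steps):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Example: 100 characters, 2 initially infected, assumed per-step rates.
for step, (s, i, r) in enumerate(simulate_sir(100, 2, beta=0.4, gamma=0.1, steps=30)):
    if step % 10 == 0:
        print(f"t={step:2d}  S={s:5.1f}  I={i:5.1f}  R={r:5.1f}")
```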
An Interactive Augmented Reality Implementation of Hijaiyah Alphabet for Children Education
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Akbar, F.; Syahputra, M. F.; Budiman, M. A.; Hizriadi, A.
2018-03-01
The Hijaiyah alphabet comprises the letters used in the Qur'an. An attractive and exciting learning process for the Hijaiyah alphabet is necessary for children. One alternative for creating such a learning process is to develop it into a mobile application using augmented reality technology. Augmented reality is a technology that combines two-dimensional or three-dimensional virtual objects with the actual three-dimensional environment and projects them in real time. The application aims to foster children's interest in learning the Hijaiyah alphabet. This application uses a smartphone and a marker as the medium. It was built using Unity and the augmented reality library Vuforia, with Blender as the 3D object modeling software. The output of this research is a learning application for the Hijaiyah letters using augmented reality. It is used as follows: first, place a marker that has been registered and printed; second, the smartphone camera tracks the marker. If the marker is invalid, the user should repeat the tracking process. If the marker is valid and identified, the application projects the Hijaiyah alphabet objects in three-dimensional form. Lastly, the user can learn and understand the shape and pronunciation of the Hijaiyah alphabet by touching the virtual button on the marker.
NASA Astrophysics Data System (ADS)
Starodubtsev, Illya
2017-09-01
The paper describes the implementation of a gesture-based system for interacting with virtual objects. It also discusses common problems of interaction with virtual objects and specific requirements for virtual and augmented reality interfaces.
Advanced Visual and Instruction Systems for Maintenance Support (AVIS-MS)
2006-12-01
Hayashi , "Augmentable Reality: Situated Communication through Physical and Digital Spaces," Proc. 2nd Int’l Symp. Wearable Computers, IEEE CS Press...H. Ohno , "An Optical See-through Display for Mutual Occlusion of Real and Virtual Environments," Proc. Int’l Symp. Augmented Reality 2000 (ISARO0
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.
The Design of Immersive English Learning Environment Using Augmented Reality
ERIC Educational Resources Information Center
Li, Kuo-Chen; Chen, Cheng-Ting; Cheng, Shein-Yung; Tsai, Chung-Wei
2016-01-01
The study uses augmented reality (AR) technology to integrate virtual objects into the real learning environment for language learning. The English AR classroom is constructed using the system prototyping method and evaluated by semi-structured in-depth interviews. According to the flow theory by Csikszentmihalyi in 1975 along with the immersive…
Enhancing and Transforming Global Learning Communities with Augmented Reality
ERIC Educational Resources Information Center
Frydenberg, Mark; Andone, Diana
2018-01-01
Augmented and virtual reality applications bring new insights to real world objects and scenarios. This paper shares research results of the TalkTech project, an ongoing study investigating the impact of learning about new technologies as members of global communities. This study shares results of a collaborative learning project about augmented…
Understanding the Conics through Augmented Reality
ERIC Educational Resources Information Center
Salinas, Patricia; Pulido, Ricardo
2017-01-01
This paper discusses the production of a digital environment to foster the learning of conics through augmented reality. The name conic refers to curves obtained by the intersection of a plane with a right circular conical surface. The environment gives students the opportunity to interact with the cone and the plane as virtual objects in real…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayes, Birchard P; Michel, Kelly D; Few, Douglas A
From stereophonic, positional sound to high-definition imagery that is crisp and clean, high fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments creating a mixed reality where relevant data and information augments the real or actual experience in real-time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers a new and innovative design information verification inspection capability, evaluation accuracy, and information gathering capability for nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.
Magic cards: a new augmented-reality approach.
Demuynck, Olivier; Menendez, José Manuel
2013-01-01
Augmented reality (AR) commonly uses markers for detection and tracking. Such multimedia applications associate each marker with a virtual 3D model stored in the memory of the camera-equipped device running the application. Application users are limited in their interactions, which require knowing how to design and program 3D objects. This generally prevents them from developing their own entertainment AR applications. The Magic Cards application solves this problem by offering an easy way to create and manage an unlimited number of virtual objects that are encoded on special markers.
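The Magic Cards idea of mapping special markers to stored virtual models can be illustrated with a generic fiducial-marker pipeline. The sketch below uses OpenCV's ArUco module as a stand-in (an assumption, since the article's own marker encoding is not described here) together with a hypothetical marker-id-to-model table.

```python
# Hedged sketch: generic marker-based AR lookup, in the spirit of associating
# each detected card/marker with a stored 3D model. Uses the OpenCV ArUco API
# as found in opencv-contrib-python 4.6 and earlier; newer releases wrap the
# same functionality in cv2.aruco.ArucoDetector. The model table is hypothetical.
import cv2

MODEL_TABLE = {0: "teapot.obj", 1: "dragon.obj", 2: "castle.obj"}  # assumed mapping

def detect_models(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    found = []
    if ids is not None:
        for marker_id, quad in zip(ids.flatten(), corners):
            model = MODEL_TABLE.get(int(marker_id))
            if model:
                found.append((model, quad))  # quad: 4 image corners for pose/overlay
    return found
```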
Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality
NASA Astrophysics Data System (ADS)
Wolfe, Christopher; Smith, J. David; Phillips, W. Greg; Graham, T. C. Nicholas
Augmented reality systems often involve collaboration among groups of people. While there are numerous toolkits that aid the development of such augmented reality groupware systems (e.g., ARToolkit and Groupkit), there remains an enormous gap between the specification of an AR groupware application and its implementation. In this chapter, we present Fiia, a toolkit which simplifies the development of collaborative AR applications. Developers specify the structure of their applications using the Fiia modeling language, which abstracts details of networking and provides high-level support for specifying adapters between the physical and virtual world. The Fiia.Net runtime system then maps this conceptual model to a runtime implementation. We illustrate Fiia via Raptor, an augmented reality application used to help small groups collaboratively prototype video games.
Optical methods for enabling focus cues in head-mounted displays for virtual and augmented reality
NASA Astrophysics Data System (ADS)
Hua, Hong
2017-05-01
Developing head-mounted displays (HMDs) that offer uncompromised optical pathways to both digital and physical worlds without encumbrance and discomfort confronts many grand challenges, both from technological perspectives and human factors. Among the many challenges, minimizing visual discomfort is one of the key obstacles. One of the key contributing factors to visual discomfort is the lack of the ability to render proper focus cues in HMDs to stimulate natural eye accommodation responses, which leads to the well-known accommodation-convergence cue discrepancy problem. In this paper, I will provide a summary of the various optical approaches toward enabling focus cues in HMDs for both virtual reality (VR) and augmented reality (AR).
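The accommodation-convergence discrepancy mentioned above is commonly quantified in diopters as the difference between the vergence distance of the rendered object and the fixed focal distance of the display. The sketch below evaluates that standard definition; the example distances are illustrative and not taken from the paper.

```python
# Hedged sketch: quantify vergence-accommodation conflict for an HMD whose
# single focal plane sits at a fixed distance. Standard textbook definition.
def va_conflict_diopters(virtual_object_distance_m, display_focal_distance_m):
    """Conflict (D) = |1/d_vergence - 1/d_accommodation|."""
    return abs(1.0 / virtual_object_distance_m - 1.0 / display_focal_distance_m)

# Example: object rendered at 0.5 m on a display whose focal plane is at 2 m.
print(va_conflict_diopters(0.5, 2.0))   # 1.5 D of conflict
```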
Mass production of holographic transparent components for augmented and virtual reality applications
NASA Astrophysics Data System (ADS)
Russo, Juan Manuel; Dimov, Fedor; Padiyar, Joy; Coe-Sullivan, Seth
2017-06-01
Diffractive optics such as holographic optical elements (HOEs) can provide transparent and narrow band components with arbitrary incident and diffracted angles for near-to-eye commercial electronic products for augmented reality (AR), virtual reality (VR), and smart glass applications. In this paper, we will summarize the operational parameters and general optical geometries relevant for near-to-eye displays, the holographic substrates available for these applications, and their performance characteristics and ease of manufacture. We will compare the holographic substrates available in terms of fabrication, manufacturability, and end-user performance characteristics. Luminit is currently emplacing the manufacturing capacity to serve this market, and this paper will discuss the capabilities and limitations of this unique facility.
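The angular and wavelength selectivity of the volume-grating HOEs mentioned above follows from the Bragg condition, stated here as a standard reference formula from general optics rather than a result of this paper:

```latex
% Bragg condition for a volume phase grating (first diffraction order):
%   n        : average refractive index of the holographic medium
%   \Lambda  : grating period
%   \theta_B : Bragg angle inside the medium, measured from the grating planes
%   \lambda_0: vacuum wavelength
\[ 2\, n\, \Lambda \,\sin\theta_B = \lambda_0 \]
% Light arriving away from (\theta_B, \lambda_0) is largely transmitted rather
% than diffracted, which is what makes HOE combiners transparent off-Bragg
% while still redirecting the display light.
```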
Corrêa, Ana Grasielle Dionísio; de Assis, Gilda Aparecida; do Nascimento, Marilena; de Deus Lopes, Roseli
2017-04-01
Augmented Reality musical software (GenVirtual) is a technology that primarily allows users to develop music activities for rehabilitation. This study aimed to analyse the perceptions of health care professionals regarding the clinical utility of GenVirtual. A second objective was to identify improvements to the GenVirtual software and similar technologies. Music therapists, occupational therapists, physiotherapists and speech and language therapists who assist people with physical and cognitive disabilities were enrolled in three focus groups. The quantitative and qualitative data were analysed through inductive thematic analysis. Three main themes were identified: the use of GenVirtual in health care areas; opportunities for realistic application of GenVirtual; and limitations in the use of GenVirtual. The registration units identified were: motor stimulation, cognitive stimulation, verbal learning, recreation activity, musicality, accessibility, motivation, sonic accuracy, interference of lighting, poor sound, children and adults. This research suggested that GenVirtual is a complementary tool to conventional clinical practice and has great potential for the motor and cognitive rehabilitation of children and adults. Implications for Rehabilitation: Gaining health professionals' perceptions of the Augmented Reality musical game (GenVirtual) gives valuable information as to the clinical utility of the software. GenVirtual was perceived as a tool that could enhance the motor and cognitive rehabilitation process. GenVirtual was viewed as a tool that could enhance clinical practice and communication among various agencies, but it was suggested that it should be used with caution to avoid confusion and the replacement of important services.
ERIC Educational Resources Information Center
Harley, Jason M.; Poitras, Eric G.; Jarrell, Amanda; Duffy, Melissa C.; Lajoie, Susanne P.
2016-01-01
Research on the effectiveness of augmented reality (AR) on learning exists, but there is a paucity of empirical work that explores the role that positive emotions play in supporting learning in such settings. To address this gap, this study compared undergraduate students' emotions and learning outcomes during a guided historical tour using mobile…
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
Nifakos, Sokratis; Tomson, Tanja; Zary, Nabil
2014-01-01
Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that, through the use of Augmented Reality technology, could make use of the real physical context and thereby enrich the educational process of antibiotics prescription. The objective is to investigate which type of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research methodology, composed of the following main steps: problem analysis, investigation of the information that should be visualized for the training session, and finally the involvement of the end users in the development and evaluation processes of the prototype. Results. Two of the most important aspects of the antibiotic prescription process to represent in an augmented reality application are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context, such as drug boxes, and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on the decrease of antibiotic resistance.
Incorporating Technology in Teaching Musical Instruments
ERIC Educational Resources Information Center
Prodan, Angelica
2017-01-01
After discussing some of the drawbacks of using Skype for long distance music lessons, Angelica Prodan describes three different types of Artificial Reality (Virtual Reality, Augmented Reality and Mixed or Merged Reality). She goes on to describe the beneficial applications of technology, with results otherwise impossible to achieve in areas such…
Sun, Guo-Chen; Wang, Fei; Chen, Xiao-Lei; Yu, Xin-Guang; Ma, Xiao-Dong; Zhou, Ding-Biao; Zhu, Ru-Yuan; Xu, Bai-Nan
2016-12-01
The utility of virtual and augmented reality based on functional neuronavigation and intraoperative magnetic resonance imaging (MRI) for glioma surgery has not been previously investigated. The study population consisted of 79 glioma patients and 55 control subjects. Preoperatively, the lesion and related eloquent structures were visualized by diffusion tensor tractography and blood oxygen level-dependent functional MRI. Intraoperatively, microscope-based functional neuronavigation was used to integrate the reconstructed eloquent structures with the real head and brain, which enabled safe resection of the lesion. Intraoperative MRI was used to assess brain shift during the surgical process and provided quality control during surgery. The control group underwent surgery guided by anatomic neuronavigation. Virtual and augmented reality protocols based on functional neuronavigation and intraoperative MRI provided useful information for performing tailored and optimized surgery. Complete resection was achieved in 55 of 79 (69.6%) glioma patients and 20 of 55 (36.4%) control subjects, with average resection rates of 95.2% ± 8.5% and 84.9% ± 15.7%, respectively. Both the complete resection rate and the average extent of resection differed significantly between the 2 groups (P < 0.01). Postoperatively, the rate of preservation of neural functions (motor, visual field, and language) was lower in controls than in glioma patients at 2 weeks and 3 months (P < 0.01). Combining virtual and augmented reality based on functional neuronavigation and intraoperative MRI can facilitate resection of gliomas involving eloquent areas. Copyright © 2016 Elsevier Inc. All rights reserved.
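The abstract does not state which statistical test produced the P < 0.01 result; as a hedged illustration of the arithmetic behind the comparison of complete-resection rates (55/79 vs 20/55), a two-proportion z-test gives a consistent picture:

```python
# Illustrative sketch: comparing the complete-resection rates reported above
# (55/79 vs 20/55) with a two-proportion z-test. The paper does not specify its
# statistical test; this only shows the arithmetic behind such a comparison.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

complete = np.array([55, 20])   # complete resections: AR/neuronavigation group, control group
total = np.array([79, 55])      # patients per group

stat, p_value = proportions_ztest(complete, total)
print(f"rates: {complete / total}")          # ~0.696 vs ~0.364
print(f"z = {stat:.2f}, p = {p_value:.4f}")  # p well below 0.01, consistent with the abstract
```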
Using visuo-kinetic virtual reality to induce illusory spinal movement: the MoOVi Illusion.
Harvie, Daniel S; Smith, Ross T; Hunter, Estin V; Davis, Miles G; Sterling, Michele; Moseley, G Lorimer
2017-01-01
Illusions that alter perception of the body provide novel opportunities to target brain-based contributions to problems such as persistent pain. One example of this, mirror therapy, uses vision to augment perceived movement of a painful limb to treat pain. Since mirrors can't be used to induce augmented neck or other spinal movement, we aimed to test whether such an illusion could be achieved using virtual reality, in advance of testing its potential therapeutic benefit. We hypothesised that perceived head rotation would depend on visually suggested movement. In a within-subjects repeated measures experiment, 24 healthy volunteers performed neck movements to 50° of rotation, while a virtual reality system delivered corresponding visual feedback that was offset by a factor of 50%-200% (the Motor Offset Visual Illusion, MoOVi), thus simulating more or less movement than that actually occurring. At 50° of real-world head rotation, participants pointed in the direction that they perceived they were facing. The discrepancy between actual and perceived direction was measured and compared between conditions. The impact of including multisensory (auditory and visual) feedback, the presence of a virtual body reference, and the use of 360° immersive virtual reality with and without three-dimensional properties, was also investigated. Perception of head movement was dependent on visual-kinaesthetic feedback (p = 0.001, partial eta squared = 0.17). That is, altered visual feedback caused a kinaesthetic drift in the direction of the visually suggested movement. The magnitude of the drift was not moderated by secondary variables such as the addition of illusory auditory feedback, the presence of a virtual body reference, or three-dimensionality of the scene. Virtual reality can be used to augment perceived movement and body position, such that one can perform a small movement, yet perceive a large one. The MoOVi technique tested here has clear potential for assessment and therapy of people with spinal pain.
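A minimal sketch of the gain manipulation described above, with a hypothetical pointing bias (the 0.3 factor is invented purely for illustration and is not a result of the study):

```python
# Illustrative sketch (not the study's code): how a visual gain in the 50%-200%
# range offsets the head rotation shown in the headset, and how kinaesthetic drift
# could be computed from a pointing response.

def displayed_rotation(actual_deg: float, visual_gain: float) -> float:
    """Rotation rendered in the virtual scene for a given real head rotation."""
    return actual_deg * visual_gain

def kinaesthetic_drift(perceived_deg: float, actual_deg: float) -> float:
    """Signed error between where the participant points and where they actually face."""
    return perceived_deg - actual_deg

actual = 50.0                      # real-world head rotation used in the study (degrees)
for gain in (0.5, 1.0, 1.5, 2.0):  # offsets spanning the 50%-200% range
    shown = displayed_rotation(actual, gain)
    # hypothetical pointing response biased toward the visually suggested movement
    perceived = actual + 0.3 * (shown - actual)
    print(f"gain {gain:.1f}: shown {shown:5.1f} deg, drift {kinaesthetic_drift(perceived, actual):+.1f} deg")
```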
The virtual mirror: a new interaction paradigm for augmented reality environments.
Bichlmeier, Christoph; Heining, Sandro Michael; Feuerstein, Marco; Navab, Nassir
2009-09-01
Medical augmented reality (AR) has been widely discussed within the medical imaging as well as computer-aided surgery communities. Different systems for exemplary medical applications have been proposed, and some of them have produced promising results. One major issue still hindering AR technology from being regularly used in medical applications is the interaction between the physician and the superimposed 3-D virtual data. Classical interaction paradigms, for instance keyboard and mouse, are not adequate for interacting with visualized medical 3-D imaging data in an AR environment. This paper introduces the concept of a tangible/controllable Virtual Mirror for medical AR applications. This concept intuitively augments the direct view of the surgeon with all desired views on volumetric medical imaging data registered with the operation site, without moving around the operating table or displacing the patient. We selected two medical procedures to demonstrate and evaluate the potential of the Virtual Mirror for the surgical workflow. Results confirm the intuitiveness of this new paradigm and its perceptive advantages for AR-based computer-aided interventions.
Augmented Reality-Guided Lumbar Facet Joint Injections.
Agten, Christoph A; Dennler, Cyrill; Rosskopf, Andrea B; Jaberg, Laurenz; Pfirrmann, Christian W A; Farshad, Mazda
2018-05-08
The aim of this study was to assess feasibility and accuracy of augmented reality-guided lumbar facet joint injections. A spine phantom completely embedded in hardened opaque agar with 3 ring markers was built. A 3-dimensional model of the phantom was uploaded to an augmented reality headset (Microsoft HoloLens). Two radiologists independently performed 20 augmented reality-guided and 20 computed tomography (CT)-guided facet joint injections each: for each augmented reality-guided injection, the hologram was manually aligned with the phantom container using the ring markers. The radiologists targeted the virtual facet joint and tried to place the needle tip in the holographic joint space. Computed tomography was performed after each needle placement to document final needle tip position. Time needed from grabbing the needle to final needle placement was measured for each simulated injection. An independent radiologist rated images of all needle placements in a randomized order blinded to modality (augmented reality vs CT) and performer as perfect, acceptable, incorrect, or unsafe. Accuracy and time to place needles were compared between augmented reality-guided and CT-guided facet joint injections. In total, 39/40 (97.5%) of augmented reality-guided needle placements were either perfect or acceptable compared with 40/40 (100%) CT-guided needle placements (P = 0.5). One augmented reality-guided injection missed the facet joint space by 2 mm. No unsafe needle placements occurred. Time to final needle placement was substantially faster with augmented reality guidance (mean 14 ± 6 seconds vs 39 ± 15 seconds, P < 0.001 for both readers). Augmented reality-guided facet joint injections are feasible and accurate without potentially harmful needle placement in an experimental setting.
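The abstract does not name the statistical test used for the timing comparison; a hedged re-computation of the reported summary statistics (14 ± 6 s vs 39 ± 15 s, 20 placements per modality per reader) with a Welch t-test reproduces the reported order of significance:

```python
# Illustrative sketch: comparing needle-placement times (mean 14 ± 6 s with AR
# guidance vs 39 ± 15 s with CT guidance) using a Welch t-test on summary
# statistics. Sample sizes of 20 per modality follow the study design above;
# the abstract does not state which test the authors actually used.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=14.0, std1=6.0, nobs1=20,    # augmented reality guidance
    mean2=39.0, std2=15.0, nobs2=20,   # CT guidance
    equal_var=False,                   # Welch's correction for unequal variances
)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")  # p << 0.001, consistent with the abstract
```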
Augmented reality for anatomical education.
Thomas, Rhys Gethin; John, Nigel William; Delieu, John Michael
2010-03-01
The use of Virtual Environments has been widely reported as a method of teaching anatomy. Generally, such environments only convey the shape of the anatomy to the student. We present the Bangor Augmented Reality Education Tool for Anatomy (BARETA), a system that combines Augmented Reality (AR) technology with models produced using Rapid Prototyping (RP) technology, to provide the student with stimulation for touch as well as sight. The principal aims of this work were to provide an interface more intuitive than a mouse and keyboard, and to evaluate such a system as a viable supplement to traditional cadaver-based education.
Magnetosensitive e-skins with directional perception for augmented reality
Cañón Bermúdez, Gilbert Santiago; Karnaushenko, Dmitriy D.; Karnaushenko, Daniil; Lebanov, Ana; Bischoff, Lothar; Kaltenbrunner, Martin; Fassbender, Jürgen; Schmidt, Oliver G.; Makarov, Denys
2018-01-01
Electronic skins equipped with artificial receptors are able to extend our perception beyond the modalities that have naturally evolved. These synthetic receptors offer complementary information on our surroundings and endow us with novel means of manipulating physical or even virtual objects. We realize highly compliant magnetosensitive skins with directional perception that enable magnetic cognition, body position tracking, and touchless object manipulation. Transfer printing of eight high-performance spin valve sensors arranged into two Wheatstone bridges onto 1.7-μm-thick polyimide foils ensures mechanical imperceptibility. The result is a new class of interactive devices that extract information from the surroundings through magnetic tags. We demonstrate this concept in augmented reality systems with virtual knob-turning functions and the operation of virtual dialing pads, based on the interaction with magnetic fields. This technology will enable a cornucopia of applications from navigation, motion tracking in robotics, regenerative medicine, and sports and gaming to interaction in supplemented reality. PMID:29376121
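A hedged sketch of how such directional sensing could drive a virtual knob, assuming (this is an assumption, not the paper's signal chain) that the two Wheatstone bridges give roughly quadrature responses to the in-plane field angle:

```python
# Illustrative sketch (an assumption, not the paper's signal chain): if the two
# Wheatstone bridges of spin-valve sensors respond roughly as sine and cosine of
# the in-plane magnetic field angle, the angle of a nearby magnetic tag, and hence
# a virtual knob position, can be decoded with atan2.
import math

def decode_knob_angle(bridge_sin_mv: float, bridge_cos_mv: float) -> float:
    """Return the field angle in degrees from two quadrature bridge voltages (mV)."""
    return math.degrees(math.atan2(bridge_sin_mv, bridge_cos_mv)) % 360.0

# Hypothetical bridge readings as a magnet is rotated above the e-skin.
samples_mv = [(0.0, 5.0), (3.5, 3.5), (5.0, 0.0), (3.5, -3.5), (0.0, -5.0)]
for v_sin, v_cos in samples_mv:
    angle = decode_knob_angle(v_sin, v_cos)
    print(f"bridge outputs ({v_sin:+.1f}, {v_cos:+.1f}) mV -> knob at {angle:5.1f} deg")
```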
Khademi, Maryam; Hondori, Hossein Mousavi; Dodakian, Lucy; Cramer, Steve; Lopes, Cristina V
2013-01-01
Introducing computer games to the rehabilitation market has led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefit to patients and caregivers, it has inherent limitations, some of which might be addressed by replacing it with Augmented Reality (AR). The task of pick-and-place, which is part of many activities of daily living (ADLs), is one of the major affected functions that stroke patients most expect to recover. We developed an exercise consisting of moving an object between various points, following a flashing light that indicates the next target. The results show superior performance of subjects in spatial AR versus a non-immersive VR setting. This could be due to the extraneous hand-eye coordination required in VR, which is eliminated in spatial AR.
Habanapp: Havana's Architectural Heritage a Click Away
NASA Astrophysics Data System (ADS)
Morganti, C.; Bartolomei, C.
2018-05-01
The research addresses the application of augmented and virtual reality technologies to the architectural and historical context of the city of Havana, Cuba, on the basis of historical studies and range-imaging surveys of the buildings bordering the old city's five main squares. The specific aim is to transfer all of the collected data to recent mobile Augmented Reality (AR) and Virtual Reality (VR) apps in order to create an innovative app, the first of its kind in Cuba. The "Oficina del Historiador de la ciudad de La Habana", the institution supervising architectural and cultural assets in Cuba, is widely interested in the topic as a way to develop a new educational, cultural and artistic tool to be used both online and offline.
Creating a Vision Channel for Observing Deep-Seated Anatomy in Medical Augmented Reality
NASA Astrophysics Data System (ADS)
Wimmer, Felix; Bichlmeier, Christoph; Heining, Sandro M.; Navab, Nassir
The intent of medical Augmented Reality (AR) is to augment the surgeon's real view of the patient with the patient's interior anatomy through a suitable visualization of medical imaging data. This paper presents a fast and user-defined clipping technique for medical AR that allows cutting away any parts of the virtual anatomy, and any images of the real part of the AR scene, that hinder the surgeon's view onto the deep-seated region of interest. Modeled on cut-away techniques from scientific illustration and computer graphics, the method creates a fixed vision channel to the inside of the patient. It enables a clear view of the focussed virtual anatomy and, moreover, improves the perception of spatial depth.
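A minimal numpy sketch of the general idea of a vision channel (not the paper's GPU implementation): keep only the voxels that lie inside a cylinder aimed at the deep-seated region of interest.

```python
# Minimal sketch of the general idea (not the paper's implementation): clip a volume
# so that only voxels inside a cylindrical "vision channel" aimed at a deep-seated
# region of interest remain visible.
import numpy as np

def vision_channel_mask(shape, axis_point, axis_dir, radius):
    """Boolean mask: True for voxels within `radius` of the channel axis."""
    axis_dir = np.asarray(axis_dir, dtype=float)
    axis_dir /= np.linalg.norm(axis_dir)
    grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"), axis=-1)
    rel = grid - np.asarray(axis_point, dtype=float)
    # distance from each voxel to the line through axis_point along axis_dir
    along = rel @ axis_dir
    perp = rel - along[..., None] * axis_dir
    return np.linalg.norm(perp, axis=-1) <= radius

volume = np.random.rand(64, 64, 64)                    # stand-in for CT/MR data
mask = vision_channel_mask(volume.shape, axis_point=(32, 32, 0),
                           axis_dir=(0, 0, 1), radius=8.0)
clipped = np.where(mask, volume, 0.0)                  # everything outside the channel is cut away
print(f"{mask.mean():.1%} of the voxels remain visible")
```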
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for, and our progress to date on, our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments
NASA Astrophysics Data System (ADS)
Portalés, Cristina; Lerma, José Luis; Navarro, Santiago
2010-01-01
Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and to interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interaction and real-life navigation. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction, far beyond the traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated into real (physical) urban worlds. The augmented environment presented herein requires a video see-through head-mounted display (HMD) for visualization, whereas the user's navigation in the real world is tracked with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. There remain, however, some software and complexity issues, which are discussed in the paper.
Mixed Reality Technology at NASA JPL
2016-05-16
NASA's JPL is a center of innovation in virtual and augmented reality, producing groundbreaking applications of these technologies to support a variety of missions. This video is a collection of unedited scenes released to the media.
Augmented reality glass-free three-dimensional display with the stereo camera
NASA Astrophysics Data System (ADS)
Pang, Bo; Sang, Xinzhu; Chen, Duo; Xing, Shujun; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu
2017-10-01
An improved method for Augmented Reality (AR) glass-free three-dimensional (3D) display based on a stereo camera, which presents parallax content from different angles through a lenticular lens array, is proposed. Compared with previous AR implementations based on a two-dimensional (2D) panel display with only one viewpoint, the proposed method realizes glass-free 3D display of virtual objects and the real scene with 32 virtual viewpoints. Accordingly, viewers can obtain rich 3D stereo information from different viewing angles based on binocular parallax. Experimental results show that this improved stereo-camera-based method can realize AR glass-free 3D display, and both the virtual objects and the real scene exhibit realistic and clear stereo performance.
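A simplified sketch of how such parallax content can be interleaved for a lenticular panel, assuming an idealized, unslanted lens array (an assumption made for illustration; this is not the paper's rendering pipeline):

```python
# Simplified sketch (an assumption, not the paper's pipeline): interleave 32
# pre-rendered viewpoint images under a vertical, unslanted lenticular lens array
# by assigning each pixel column to the view whose ray that column feeds.
import numpy as np

def interleave_views(views: np.ndarray) -> np.ndarray:
    """views: array of shape (n_views, height, width, 3). Returns one composite image."""
    n_views, height, width, _ = views.shape
    composite = np.empty((height, width, 3), dtype=views.dtype)
    for col in range(width):
        view_idx = col % n_views          # column-to-view mapping under an ideal lenticule
        composite[:, col, :] = views[view_idx, :, col, :]
    return composite

views = np.random.randint(0, 256, size=(32, 480, 640, 3), dtype=np.uint8)  # stand-in renders
panel_image = interleave_views(views)
print(panel_image.shape)  # (480, 640, 3) image driving the lenticular display
```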
Realistic Reflections for Marine Environments in Augmented Reality Training Systems
2009-09-01
Like virtual simulations, augmented reality trainers can be configured to meet specific training needs and can be restarted and reused to train. Many of the same methods outlined for the full-reflection shader, including wave distortion, blurring, and shadow, were reused for the physics shader. (Figure captions in the original referenced static backgrounds and the ship textures used to generate reflections.)
Shen, Xin; Javidi, Bahram
2018-03-01
We have developed a three-dimensional (3D) dynamic integral-imaging (InIm)-system-based optical see-through augmented reality display with an enhanced depth range of the 3D augmented image. A focus-tunable lens is adopted in the 3D display unit to relay the elemental images at various positions to the microlens array. Based on resolution-priority integral imaging, multiple lenslet image planes are generated to enhance the depth range of the 3D image. The depth range is further increased by utilizing both the real and virtual 3D imaging fields. The 3D reconstructed image and the real-world scene are overlaid using an optical see-through display for augmented reality. The proposed system can significantly enhance the depth range of a 3D reconstructed image with high image quality in the micro InIm unit. This approach provides enhanced functionality for augmented information and mitigates the vergence-accommodation conflict of a traditional augmented reality display.
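A minimal sketch of the underlying relay optics (the distance and focal lengths are assumed values, not the paper's design), using the thin-lens equation to show how a focus-tunable lens shifts the plane to which the elemental images are relayed:

```python
# Minimal sketch of the underlying optics (numbers are assumptions, not the paper's
# design): the thin-lens equation 1/f = 1/d_o + 1/d_i shows how changing the focal
# length of a focus-tunable lens shifts the plane to which the elemental images are
# relayed, which is what lets the system place 3D content at several depths.

def relayed_image_distance(object_distance_mm: float, focal_length_mm: float) -> float:
    """Image distance d_i from the thin-lens equation (positive = real image)."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

display_to_lens_mm = 60.0                      # assumed distance from microdisplay to tunable lens
for f_mm in (30.0, 35.0, 40.0, 45.0):          # assumed tunable focal-length states
    d_i = relayed_image_distance(display_to_lens_mm, f_mm)
    print(f"f = {f_mm:4.1f} mm -> relayed elemental-image plane at {d_i:6.1f} mm")
```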
NASA Astrophysics Data System (ADS)
Chao, Jie; Chiu, Jennifer L.; DeJaegher, Crystal J.; Pan, Edward A.
2016-02-01
Deep learning of science involves integration of existing knowledge and normative science concepts. Past research demonstrates that combining physical and virtual labs sequentially or side by side can take advantage of the unique affordances each provides for helping students learn science concepts. However, providing simultaneously connected physical and virtual experiences has the potential to promote connections among ideas. This paper explores the effect of augmenting a virtual lab with physical controls on high school chemistry students' understanding of gas laws. We compared students using the augmented virtual lab to students using a similar sensor-based physical lab with teacher-led discussions. Results demonstrate that students in the augmented virtual lab condition made significant gains from pretest to posttest and outperformed traditional students on some but not all concepts. Results provide insight into incorporating mixed-reality technologies into authentic classroom settings.
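As a rough, hypothetical sketch of what coupling physical controls to a virtual gas-law lab might look like (not the study's software; the sensor readings are invented), the ideal gas law can map physical inputs onto the virtual lab's state:

```python
# Illustrative sketch (not the study's software): a virtual gas-law lab whose state
# is driven by physical sensor inputs, in the spirit of augmenting a virtual lab
# with physical controls. Sensor names and values are hypothetical.
R = 8.314  # J / (mol K), ideal gas constant

def virtual_pressure_kpa(volume_l: float, temperature_c: float, moles: float = 1.0) -> float:
    """Pressure predicted by PV = nRT for the state set via the physical controls."""
    volume_m3 = volume_l / 1000.0
    temperature_k = temperature_c + 273.15
    return moles * R * temperature_k / volume_m3 / 1000.0  # Pa -> kPa

# Hypothetical readings from a physical volume slider and a temperature probe.
for volume_l, temp_c in [(22.4, 0.0), (22.4, 25.0), (11.2, 25.0)]:
    p = virtual_pressure_kpa(volume_l, temp_c)
    print(f"V = {volume_l:4.1f} L, T = {temp_c:4.1f} C -> P = {p:6.1f} kPa")
```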
Mirelman, Anat; Rochester, Lynn; Reelick, Miriam; Nieuwhof, Freek; Pelosin, Elisa; Abbruzzese, Giovanni; Dockx, Kim; Nieuwboer, Alice; Hausdorff, Jeffrey M
2013-02-06
Recent work has demonstrated that fall risk can be attributed to cognitive as well as motor deficits. Indeed, everyday walking in complex environments utilizes executive function, dual tasking, planning and scanning, all while walking forward. Pilot studies suggest that a multi-modal intervention that combines treadmill training to target motor function and a virtual reality obstacle course to address the cognitive components of fall risk may be used to successfully address the motor-cognitive interactions that are fundamental for fall risk reduction. The proposed randomized controlled trial will evaluate the effects of treadmill training augmented with virtual reality on fall risk. Three hundred older adults with a history of falls will be recruited to participate in this study. This will include older adults (n=100), patients with mild cognitive impairment (n=100), and patients with Parkinson's disease (n=100). These three sub-groups will be recruited in order to evaluate the effects of the intervention in people with a range of motor and cognitive deficits. Subjects will be randomly assigned to the intervention group (treadmill training with virtual reality) or to the active-control group (treadmill training without virtual reality). Each person will participate in a training program set in an outpatient setting 3 times per week for 6 weeks. Assessments will take place before, after, and 1 month and 6 months after the completion of the training. A falls calendar will be kept by each participant for 6 months after completing the training to assess fall incidence (i.e., the number of falls, multiple falls and falls rate). In addition, we will measure gait under usual and dual task conditions, balance, community mobility, health related quality of life, user satisfaction and cognitive function. This randomized controlled trial will demonstrate the extent to which an intervention that combines treadmill training augmented by virtual reality reduces fall risk, improves mobility and enhances cognitive function in a diverse group of older adults. In addition, the comparison to an active control group that undergoes treadmill training without virtual reality will provide evidence as to the added value of addressing motor cognitive interactions as an integrated unit. (NIH)-NCT01732653.
NASA Astrophysics Data System (ADS)
Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng
2010-08-01
In recent years, Augmented Reality (AR)[1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in Virtual Reality (VR)-related fields, such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment and the arts. AR can enhance the displayed output of a real environment with specific user-interactive functions or specific object recognition. It can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering and remote robot control. AR has many advantages over VR. The system developed here combines sensors, software and imaging algorithms to make the augmentation feel real, actual and present to users. The imaging algorithms include a gray-level method, an image binarization method, and a white balance method in order to achieve accurate image recognition and overcome the effects of lighting.
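A minimal sketch of the preprocessing steps named above (gray-level conversion, binarization, and a simple white balance) as they are commonly implemented with OpenCV and NumPy; this is not the authors' code, and the file names are hypothetical:

```python
# Minimal sketch of the preprocessing steps named above (gray-level conversion,
# binarization, and a simple gray-world white balance) as commonly implemented
# with OpenCV/NumPy. This is not the authors' code.
import cv2
import numpy as np

def gray_world_white_balance(bgr: np.ndarray) -> np.ndarray:
    """Scale each channel so its mean matches the overall mean (gray-world assumption)."""
    means = bgr.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return np.clip(bgr * gains, 0, 255).astype(np.uint8)

frame = cv2.imread("marker_frame.png")            # hypothetical camera frame
balanced = gray_world_white_balance(frame)
gray = cv2.cvtColor(balanced, cv2.COLOR_BGR2GRAY) # gray-level method
_, binary = cv2.threshold(gray, 0, 255,           # binarization robust to lighting
                          cv2.THRESH_BINARY | cv2.THRESH_OTSU)
cv2.imwrite("marker_binary.png", binary)
```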
Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach
Tian, Yuan; Guan, Tao; Wang, Cheng
2010-01-01
To produce a realistic augmentation in Augmented Reality, the correct relative positions of real objects and virtual objects are very important. In this paper, we propose a novel real-time occlusion handling method based on an object tracking approach. Our method is divided into three steps: selection of the occluding object, object tracking and occlusion handling. The user selects the occluding object using an interactive segmentation method. The contour of the selected object is then tracked in the subsequent frames in real-time. In the occlusion handling step, all the pixels on the tracked object are redrawn on the unprocessed augmented image to produce a new synthesized image in which the relative position between the real and virtual object is correct. The proposed method has several advantages. First, it is robust and stable, since it remains effective when the camera is moved through large changes of viewing angles and volumes or when the object and the background have similar colors. Second, it is fast, since the real object can be tracked in real-time. Last, a smoothing technique provides seamless merging between the augmented and virtual object. Several experiments are provided to validate the performance of the proposed method. PMID:22319278
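A minimal sketch of the compositing step described above (not the authors' implementation): pixels belonging to the tracked real object are redrawn over the augmented frame so that the real object correctly occludes the virtual one.

```python
# Minimal sketch of the occlusion-handling step described above (not the authors'
# implementation): redraw the tracked real-object pixels on top of the augmented
# frame so the real object occludes the virtual one.
import numpy as np

def handle_occlusion(camera_frame: np.ndarray,
                     augmented_frame: np.ndarray,
                     occluder_mask: np.ndarray) -> np.ndarray:
    """occluder_mask: boolean (H, W) array, True where the tracked object is."""
    composited = augmented_frame.copy()
    composited[occluder_mask] = camera_frame[occluder_mask]   # redraw real-object pixels
    return composited

h, w = 480, 640
camera_frame = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)     # real scene
augmented_frame = camera_frame.copy()
augmented_frame[100:300, 200:400] = (0, 255, 0)                         # virtual object drawn on top
occluder_mask = np.zeros((h, w), dtype=bool)
occluder_mask[150:250, 250:350] = True                                  # tracked occluding object
result = handle_occlusion(camera_frame, augmented_frame, occluder_mask)
```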
NASA Astrophysics Data System (ADS)
Hua, Hong
2017-02-01
Head-mounted light field displays render a true 3D scene by sampling either the projections of the 3D scene at different depths or the directions of the light rays apparently emitted by the 3D scene and viewed from different eye positions. They are capable of rendering correct or nearly correct focus cues and addressing the very well-known vergence-accommodation mismatch problem in conventional virtual and augmented reality displays. In this talk, I will focus on reviewing recent advancements of head-mounted light field displays for VR and AR applications. I will demonstrate examples of HMD systems developed in my group.
Augmenting your own reality: student authoring of science-based augmented reality games.
Klopfer, Eric; Sheldon, Josh
2010-01-01
Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent games, TimeLab 2100, players role-play citizens of the early 22nd century when global climate change is out of control. Through AR, they see their community as it might be nearly one hundred years in the future. TimeLab and other similar AR games balance location specificity and portability--they are games that are tied to a location and games that are movable from place to place. Focusing students on developing their own AR games provides the best of both virtual and physical worlds: a more portable solution that deeply connects young people to their own surroundings. A series of initiatives has focused on technical and pedagogical solutions to supporting students authoring their own games.
Virtual reality and planetary exploration
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1992-01-01
Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.
Ponce, Brent A; Menendez, Mariano E; Oladeji, Lasun O; Fryberger, Charles T; Dantuluri, Phani K
2014-11-01
The authors describe the first surgical case adopting the combination of real-time augmented reality and wearable computing devices such as Google Glass (Google Inc, Mountain View, California). A 66-year-old man presented to their institution for a total shoulder replacement after 5 years of progressive right shoulder pain and decreased range of motion. Throughout the surgical procedure, Google Glass was integrated with the Virtual Interactive Presence and Augmented Reality system (University of Alabama at Birmingham, Birmingham, Alabama), enabling the local surgeon to interact with the remote surgeon within the local surgical field. Surgery was well tolerated by the patient and early surgical results were encouraging, with an improvement of shoulder pain and greater range of motion. The combination of real-time augmented reality and wearable computing devices such as Google Glass holds much promise in the field of surgery. Copyright 2014, SLACK Incorporated.
Plessas, Anastasios
2017-10-01
In preclinical dental education, the acquisition of clinical and technical skills, and the transfer of these skills to the clinic, are paramount. Phantom heads provide an efficient way to teach preclinical students dental procedures safely while considerably increasing their dexterity skills. Modern computerized phantom head training units incorporate features of virtual reality technology and the ability to offer concurrent augmented feedback. The aims of this review were to examine and evaluate the dental literature for evidence supporting their use and to discuss the role of augmented feedback versus the facilitator's instruction. Adjunctive training in these units seems to enhance students' learning and skill acquisition and to reduce the required faculty supervision time. However, the virtual augmented feedback cannot be used as the sole method of feedback, and the facilitator's input remains critical. Well-powered longitudinal randomized trials exploring the impact of these units on students' clinical performance and issues of cost-effectiveness are warranted.
Augmentation of Cognition and Perception Through Advanced Synthetic Vision Technology
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.; Arthur, Jarvis J.; Williams, Steve P.; McNabb, Jennifer
2005-01-01
Synthetic Vision System technology augments reality and creates a virtual visual meteorological condition that extends a pilot's cognitive and perceptual capabilities during flight operations when outside visibility is restricted. The paper describes the NASA Synthetic Vision System for commercial aviation with an emphasis on how the technology achieves Augmented Cognition objectives.
Virtual Rehabilitation with Children: Challenges for Clinical Adoption [From the Field].
Glegg, Stephanie
2017-01-01
Virtual, augmented, and mixed reality environments are increasingly being developed and used to address functional rehabilitation goals related to physical, cognitive, social, and psychological impairments. For example, a child with an acquired brain injury may participate in virtual rehabilitation to address impairments in balance, attention, turn taking, and engagement in therapy. The trend toward virtual rehabilitation first gained momentum with the adoption of commercial off-the-shelf active video gaming consoles (e.g., Nintendo Wii and XBox). Now, we are seeing the rapid emergence of customized rehabilitation-specific systems that integrate technological advances in virtual reality, visual effects, motion tracking, physiological monitoring, and robotics.
Ben-Moussa, Maher; Rubo, Marius; Debracque, Coralie; Lange, Wolf-Gero
2017-01-01
The present paper explores the benefits and the capabilities of various emerging state-of-the-art interactive 3D and Internet of Things technologies and investigates how these technologies can be exploited to develop a more effective technology supported exposure therapy solution for social anxiety disorder. "DJINNI" is a conceptual design of an in vivo augmented reality (AR) exposure therapy mobile support system that exploits several capturing technologies and integrates the patient's state and situation by vision-based, audio-based, and physiology-based analysis as well as by indoor/outdoor localization techniques. DJINNI also comprises an innovative virtual reality exposure therapy system that is adaptive and customizable to the demands of the in vivo experience and therapeutic progress. DJINNI follows a gamification approach where rewards and achievements are utilized to motivate the patient to progress in her/his treatment. The current paper reviews the state of the art of technologies needed for such a solution and recommends how these technologies could be integrated in the development of an individually tailored and yet feasible and effective AR/virtual reality-based exposure therapy. Finally, the paper outlines how DJINNI could be part of classical cognitive behavioral treatment and how to validate such a setup.
Meldrum, Dara; Herdman, Susan; Moloney, Roisin; Murray, Deirdre; Duffy, Douglas; Malone, Kareena; French, Helen; Hone, Stephen; Conroy, Ronan; McConn-Walsh, Rory
2012-03-26
Unilateral peripheral vestibular loss results in gait and balance impairment, dizziness and oscillopsia. Vestibular rehabilitation benefits patients, but the optimal treatment remains unknown. Virtual reality is an emerging tool in rehabilitation and provides opportunities to improve both outcomes and patient satisfaction with treatment. The Nintendo Wii Fit Plus® (NWFP) is a low-cost virtual reality system that challenges balance and provides visual and auditory feedback. It may augment the motor learning that is required to improve balance and gait, but no trials to date have investigated its efficacy. In a single-blind (assessor), two-centre randomised controlled superiority trial, 80 patients with unilateral peripheral vestibular loss will be randomised to either conventional or virtual reality-based (NWFP) vestibular rehabilitation for 6 weeks. The primary outcome measure is gait speed (measured with three-dimensional gait analysis). Secondary outcomes include computerised posturography, dynamic visual acuity, and validated questionnaires on dizziness, confidence and anxiety/depression. Outcome will be assessed post treatment (8 weeks) and at 6 months. Advances in the gaming industry have allowed mass production of highly sophisticated low-cost virtual reality systems that incorporate technology previously not accessible to most therapists and patients. Importantly, they are not confined to rehabilitation departments, can be used at home, and provide an accurate record of adherence to exercise. The benefits of providing augmented feedback, increasing the intensity of exercise and accurately measuring adherence may improve conventional vestibular rehabilitation, but efficacy must first be demonstrated. ClinicalTrials.gov identifier: NCT01442623.
Immersive Technologies and Language Learning
ERIC Educational Resources Information Center
Blyth, Carl
2018-01-01
This article briefly traces the historical conceptualization of linguistic and cultural immersion through technological applications, from the early days of locally networked computers to the cutting-edge technologies known as virtual reality and augmented reality. Next, the article explores the challenges of immersive technologies for the field…
Virtual reality and hallucination: a technoetic perspective
NASA Astrophysics Data System (ADS)
Slattery, Diana R.
2008-02-01
Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices, and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience, that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VRs": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds are central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulation by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel and the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the connections (philosophical, social-historical, and psychological-perceptual) between these two domains.
Virtual reality gaming in the rehabilitation of the upper extremities post-stroke.
Yates, Michael; Kelemen, Arpad; Sik Lanyi, Cecilia
2016-01-01
Occurrences of stroke often result in unilateral upper limb dysfunction. Dysfunctions of this nature frequently persist and can present chronic limitations to activities of daily living. Research into applying virtual reality gaming systems to provide rehabilitation therapy has seen a resurgence. Themes explored in stroke rehabilitation for paretic limbs are action observation and imitation, versatility, intensity and repetition, and preservation of gains. Fifteen articles were ultimately selected for review. The purpose of this literature review is to compare the various virtual reality gaming modalities in the current literature and ascertain their efficacy. The literature supports the use of virtual reality gaming rehabilitation therapy as equivalent to traditional therapies or as a successful augmentation of those therapies. While some degree of rigor was displayed in the literature, small sample sizes, variation in study lengths and therapy durations, and unequal controls reduce generalizability and comparability. Future studies should incorporate larger sample sizes and post-intervention follow-up measures.
Comparative evaluation of monocular augmented-reality display for surgical microscopes.
Rodriguez Palma, Santiago; Becker, Brian C; Lobes, Louis A; Riviere, Cameron N
2012-01-01
Medical augmented reality has undergone much development recently. However, there is a lack of studies quantitatively comparing the different display options available. This paper compares the effects of different graphical overlay systems in a simple micromanipulation task with "soft" visual servoing. We compared positioning accuracy in a real-time visually-guided task using Micron, an active handheld tremor-canceling microsurgical instrument, using three different displays: 2D screen, 3D screen, and microscope with monocular image injection. Tested with novices and an experienced vitreoretinal surgeon, display of virtual cues in the microscope via an augmented reality injection system significantly decreased 3D error (p < 0.05) compared to the 2D and 3D monitors when confounding factors such as magnification level were normalized.
Augmented reality on poster presentations, in the field and in the classroom
NASA Astrophysics Data System (ADS)
Hawemann, Friedrich; Kolawole, Folarin
2017-04-01
Augmented reality (AR) is the direct addition of virtual information to a real-world environment through an interface. In practice, information can be projected onto a target, for example an image on a poster, through a mobile device such as a tablet or smartphone. Mobile devices are so widely distributed today that augmented reality is easily accessible to almost everyone. Numerous studies have shown that multi-dimensional visualization is essential for efficient perception of the spatial, temporal and geometrical configuration of geological structures and processes. Print media such as posters and handouts lack the ability to display content in the third and fourth dimensions, which might be in the space domain, as in three-dimensional (3-D) objects, or in the time domain (four-dimensional, 4-D), expressible in the form of videos. Here, we show that augmented reality content can be complementary to geoscience poster presentations, hands-on material, and work in the field. In the latter case, location-based data are loaded so that, for example, a virtual geological profile can be draped over the real-world landscape. In object-based AR, the application is trained to recognize an image or object through the camera of the user's mobile device, such that specific content is automatically downloaded and displayed on the screen of the device and positioned relative to the trained image or object. We used ZapWorks, a commercially available software application, to create and present examples of poster-based content in which important supplementary information is presented as interactive virtual images, videos and 3-D models. We suggest that the flexibility and real-time interactivity offered by AR make it an invaluable tool for effective geoscience poster presentation and for class-room and field geoscience learning.
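A hedged sketch of the kind of image-target recognition that underlies object-based AR, using generic OpenCV feature matching (this is not how ZapWorks works internally; the file names are hypothetical):

```python
# Generic sketch of image-target recognition for object-based AR (not ZapWorks
# internals): detect a trained poster image in a camera frame and estimate its
# placement as a homography that AR content could be anchored to.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

target = cv2.imread("poster_target.png", cv2.IMREAD_GRAYSCALE)   # hypothetical trained image
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)     # hypothetical camera frame

kp_t, des_t = orb.detectAndCompute(target, None)
kp_f, des_f = orb.detectAndCompute(frame, None)
matches = sorted(matcher.match(des_t, des_f), key=lambda m: m.distance)[:100]

src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
if H is not None and inliers.sum() > 20:      # enough consistent matches: target recognized
    print("Poster detected; homography available for anchoring AR content.")
```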
Real-time 3D image reconstruction guidance in liver resection surgery.
Soler, Luc; Nicolau, Stephane; Pessaux, Patrick; Mutter, Didier; Marescaux, Jacques
2014-04-01
Minimally invasive surgery represents one of the main evolutions of surgical techniques. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. From a patient's medical image [US, computed tomography (CT) or MRI], we have developed an Augmented Reality (AR) system that enhances the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of the anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We have then developed several 3D visualization and surgical planning software tools to combine direct volume rendering and surface rendering. Finally, we have developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were carried out, demonstrating the accuracy of the 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures have been performed, illustrating the potential clinical benefit of such assistance in improving safety, but also the current limits that automatic augmented reality will have to overcome. Virtual patient modeling should be mandatory for certain interventions that now have to be defined, such as liver surgery. Augmented reality is clearly the next step in surgical instrumentation, but it currently remains limited by the complexity of organ deformation during surgery. Intraoperative medical imaging used in a new generation of automated augmented reality should solve this issue thanks to the development of the hybrid OR.
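The abstract mentions an interactive and an automatic registration technique without detailing them; as a hedged illustration of what registration involves, a standard rigid point-based method (Kabsch/Horn SVD) that aligns a 3D model to measured landmarks looks like this (synthetic data, not necessarily the authors' algorithm):

```python
# Illustrative sketch of one standard registration approach: rigid point-based
# registration of a preoperative 3D model to the patient using paired landmarks
# (Kabsch/Horn SVD method). The abstract does not state that this is the authors'
# technique; it only illustrates what "registration" involves.
import numpy as np

def rigid_register(model_pts: np.ndarray, patient_pts: np.ndarray):
    """Return rotation R and translation t mapping model points onto patient points."""
    mc, pc = model_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (model_pts - mc).T @ (patient_pts - pc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # avoid a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = pc - R @ mc
    return R, t

model_landmarks = np.array([[0, 0, 0], [50, 0, 0], [0, 80, 0], [0, 0, 30]], float)
true_rotation = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
patient_landmarks = model_landmarks @ true_rotation.T + np.array([10.0, 20.0, 5.0])  # synthetic "measured" landmarks
R, t = rigid_register(model_landmarks, patient_landmarks)
residual = np.linalg.norm(model_landmarks @ R.T + t - patient_landmarks, axis=1)
print(f"mean fiducial registration error: {residual.mean():.3f} mm")
```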
Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.
Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson
2016-04-04
We propose a switchable beam steering device with a cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts the incident circularly polarized light into the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading 2-3 CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we propose two approaches to obtain the same diffraction angle for red, green, and blue LED-based full-color displays. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for emerging VR/AR displays.
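A small hedged calculation of the steering geometry (the CDW periods are assumed values, not the paper's design): the first-order diffraction angle follows the grating equation, and cascading two polarization-switched CDWs yields several discrete deflections:

```python
# Illustrative sketch (the CDW periods are assumptions, not the paper's design): the
# first-order diffraction angle of a cycloidal diffractive waveplate follows the
# grating equation sin(theta) = wavelength / period; with each stage steered to the
# +1 or -1 order by the input handedness, a 2-stage cascade reaches several angles.
import math

def diffraction_angle_deg(wavelength_nm: float, period_um: float) -> float:
    return math.degrees(math.asin(wavelength_nm * 1e-3 / period_um))

wavelength_nm = 532.0                      # green
periods_um = [5.0, 2.5]                    # assumed CDW periods for a 2-stage cascade
angles = [diffraction_angle_deg(wavelength_nm, p) for p in periods_um]
print("single-stage angles:", [f"{a:.1f} deg" for a in angles])
# Approximately (for small angles) the deflections of the two stages add or subtract:
combos = sorted({round(s1 * angles[0] + s2 * angles[1], 1)
                 for s1 in (-1, 1) for s2 in (-1, 1)})
print("approximate cascade steering angles (deg):", combos)
```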
a Framework for Low-Cost Multi-Platform VR and AR Site Experiences
NASA Astrophysics Data System (ADS)
Wallgrün, J. O.; Huang, J.; Zhao, J.; Masrur, A.; Oprean, D.; Klippel, A.
2017-11-01
Low-cost consumer-level immersive solutions have the potential to revolutionize education and research in many fields by providing virtual experiences of sites that are either inaccessible, too dangerous, or too expensive to visit, or by augmenting in-situ experiences using augmented and mixed reality methods. We present our approach for creating low-cost multi-platform virtual and augmented reality site experiences of real world places for education and research purposes, making extensive use of Structure-from-Motion methods as well as 360° photography and videography. We discuss several example projects, for the Mayan City of Cahal Pech, Iceland's Thrihnukar volcano, the Santa Marta informal settlement in Rio, and for the Penn State Campus, and we propose a framework for creating and maintaining such applications by combining declarative content specification methods with a central linked-data based spatio-temporal information system.
Enhancing a Multi-body Mechanism with Learning-Aided Cues in an Augmented Reality Environment
NASA Astrophysics Data System (ADS)
Singh Sidhu, Manjit
2013-06-01
Augmented Reality (AR) is a promising area of research for education, covering issues such as tracking and calibration and realistic rendering of virtual objects. The ability to augment the real world with virtual information has opened the possibility of using AR technology in areas such as education and training. In the domain of Computer Aided Learning (CAL), researchers have long been looking into enhancing the effectiveness of the teaching and learning process by providing cues that could assist learners to better comprehend the materials presented. Although a number of studies have examined the effectiveness of learning-aided cues, none has addressed this issue for AR-based learning solutions. This paper discusses the design and model of AR-based software that uses visual cues to enhance the learning process, together with the results on how the cues were perceived.
Yoo, Ji Won; Lee, Dong Ryul; Cha, Young Joo; You, Sung Hyun
2017-01-01
The purpose of the present study was to compare the therapeutic effects of electromyography (EMG) biofeedback augmented by virtual reality (VR) and EMG biofeedback alone on triceps and biceps (T:B) muscle activity imbalance and elbow joint movement coordination during a reaching motor task in normal children and children with spastic cerebral palsy (CP). Eighteen children with spastic CP (2 females; mean ± standard deviation = 9.5 ± 1.96 years) and 8 normal children (3 females; mean ± standard deviation = 9.75 ± 2.55 years) were recruited from a local community center. All children with CP first underwent one intensive session of EMG feedback (30 minutes), followed by one session of the EMG-VR feedback (30 minutes) after a 1-week washout period. Clinical tests included elbow extension range of motion (ROM), biceps muscle strength, and the box and block test. EMG T:B muscle activity imbalance and reaching movement acceleration coordination were concurrently determined by EMG and 3-axis accelerometer measurements, respectively. Independent t-tests and one-way repeated-measures analysis of variance (ANOVA) were performed at p < 0.05. The one-way repeated-measures ANOVA revealed significant effects on elbow extension ROM (p = 0.01), biceps muscle strength (p = 0.01), the box and block test (p = 0.03), and peak triceps muscle activity (p = 0.01). However, it showed no statistical significance for the composite 3-dimensional movement acceleration coordination data (p = 0.12). The present study is the first clinical trial to demonstrate the superior benefits of EMG biofeedback when augmented by virtual reality exercise games in children with spastic CP. The augmented EMG and VR feedback produced better neuromuscular balance control in the elbow joint than the EMG biofeedback alone.
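The study above does not publish its signal-processing code; the sketch below shows one conventional way such a triceps:biceps (T:B) activation ratio could be derived from raw EMG (full-wave rectification followed by a low-pass linear envelope, then a ratio of mean envelopes). The sampling rate, filter order and cutoff are assumptions, and the signals in the example are synthetic.

```python
# Sketch of a triceps:biceps (T:B) activation ratio from raw EMG signals.
# Filter settings and sampling rate are assumptions, not taken from the study.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0        # EMG sampling rate in Hz (assumed)
ENV_CUTOFF = 6.0   # low-pass cutoff for the linear envelope in Hz (assumed)

def linear_envelope(emg, fs=FS, cutoff=ENV_CUTOFF):
    """Full-wave rectification followed by a 4th-order low-pass Butterworth."""
    rectified = np.abs(emg - np.mean(emg))     # remove DC offset, then rectify
    b, a = butter(4, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, rectified)

def tb_ratio(triceps_emg, biceps_emg):
    """Mean triceps envelope divided by mean biceps envelope over one reach."""
    return np.mean(linear_envelope(triceps_emg)) / np.mean(linear_envelope(biceps_emg))

if __name__ == "__main__":
    t = np.arange(0, 2.0, 1.0 / FS)
    triceps = 0.6 * np.random.randn(t.size)    # synthetic signals for illustration
    biceps = 1.0 * np.random.randn(t.size)
    print(f"T:B ratio = {tb_ratio(triceps, biceps):.2f}")
```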
E-Learning Application of Tarsier with Virtual Reality using Android Platform
NASA Astrophysics Data System (ADS)
Oroh, H. N.; Munir, R.; Paseru, D.
2017-01-01
The spectral tarsier is a primitive primate that can only be found in the province of North Sulawesi. These primates can be studied with an existing e-learning application based on Augmented Reality technology, in which a marker held in front of the computer's camera lets the user interact with a three-dimensional tarsier object. However, that application shows only the tarsier object in three dimensions, without its habitat, and requires a lot of resources because it runs on a personal computer. Virtual Reality can also display three-dimensional objects, with the added benefit of making the user feel immersed in a virtual world, and on the Android platform it requires fewer resources. Virtual Reality technology on the Android platform was therefore adopted so that users can view and interact not only with the tarsier but also with its habitat. The results of this research indicate that users can learn about the tarsier and its habitat well. Thus, the use of Virtual Reality technology in the tarsier e-learning application can help people to see, know, and learn about the spectral tarsier.
Augmented reality in the surgery of cerebral aneurysms: a technical report.
Cabrilo, Ivan; Bijlenga, Philippe; Schaller, Karl
2014-06-01
Augmented reality is the overlay of computer-generated images on real-world structures. It has previously been used for image guidance during surgical procedures, but it has never been used in the surgery of cerebral aneurysms. To report our experience of cerebral aneurysm surgery aided by augmented reality. Twenty-eight patients with 39 unruptured aneurysms were operated on in a prospective manner with augmented reality. Preoperative 3-dimensional image data sets (angio-magnetic resonance imaging, angio-computed tomography, and 3-dimensional digital subtraction angiography) were used to create virtual segmentations of patients' vessels, aneurysms, aneurysm necks, skulls, and heads. These images were injected intraoperatively into the eyepiece of the operating microscope. An example case of an unruptured posterior communicating artery aneurysm clipping is illustrated in a video. The described operating procedure allowed continuous monitoring of the accuracy of patient registration with neuronavigation data and assisted in the performance of tailored surgical approaches and optimal clipping with minimized exposition. Augmented reality may add to the performance of a minimally invasive approach, although further studies need to be performed to evaluate whether certain groups of aneurysms are more likely to benefit from it. Further technological development is required to improve its user friendliness.
Learning Application of Astronomy Based Augmented Reality using Android Platform
NASA Astrophysics Data System (ADS)
Maleke, B.; Paseru, D.; Padang, R.
2018-02-01
Astronomy is a branch of science involving observations of celestial bodies such as stars, planets, nebulae, comets, star clusters, and galaxies, as well as natural phenomena occurring outside the Earth's atmosphere. Astronomy can be learned in several ways, for example from books or through direct observation with a telescope, but both have shortcomings: books present the material only as 2D drawings, while a telescope is fairly expensive equipment to acquire. This study presents a more engaging way of learning, namely an Augmented Reality (AR) application using the Android platform. Augmented Reality is a computer-generated combination of the virtual world and the real world. Virtual objects can be text, animations, 3D models or videos that are combined with the actual environment so that the user feels the virtual object is present in his or her surroundings. By using the Android platform, the application makes learning more engaging because it can be used on a wide range of Android smartphones, so learning can take place anytime and anywhere. The development methodology is the Multimedia Life Cycle, with the C# language for AR programming and flowcharts as the modelling tool. Tests with several users indicate that the application runs well and can be used as a more engaging alternative way of learning astronomy.
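The application above was written in C# with a mobile AR toolkit; as a language-neutral, hedged illustration of the marker-detection and pose-estimation step that underlies this kind of AR, the sketch below uses OpenCV's ArUco module (OpenCV 4.7+ API) with an assumed marker size and camera calibration. It is a stand-in rather than the authors' toolchain.

```python
# Illustrative marker detection and pose estimation with OpenCV's ArUco
# module (OpenCV >= 4.7 API assumed). Camera matrix, distortion coefficients
# and marker size are placeholder values, not those of the actual app.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)
MARKER_SIZE_M = 0.05  # 5 cm printed marker (assumed)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def marker_poses(frame_bgr):
    """Return {marker_id: (rvec, tvec)} for every marker found in the frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    poses = {}
    if ids is None:
        return poses
    half = MARKER_SIZE_M / 2.0
    obj = np.array([[-half, half, 0], [half, half, 0],
                    [half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    for marker_id, c in zip(ids.flatten(), corners):
        ok, rvec, tvec = cv2.solvePnP(obj, c.reshape(4, 2), K, DIST)
        if ok:
            poses[int(marker_id)] = (rvec, tvec)  # pose used to anchor the 3D model
    return poses
```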
Implementation of augmented reality to model Sultan Deli
NASA Astrophysics Data System (ADS)
Syahputra, M. F.; Lumbantobing, N. P.; Siregar, B.; Rahmat, R. F.; Andayani, U.
2018-03-01
Augmented reality is a technology that can provide visualization in the form of a 3D virtual model. By combining augmented reality with image-based modeling, a photograph of the Sultan of Deli can be restored as a three-dimensional model of the Sultan at Istana Maimun (Maimun Palace). This is needed because the Sultan of Deli, one of the important figures in the history of the development of the city of Medan, is little known by the public: the surviving images of the Deli Sultanate are unclear and very old. To achieve this goal, an augmented reality application is used, with an image-processing pipeline that turns images into 3D models through several toolkits. The output of this method is a visitor's photograph of Maimun Palace augmented with a 3D model of the Sultan of Deli, using marker detection at a distance of 20-60 cm, making it easy for the public to recognize the Sultan of Deli who once ruled in Maimun Palace.
The development, assessment and validation of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Marshall, Karen Benn
1996-01-01
This research project seeks to meet the objective of science training by developing, assessing, validating and utilizing VR as a human anatomy training medium. Current anatomy instruction is primarily in the form of lectures and usage of textbooks. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the one-dimensional depiction found in textbooks and the two-dimensional depiction found on the computer. Virtual reality allows one to step through the computer screen into a 3-D artificial world. The primary objective of this project is to produce a virtual reality application of the abdominopelvic region of a human cadaver that can be taken back to the classroom. The hypothesis is that an immersive learning environment affords quicker anatomic recognition and orientation and a greater level of retention in human anatomy instruction. The goal is to augment, not replace, traditional modes of instruction.
An augmented reality system validation for the treatment of cockroach phobia.
Bretón-López, Juani; Quero, Soledad; Botella, Cristina; García-Palacios, Azucena; Baños, Rosa Maria; Alcañiz, Mariano
2010-12-01
Augmented reality (AR) is a new technology in which various virtual elements are incorporated into the user's perception of the real world. The most significant aspect of AR is that the virtual elements add relevant and helpful information to the real scene. AR shares some important characteristics with virtual reality as applied in clinical psychology. However, AR offers additional features that might be crucial for treating certain problems. An AR system designed to treat insect phobia has been used for treating phobia of small animals, and positive preliminary data about the global efficacy of the system have been obtained. However, it is necessary to determine the capacity of similar AR systems and their elements that are designed to evoke anxiety in participants; this is achieved by testing the correspondence between the inclusion of feared stimuli and the induction of anxiety. The objective of the present work is to validate whether the stimuli included in the AR-Insect Phobia system are capable of inducing anxiety in six participants diagnosed with cockroach phobia. Results support the adequacy of each element of the system in inducing anxiety in all participants.
Augmented Virtual Reality Laboratory
NASA Technical Reports Server (NTRS)
Tully-Hanson, Benjamin
2015-01-01
Real-time motion tracking hardware has, until recently, been largely cost-prohibitive for regular research use. With the release of the Microsoft Kinect in November 2010, researchers now have access to a device that, for a few hundred dollars, is capable of providing red-green-blue (RGB), depth, and skeleton data. It is also capable of tracking multiple people in real time. For its original intended purpose, i.e. gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real-world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.
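The report above lists RGB, depth, and skeleton streams but not how the depth data are consumed; as a hedged illustration, the sketch below back-projects a depth frame into a 3D point cloud with the pinhole camera model X = (u - cx)·Z/fx, Y = (v - cy)·Z/fy. The intrinsics are nominal Kinect v1 depth-camera values (an assumption) and the frame is synthetic rather than read from hardware.

```python
# Back-project a depth frame into a 3D point cloud using the pinhole camera
# model. Intrinsics below are nominal Kinect v1 depth-camera values (assumed),
# and the depth frame is synthetic rather than read from a real sensor.
import numpy as np

FX, FY = 594.2, 591.0       # focal lengths in pixels (assumed)
CX, CY = 339.3, 242.7       # principal point in pixels (assumed)

def depth_to_points(depth_m):
    """depth_m: HxW array of depths in meters (0 = no reading). Returns Nx3 points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop invalid pixels

if __name__ == "__main__":
    fake_depth = np.full((480, 640), 2.0)    # a flat wall 2 m away
    cloud = depth_to_points(fake_depth)
    print(cloud.shape)                        # (307200, 3)
```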
Botella, Cristina; Pérez-Ara, M. Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María
2016-01-01
Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: In vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. “One-session treatment” guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants’ expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants from in vivo exposure condition considered the treatment more useful for their problem whereas participants from Augmented Reality exposure considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and well accepted by the participants. PMID:26886423
de Bruin, E D; Schoene, D; Pichierri, G; Smith, S T
2010-08-01
Virtual augmented exercise, an emerging technology that can help to promote physical activity and combine the strengths of indoor and outdoor exercise, has recently been proposed as having the potential to increase exercise behavior in older adults. By creating a strong presence in a virtual, interactive environment, distraction can be taken to greater levels while maintaining the benefits of indoor exercises which may result in a shift from negative to positive thoughts about exercise. Recent findings on young participants show that virtual reality training enhances mood, thus, increasing enjoyment and energy. For older adults virtual, interactive environments can influence postural control and fall events by stimulating the sensory cues that are responsible in maintaining balance and orientation. However, the potential of virtual reality training has yet to be explored for older adults. This manuscript describes the potential of dance pad training protocols in the elderly and reports on the theoretical rationale of combining physical game-like exercises with sensory and cognitive challenges in a virtual environment.
Virtual and augmented reality in the treatment of phantom limb pain: A literature review.
Dunn, Justin; Yeo, Elizabeth; Moghaddampour, Parisah; Chau, Brian; Humbert, Sarah
2017-01-01
Phantom limb pain (PLP), the perception of discomfort in a limb no longer present, commonly occurs following amputation. A variety of interventions have been employed for PLP, including mirror therapy. Virtual Reality (VR) and augmented reality (AR) mirror therapy treatments have also been utilized and have the potential to provide an even greater immersive experience for the amputee. However, there is not currently a consensus on the efficacy of VR and AR therapy. The aim of this review is to evaluate and summarize the current research on the effect of immersive VR and AR in the treatment of PLP. A comprehensive literature search was conducted utilizing PubMed and Google Scholar in order to collect all available studies concerning the use of VR and/or AR in the treatment of PLP using the search terms "virtual reality," "augmented reality," and "phantom limb pain." Eight studies in total were evaluated, with six of those reporting quantitative data and the other two reporting qualitative findings. All studies located were of low-level evidence. Each noted improved pain with VR and AR treatment for phantom limb pain, through quantitative or qualitative reporting. Additionally, adverse effects were limited only to simulator sickness occurring in one trial for one patient. Despite the positive findings, all of the studies were confined purely to case studies and case report series. No studies of higher evidence have been conducted, thus considerably limiting the strength of the findings. As such, the current use of VR and AR for PLP management, while attractive due to the increasing levels of immersion, customizable environments, and decreasing cost, is yet to be fully proven and continues to need further research with higher quality studies to fully explore its benefits.
Telemedicine with mobile devices and augmented reality for early postoperative care.
Ponce, Brent A; Brabston, Eugene W; Shin Zu; Watson, Shawna L; Baker, Dustin; Winn, Dennis; Guthrie, Barton L; Shenai, Mahesh B
2016-08-01
Advanced features are being added to telemedicine paradigms to enhance usability and usefulness. Virtual Interactive Presence (VIP) is a technology that allows a surgeon and patient to interact in a "merged reality" space, facilitating verbal, visual, and manual interaction. In this clinical study, a mobile VIP iOS application was introduced into routine post-operative orthopedic and neurosurgical care. Survey responses endorse the usefulness of this tool: the virtual interaction provides needed virtual follow-up in instances where in-person follow-up may be limited, and enhances the subjective patient experience.
Use of display technologies for augmented reality enhancement
NASA Astrophysics Data System (ADS)
Harding, Kevin
2016-06-01
Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book or taped to a machine to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the work space to 3D displays and head-mounted gear. This paper will review where some of these tools are today and some of the pros and cons being considered for the future worker environment.
Virtual reality and planetary exploration
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1992-01-01
NASA-Ames is intensively developing virtual-reality (VR) capabilities that can extend and augment computer-generated and remote spatial environments. VR is envisioned not only as a basis for improving human/machine interactions involved in planetary exploration, but also as a medium for the more widespread sharing of the experience of exploration, thereby broadening the support base for lunar and planetary-exploration endeavors. Imagery representative of Mars is being gathered for VR presentation at such terrestrial sites as Antarctica and Death Valley.
Augmented Reality versus Virtual Reality for 3D Object Manipulation.
Krichenbauer, Max; Yamamoto, Goshiro; Taketom, Takafumi; Sandor, Christian; Kato, Hirokazu
2018-02-01
Virtual Reality (VR) Head-Mounted Displays (HMDs) are on the verge of becoming commodity hardware available to the average user and feasible to use as a tool for 3D work. Some HMDs include front-facing cameras, enabling Augmented Reality (AR) functionality. Apart from avoiding collisions with the environment, interaction with virtual objects may also be affected by seeing the real environment. However, whether these effects are positive or negative has not yet been studied extensively. For most tasks it is unknown whether AR has any advantage over VR. In this work we present the results of a user study in which we compared user performance, measured as task completion time, on a 9-degrees-of-freedom object selection and transformation task performed either in AR or VR, both with a 3D input device and a mouse. Our results show faster task completion time in AR over VR. When using a 3D input device, a purely VR environment increased task completion time by 22.5 percent on average compared to AR. Surprisingly, a similar effect occurred when using a mouse: users were about 17.3 percent slower in VR than in AR. Mouse and 3D input device produced similar task completion times in each condition (AR or VR), respectively. We further found no differences in reported comfort.
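The statistical values in the abstract's parentheses were lost in extraction and are not reconstructed here; as a hedged illustration of how such a within-subjects AR-versus-VR comparison is typically analyzed, the sketch below computes the mean percent slowdown and a paired t-test over made-up per-participant completion times.

```python
# Hypothetical within-subjects comparison of task completion times (seconds)
# in AR vs. VR. The numbers are made up for illustration only.
import numpy as np
from scipy import stats

ar_times = np.array([41.2, 38.5, 45.0, 50.3, 36.8, 44.1, 39.9, 47.2])
vr_times = np.array([50.7, 46.9, 55.2, 60.1, 44.8, 53.0, 49.5, 58.3])

slowdown_pct = 100.0 * (vr_times.mean() - ar_times.mean()) / ar_times.mean()
t_stat, p_value = stats.ttest_rel(vr_times, ar_times)   # paired: same participants

print(f"VR slower than AR by {slowdown_pct:.1f}% on average")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```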
White Paper for Virtual Control Room
NASA Technical Reports Server (NTRS)
Little, William; Tully-Hanson, Benjamin
2015-01-01
The Virtual Control Room (VCR) Proof of Concept (PoC) project is the result of an award given by the Fourth Annual NASA T&I Labs Challenge Project Call. This paper will outline the work done over the award period to build and enhance the capabilities of the Augmented/Virtual Reality (AVR) Lab at NASA's Kennedy Space Center (KSC) to create the VCR.
Virtual reality and robotics for stroke rehabilitation: where do we go from here?
Wade, Eric; Winstein, Carolee J
2011-01-01
Promoting functional recovery after stroke requires collaborative and innovative approaches to neurorehabilitation research. Task-oriented training (TOT) approaches that include challenging, adaptable, and meaningful activities have led to successful outcomes in several large-scale multisite definitive trials. This, along with recent technological advances of virtual reality and robotics, provides a fertile environment for furthering clinical research in neurorehabilitation. Both virtual reality and robotics make use of multimodal sensory interfaces to affect human behavior. In the therapeutic setting, these systems can be used to quantitatively monitor, manipulate, and augment the users' interaction with their environment, with the goal of promoting functional recovery. This article describes recent advances in virtual reality and robotics and the synergy with best clinical practice. Additionally, we describe the promise shown for automated assessments and in-home activity-based interventions. Finally, we propose a broader approach to ensuring that technology-based assessment and intervention complement evidence-based practice and maintain a patient-centered perspective.
A Collaborative Augmented Campus Based on Location-Aware Mobile Technology
ERIC Educational Resources Information Center
De Lucia, A.; Francese, R.; Passero, I.; Tortora, G.
2012-01-01
Mobile devices are changing the way people work and communicate. Most of the innovative devices offer the opportunity to integrate augmented reality in mobile applications, permitting the combination of the real world with virtual information. This feature can be particularly useful to enhance informal and formal didactic actions based on student…
Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.
Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico
2017-01-01
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" and the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all three tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.
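The paper does not spell out its force-rendering equations; a common approach in wearable fingertip haptics, sketched below with assumed constants, is penalty-based rendering: the commanded force grows linearly with the fingertip's penetration into the virtual surface, plus an equal share of the grasped virtual object's weight. This is a generic model, not the authors' controller.

```python
# Generic penalty-based force rendering for a fingertip haptic device:
# contact force proportional to penetration depth into a virtual surface,
# plus an equal share of the grasped object's weight. Constants are assumed.
K_CONTACT = 400.0   # N/m, virtual surface stiffness (assumed)
G = 9.81            # m/s^2

def fingertip_force(penetration_m, object_mass_kg=0.0, n_fingers_in_grasp=2):
    """Return the force magnitude (N) to command to one fingertip actuator."""
    contact = K_CONTACT * max(penetration_m, 0.0)          # spring-like contact term
    weight_share = (object_mass_kg * G / n_fingers_in_grasp
                    if object_mass_kg > 0.0 and penetration_m > 0.0 else 0.0)
    return contact + weight_share

if __name__ == "__main__":
    # 2 mm penetration while holding a 150 g virtual object with two fingers
    print(f"{fingertip_force(0.002, object_mass_kg=0.15):.2f} N")
```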
Advanced helmet mounted display (AHMD)
NASA Astrophysics Data System (ADS)
Sisodia, Ashok; Bayer, Michael; Townley-Smith, Paul; Nash, Brian; Little, Jay; Cassarly, William; Gupta, Anurag
2007-04-01
Due to significantly increased U.S. military involvement in deterrent, observer, security, peacekeeping and combat roles around the world, the military expects significant future growth in the demand for deployable virtual reality trainers with networked simulation capability of the battle space visualization process. The use of HMD technology in simulated virtual environments has been initiated by the demand for more effective training tools. The AHMD overlays computer-generated data (symbology, synthetic imagery, enhanced imagery) augmented with actual and simulated visible environment. The AHMD can be used to support deployable reconfigurable training solutions as well as traditional simulation requirements, UAV augmented reality, air traffic control and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications. This paper will describe the design improvements implemented for production of the AHMD System.
Wearable computer for mobile augmented-reality-based controlling of an intelligent robot
NASA Astrophysics Data System (ADS)
Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino
2000-10-01
An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans will be needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal- the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.
Optical augmented reality assisted navigation system for neurosurgery teaching and planning
NASA Astrophysics Data System (ADS)
Wu, Hui-Qun; Geng, Xing-Yun; Wang, Li; Zhang, Yuan-Peng; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng
2013-07-01
This paper proposes a convenient navigation system for neurosurgeons' pre-operative planning and teaching based on the augmented reality (AR) technique, which maps three-dimensional reconstructed virtual anatomical structures onto a skull model. The system includes two parts, a virtual reality system and a skull model scene. In our experiment, a 73-year-old right-handed man initially diagnosed with astrocytoma was selected as an example to verify our system. His imaging data from different modalities were registered, and the skull soft tissue, brain, internal vessels and tumor were reconstructed. The reconstructed models were then overlaid on the real scene. Our findings showed that the reconstructed tissues were augmented into the real scene and the registration results were in good alignment. The reconstructed brain tissue was well distributed in the skull cavity. A probe was used by a neurosurgeon to explore a surgical pathway that could reach the tumor directly without injuring important vessels. In this way, the learning cost for students and the effort of educating patients about surgical risks are reduced. Therefore, this system could be a selective protocol for image-guided surgery (IGS), and it is promising for neurosurgeons' pre-operative planning and teaching.
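The paper does not detail its registration algorithm; a standard choice when corresponding landmarks are available both in the image volume and on the physical skull model is rigid point-based registration via SVD (the Kabsch/Horn solution), sketched below with hypothetical fiducial coordinates.

```python
# Rigid point-based registration (Kabsch/Horn): find rotation R and
# translation t minimizing ||R @ src + t - dst|| over paired landmarks,
# e.g. fiducials picked in the CT/MRI volume and touched on the skull model
# with a tracked probe. Landmark values below are hypothetical.
import numpy as np

def rigid_register(src, dst):
    """src, dst: Nx3 arrays of corresponding points. Returns (R, t)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

if __name__ == "__main__":
    image_fiducials = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0],
                                [0.0, 60.0, 0.0], [0.0, 0.0, 40.0]])
    true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    probe_fiducials = image_fiducials @ true_R.T + np.array([10.0, 5.0, -2.0])
    R, t = rigid_register(image_fiducials, probe_fiducials)
    fre = np.linalg.norm(image_fiducials @ R.T + t - probe_fiducials, axis=1).mean()
    print(f"mean fiducial registration error: {fre:.6f} mm")
```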
Deutsch, Judith E
2009-01-01
Improving walking for individuals with musculoskeletal and neuromuscular conditions is an important aspect of rehabilitation. The capabilities of clinicians who address these rehabilitation issues could be augmented with innovations such as virtual reality gaming based technologies. The chapter provides an overview of virtual reality gaming based technologies currently being developed and tested to improve motor and cognitive elements required for ambulation and mobility in different patient populations. Included as well is a detailed description of a single VR system, consisting of the rationale for development and iterative refinement of the system based on clinical science. These concepts include: neural plasticity, part-task training, whole task training, task specific training, principles of exercise and motor learning, sensorimotor integration, and visual spatial processing.
Perform light and optic experiments in Augmented Reality
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan; Javahiraly, Nicolas; Israel, Kai
2015-10-01
In many scientific studies lens experiments are part of the curriculum. The conducted experiments are meant to give the students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures and different lens types. Therefore it is not possible for the students to conduct any of the experiments outside of the university's laboratory. Simple optical software simulators enabling the students to virtually perform lens experiments already exist, but are mostly desktop or web-browser based. Augmented Reality (AR) is a special case of mediated and mixed reality concepts, where computers are used to add, subtract or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can easily be used to visualize a simulated optical bench. The students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index and the positions of the instruments in space. Light rays can be visualized and promote an additional understanding of the laws of optics. An AR application like this is ideally suited to prepare the actual laboratory sessions and/or recap the teaching content. The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
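A virtual optical bench of the kind described above ultimately evaluates textbook relations such as the lensmaker's and thin-lens equations; the short sketch below, a simplification rather than the authors' simulator, computes a lens's focal length from its curvatures and refractive index and then the image distance and magnification for a chosen object distance.

```python
# Thin-lens calculations behind a simulated optical bench:
#   lensmaker:  1/f = (n - 1) * (1/R1 - 1/R2)
#   imaging:    1/f = 1/s_o + 1/s_i,  magnification m = -s_i / s_o
# Sign convention: R1 > 0 for a surface convex toward the object, R2 < 0
# for a biconvex lens. All values below are illustrative.
def focal_length(n, r1_m, r2_m):
    return 1.0 / ((n - 1.0) * (1.0 / r1_m - 1.0 / r2_m))

def image_distance(f_m, s_o_m):
    """Returns s_i; positive means a real image on the far side of the lens."""
    return 1.0 / (1.0 / f_m - 1.0 / s_o_m)

if __name__ == "__main__":
    f = focal_length(n=1.5, r1_m=0.10, r2_m=-0.10)   # symmetric biconvex lens, f = 0.1 m
    s_o = 0.30                                        # object 30 cm in front of the lens
    s_i = image_distance(f, s_o)
    print(f"f = {f*100:.1f} cm, image at {s_i*100:.1f} cm, magnification {-s_i/s_o:.2f}")
```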
The assessment of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Benn, Karen P.
1994-01-01
This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three dimensional, unlike the one dimensional depiction found in textbooks and the two dimensional depiction found on the computer. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.
3D interactive augmented reality-enhanced digital learning systems for mobile devices
NASA Astrophysics Data System (ADS)
Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie
2013-03-01
With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology, and enhancement of UX can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems. However, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components, the markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX in digital learning can be greatly improved with the adoption of the proposed IARL systems.
Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges
NASA Astrophysics Data System (ADS)
Cherukuru, N. W.; Calhoun, R.
2016-06-01
Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find applications in aviation safety and atmospheric data visualization, as well as in weather education and public outreach. As a proof of concept, we used lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.
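The abstract does not describe the data pipeline; to illustrate the minimum computation needed to anchor lidar data in an AR view, the sketch below converts the range gates of a single beam (range, azimuth, elevation) into east-north-up coordinates relative to the lidar, to which radial-velocity values can then be attached as colored glyphs. The angle conventions and field values are assumptions.

```python
# Convert Doppler lidar range gates (range, azimuth, elevation) into
# lidar-relative 3D coordinates for AR overlay. Azimuth is measured
# clockwise from north and elevation up from the horizon (assumed
# conventions); output is in an east-north-up frame.
import numpy as np

def gates_to_enu(range_m, azimuth_deg, elevation_deg):
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    east = range_m * np.cos(el) * np.sin(az)
    north = range_m * np.cos(el) * np.cos(az)
    up = range_m * np.sin(el)
    return np.stack([east, north, up], axis=-1)

if __name__ == "__main__":
    ranges = np.arange(100.0, 2000.0, 50.0)                      # one beam's range gates (m)
    radial_velocity = np.random.uniform(-10, 10, ranges.size)    # placeholder values, m/s
    positions = gates_to_enu(ranges, azimuth_deg=45.0, elevation_deg=3.0)
    # Each (position, radial_velocity) pair can now be rendered as a colored
    # glyph anchored at the corresponding real-world location in the AR view.
    print(positions[:3])
```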
Virtually Nursing: Emerging Technologies in Nursing Education.
Foronda, Cynthia L; Alfes, Celeste M; Dev, Parvati; Kleinheksel, A J; Nelson, Douglas A; OʼDonnell, John M; Samosky, Joseph T
Augmented reality and virtual simulation technologies in nursing education are burgeoning. Preliminary evidence suggests that these innovative pedagogical approaches are effective. The aim of this article is to present 6 newly emerged products and systems that may improve nursing education. Technologies may present opportunities to improve teaching efforts, better engage students, and transform nursing education.
Embodied information behavior, mixed reality and big data
NASA Astrophysics Data System (ADS)
West, Ruth; Parola, Max J.; Jaycen, Amelia R.; Lueg, Christopher P.
2015-03-01
A renaissance in the development of virtual (VR), augmented (AR), and mixed reality (MR) technologies with a focus on consumer and industrial applications is underway. As data becomes ubiquitous in our lives, a need arises to revisit the role of our bodies, explicitly in relation to data or information. Our observation is that VR/AR/MR technology development is a vision of the future framed in terms of promissory narratives. These narratives develop alongside the underlying enabling technologies and create new use contexts for virtual experiences. It is a vision rooted in the combination of responsive, interactive, dynamic, sharable data streams, and augmentation of the physical senses for capabilities beyond those normally humanly possible. In parallel to the varied definitions of information and approaches to elucidating information behavior, a myriad of definitions and methods of measuring and understanding presence in virtual experiences exist. These and other ideas will be tested by designers, developers and technology adopters as the broader ecology of head-worn devices for virtual experiences evolves in order to reap the full potential and benefits of these emerging technologies.
Borrel, Alexandre; Fourches, Denis
2017-12-01
There is a growing interest for the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education purposes. However, preparing 3D models ready to use for AR and VR is time-consuming and requires a technical expertise that severely limits the development of new contents of potential interest for structural biologists, medicinal chemists, molecular modellers and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high quality 3D models directly compatible for AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful to universally call and anchor that particular 3D model when used in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR contents for bioinformatics and cheminformatics applications. http://www.realityconvert.com. dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
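RealityConvert's own pipeline is not reproduced here; as a hedged sketch of the first step such a tool automates, the code below uses RDKit to embed 3D coordinates for a SMILES string and writes a plain .xyz file, which a converter would subsequently turn into an AR/VR-ready mesh format such as OBJ or glTF together with an image tracker.

```python
# Sketch of the first stage a tool like RealityConvert automates: generate
# 3D coordinates for a small molecule and dump them to a simple .xyz file.
# A real pipeline would go on to build a sphere/stick mesh and export an
# AR/VR-friendly format such as OBJ or glTF. Requires RDKit.
from rdkit import Chem
from rdkit.Chem import AllChem

def smiles_to_xyz(smiles, out_path):
    mol = Chem.MolFromSmiles(smiles)
    mol = Chem.AddHs(mol)
    AllChem.EmbedMolecule(mol, randomSeed=42)   # 3D embedding
    AllChem.MMFFOptimizeMolecule(mol)           # quick force-field cleanup
    conf = mol.GetConformer()
    with open(out_path, "w") as fh:
        fh.write(f"{mol.GetNumAtoms()}\n{smiles}\n")
        for atom in mol.GetAtoms():
            p = conf.GetAtomPosition(atom.GetIdx())
            fh.write(f"{atom.GetSymbol()} {p.x:.4f} {p.y:.4f} {p.z:.4f}\n")

if __name__ == "__main__":
    smiles_to_xyz("CN1C=NC2=C1C(=O)N(C(=O)N2C)C", "caffeine.xyz")  # caffeine
```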
Real-time 3D image reconstruction guidance in liver resection surgery
Nicolau, Stephane; Pessaux, Patrick; Mutter, Didier; Marescaux, Jacques
2014-01-01
Background: Minimally invasive surgery represents one of the main evolutions of surgical techniques. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. Methods: From a patient's medical image [US, computed tomography (CT) or MRI], we have developed an Augmented Reality (AR) system that increases the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We have then developed several 3D visualization and surgical planning software tools to combine direct volume rendering and surface rendering. Finally, we have developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. Results: From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were carried out, demonstrating the accuracy of 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures were performed, illustrating the potential clinical benefit of such assistance in terms of safety, but also the current limits that automatic augmented reality will have to overcome. Conclusions: Virtual patient modeling should be mandatory for certain interventions that now need to be defined, such as liver surgery. Augmented reality is clearly the next step in surgical instrumentation, but it remains limited at present by the complexity of organ deformations during surgery. Intraoperative medical imaging used in the new generation of automated augmented reality should solve this issue thanks to the development of the hybrid OR. PMID:24812598
A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.
Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis
2018-03-01
Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
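The paper derives its illumination parameters from a skylight model and live sensor data; the sketch below reproduces only the Sun-position part in a common low-precision approximation (declination from day of year, hour angle from local solar time), ignoring the equation of time and atmospheric refraction, so it is an illustration rather than the authors' method.

```python
# Low-precision solar position: elevation and azimuth from latitude,
# day of year and local solar time. Ignores the equation of time and
# atmospheric refraction, so expect errors on the order of a degree.
import math

def sun_position(latitude_deg, day_of_year, solar_time_h):
    lat = math.radians(latitude_deg)
    # Solar declination (Cooper's approximation)
    decl = math.radians(23.45) * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))
    hour_angle = math.radians(15.0 * (solar_time_h - 12.0))   # negative before solar noon
    elevation = math.asin(math.sin(decl) * math.sin(lat)
                          + math.cos(decl) * math.cos(lat) * math.cos(hour_angle))
    azimuth = math.acos((math.sin(decl) - math.sin(elevation) * math.sin(lat))
                        / (math.cos(elevation) * math.cos(lat)))
    if hour_angle > 0:                       # afternoon: mirror the azimuth to the west
        azimuth = 2.0 * math.pi - azimuth
    return math.degrees(elevation), math.degrees(azimuth)   # azimuth clockwise from north

if __name__ == "__main__":
    el, az = sun_position(latitude_deg=41.6, day_of_year=172, solar_time_h=15.0)
    print(f"elevation {el:.1f} deg, azimuth {az:.1f} deg")
```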
Augmented Reality to Preserve Hidden Vestiges in Historical Cities. a Case Study
NASA Astrophysics Data System (ADS)
Martínez, J. L.; Álvareza, S.; Finat, J.; Delgado, F. J.; Finat, J.
2015-02-01
Mobile devices provide increasingly sophisticated support for enhanced experiences and for understanding the remote past in an interactive way. The use of augmented reality technologies allows the development of mobile applications for indoor exploration of virtually reconstructed archaeological places. In our work we have built a virtual reconstruction of a Roman villa based on data from an urgent partial excavation that was performed ahead of the construction of a car park in the historical city of Valladolid (Spain). In its current state, the archaeological site is covered by an urban garden. Localization and tracking are performed using a combination of GPS and the inertial sensors of the mobile device. In this work we show how to perform interactive navigation around the 3D virtual model, presenting an interpretation of how the site once looked. The user experience is enhanced by answering simple questions and performing minor tasks and puzzles, which are presented with multimedia content linked to key features of the archaeological site.
Augmented reality visualization of deformable tubular structures for surgical simulation.
Ferrari, Vincenzo; Viglialoro, Rosanna Maria; Nicoli, Paola; Cutolo, Fabrizio; Condino, Sara; Carbone, Marina; Siesto, Mentore; Ferrari, Mauro
2016-06-01
Surgical simulation based on augmented reality (AR), mixing the benefits of physical and virtual simulation, represents a step forward in surgical training. However, available systems are unable to update the virtual anatomy following deformations impressed on actual anatomy. A proof-of-concept solution is described providing AR visualization of hidden deformable tubular structures using nitinol tubes sensorized with electromagnetic sensors. This system was tested in vitro on a setup comprised of sensorized cystic, left and right hepatic, and proper hepatic arteries. In the trial session, the surgeon deformed the tubular structures with surgical forceps in 10 positions. The mean, standard deviation, and maximum misalignment between virtual and real arteries were 0.35, 0.22, and 0.99 mm, respectively. The alignment accuracy obtained demonstrates the feasibility of the approach, which can be adopted in advanced AR simulations, in particular as an aid to the identification and isolation of tubular structures. Copyright © 2015 John Wiley & Sons, Ltd.
Applied virtual reality at the Research Triangle Institute
NASA Technical Reports Server (NTRS)
Montoya, R. Jorge
1994-01-01
Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric data bases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object data base is created using data captured by hand or electronically. The object's realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the data base using software tools which also support interactions with and immersivity in the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk through the Dresden Frauenkirche, and the maintenance training simulator for the National Guard.
Immersive Collaboration Simulations: Multi-User Virtual Environments and Augmented Realities
NASA Technical Reports Server (NTRS)
Dede, Chris
2008-01-01
Emerging information technologies are driving shifts in the knowledge and skills society values, the development of new methods of teaching and learning, and changes in the characteristics of learning.
Augmenting the access grid using augmented reality
NASA Astrophysics Data System (ADS)
Li, Ying
2012-01-01
The Access Grid (AG) targets an advanced collaboration environment with which multi-party groups of people at remote sites can collaborate over high-performance networks. However, the current AG still employs VIC (Video Conferencing Tool) to offer only pure video for remote communication, while most AG users expect to collaboratively reference and manipulate the 3D geometric models of grid services' results within the live video of an AG session. Augmented Reality (AR) techniques can overcome these deficiencies through their characteristics of combining the virtual and the real, real-time interaction, and 3D registration, so it is worthwhile for the AG to utilize AR to better support the advanced collaboration environment. This paper introduces an effort to augment the AG by adding support for AR capability, encapsulated in the node service infrastructure as the Augmented Reality Service (ARS). The ARS can merge the 3D geometric models of grid services' results and the real video scene of the AG into one AR environment, and it provides the opportunity for distributed AG users to participate interactively and collaboratively in the AR environment with a better experience.
Visual Stability of Objects and Environments Viewed through Head-Mounted Displays
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Adelstein, Bernard D.
2015-01-01
Virtual Environments (aka Virtual Reality) are again catching the public imagination, and a number of startups (e.g. Oculus) and even not-so-startup companies (e.g. Microsoft) are trying to develop display systems to capitalize on this renewed interest. All acknowledge that this time they will get it right by providing the required dynamic fidelity, visual quality, and interesting content for the concept of VR to take off and change the world in ways it failed to do in past incarnations. The surprisingly long historical background of the direct-simulation technology that underlies virtual environment and augmented reality displays will be briefly reviewed. An example of a mid-1990s augmented reality display system with good dynamic performance from our lab will be used to illustrate some of the underlying phenomena and technology concerning the visual stability of virtual environments and objects during movement. In conclusion, some idealized performance characteristics for a reference system will be proposed. Interestingly, many systems more or less on the market now may actually meet many of these proposed technical requirements. This observation leads to the conclusion that the current success of the IT firms trying to commercialize the technology will depend on the hidden costs of using the systems as well as the development of interesting and compelling content.
Stereoscopic augmented reality with pseudo-realistic global illumination effects
NASA Astrophysics Data System (ADS)
de Sorbier, Francois; Saito, Hideo
2014-03-01
Recently, augmented reality has become very popular and has appeared in our daily life through gaming, guidance systems and mobile phone applications. However, inserting objects in such a way that their appearance seems natural is still an issue, especially in an unknown environment. This paper presents a framework that demonstrates the capabilities of the Kinect for convincing augmented reality in an unknown environment. Rather than pre-computing a reconstruction of the scene, as proposed by most previous methods, we propose a dynamic capture of the scene that adapts to live changes in the environment. Our approach, based on the update of an environment map, can also detect the positions of the light sources. Combining information from the environment map, the light sources and the camera tracking, we can display virtual objects on stereoscopic devices with global illumination effects such as diffuse and mirror reflections, refractions and shadows in real time.
Augmented reality social story for autism spectrum disorder
NASA Astrophysics Data System (ADS)
Syahputra, M. F.; Arisandi, D.; Lumbanbatu, A. F.; Kemit, L. F.; Nababan, E. B.; Sheta, O.
2018-03-01
Augmented Reality is a technique that can bring social story therapy into the virtual world to increase the intrinsic motivation of children with Autism Spectrum Disorder (ASD). Children with ASD often find it difficult to maintain focus, have sensory and motor difficulties when using their hands or other parts of the body to handle objects, and struggle to interpret and understand social situations in order to respond appropriately, which makes carrying out the right activities distressing for them. A method is therefore required to apply social stories to the therapy of children with ASD, implemented with Augmented Reality. The output of this method is a 3D (three-dimensional) animation of a social story, triggered by detecting markers located in a special book, together with some simple games played using a Leap Motion controller, which reads hand movements in real time.
ChemPreview: an augmented reality-based molecular interface.
Zheng, Min; Waller, Mark P
2017-05-01
Human-computer interfaces make computational science more comprehensible and impactful. Complex 3D structures such as proteins or DNA are magnified by digital representations and displayed on two-dimensional monitors. Augmented reality has recently opened another door to access the virtual three-dimensional world. Herein, we present an augmented reality application called ChemPreview with the potential to manipulate bio-molecular structures at an atomistic level. ChemPreview is available at https://github.com/wallerlab/chem-preview/releases, and is built on top of the Meta 1 platform https://www.metavision.com/. ChemPreview can be used to interact with a protein in an intuitive way using natural hand gestures, thereby making it appealing to computational chemists or structural biologists. The ability to manipulate atoms in the real world could eventually provide new and more efficient ways of extracting structural knowledge, or of designing new molecules in silico. Copyright © 2017 Elsevier Inc. All rights reserved.
Virtual reality and physical rehabilitation: a new toy or a new research and rehabilitation tool?
Keshner, Emily A
2004-01-01
Virtual reality (VR) technology is rapidly becoming a popular application for physical rehabilitation and motor control research. But questions remain about whether this technology really extends our ability to influence the nervous system or whether moving within a virtual environment just motivates the individual to perform. I served as guest editor of this month's issue of the Journal of NeuroEngineering and Rehabilitation (JNER) for a group of papers on augmented and virtual reality in rehabilitation. These papers demonstrate a variety of approaches taken for applying VR technology to physical rehabilitation. The papers by Kenyon et al. and Sparto et al. address critical questions about how this technology can be applied to physical rehabilitation and research. The papers by Sveistrup and Viau et al. explore whether action within a virtual environment is equivalent to motor performance within the physical environment. Finally, papers by Riva et al. and Weiss et al. discuss the important characteristics of a virtual environment that will be most effective for obtaining changes in the motor system. PMID:15679943
Augmented Reality, Virtual Reality and Their Effect on Learning Style in the Creative Design Process
ERIC Educational Resources Information Center
Chandrasekera, Tilanka; Yoon, So-Yeon
2018-01-01
Research has shown that user characteristics such as preference for using an interface can result in effective use of the interface. Research has also suggested that there is a relationship between learner preference and creativity. This study uses the VARK learning styles inventory to assess students learning style then explores how this learning…
Riva, Giuseppe; Baños, Rosa M.; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea
2016-01-01
During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual’s worldview. PMID:27746747
Webizing mobile augmented reality content
NASA Astrophysics Data System (ADS)
Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun
2014-01-01
This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.
Fluet, Gerard G; Deutsch, Judith E
2013-03-01
Developments over the past 2 years in virtual reality (VR) augmented sensorimotor rehabilitation of upper limb use and gait post-stroke were reviewed. Studies were included if they evaluated comparative efficacy between VR and standard of care and/or differences in VR delivery methods, and were CEBM (Center for Evidence-Based Medicine) level 2 or higher. Eight upper limb and two gait studies were included and described using the following categories: hardware (input and output), software (virtual task, feedback, and presentation), intervention (progression and dose), and outcomes. Trends in the field were commented on, gaps in knowledge identified, and areas of future research and translation of VR to practice were suggested.
ERIC Educational Resources Information Center
Civelek, Turhan; Ucar, Erdem; Ustunel, Hakan; Aydin, Mehmet Kemal
2014-01-01
The current research aims to explore the effects of a haptic augmented simulation on students' achievement and their attitudes towards Physics in an immersive virtual reality environment (VRE). A quasi-experimental post-test design was employed utilizing experiment and control groups. The participants were 215 students from a K-12 school in…
Virtual surgery in a (tele-)radiology framework.
Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P
1999-09-01
This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.
Bosc, R; Fitoussi, A; Pigneur, F; Tacher, V; Hersant, B; Meningaud, J-P
2017-08-01
Augmented reality on smart glasses allows the surgeon to visualize three-dimensional virtual objects during surgery, superimposed in real time onto the patient's anatomy. This preserves the surgeon's view of the operative field while making additional computerized information available without the need for a physical surgical guide or a remote screen. The three-dimensional objects that we visualized in augmented reality were reconstructed from the patients' CT scans and transferred, through a dedicated application, to stereoscopic smart glasses. Positioning and stabilization of the virtual layers on the patient's anatomy were achieved by having the glasses recognize a tracker placed on the skin. We used this technology, in addition to the usual localization methods, for preoperative planning and selection of perforator vessels in 12 patients who underwent breast reconstruction with a deep inferior epigastric artery perforator flap. The hands-free smart glasses with two stereoscopic screens provide the reconstructive surgeon with binocular visualization, within the operative field, of the vessels identified on CT scan. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
An Augmented Reality Nanomanipulator for Learning Nanophysics: The "NanoLearner" Platform
NASA Astrophysics Data System (ADS)
Marchi, Florence; Marliere, Sylvain; Florens, Jean Loup; Luciani, Annie; Chevrier, Joel
This work focuses on the description and evaluation of an augmented reality nanomanipulator, the "NanoLearner" platform, used as an educational tool in practical work in nanophysics. Through virtual reality combined with multisensory rendering, students are immersed in the nanoworld, where they can interact in real time with a sample surface or an object using senses such as hearing, sight and touch. The role of each sensory rendering in understanding and controlling the "approach-retract" interaction was determined through statistical studies carried out during the practical sessions. Finally, we present two extensions of this innovative tool: investigating nano-scale effects in living organisms and giving the general public access to a natural understanding of nanophenomena.
Kin, Taichi; Nakatomi, Hirofumi; Shono, Naoyuki; Nomura, Seiji; Saito, Toki; Oyama, Hiroshi; Saito, Nobuhito
2017-10-15
Simulation and planning of surgery using a virtual reality model is becoming common with advances in computer technology. In this study, we conducted a literature search to find trends in virtual simulation of surgery for brain tumors. A MEDLINE search for "neurosurgery AND (simulation OR virtual reality)" retrieved a total of 1,298 articles published in the past 10 years. After eliminating studies designed solely for education and training purposes, 28 articles about clinical application remained. The finding that the vast majority of the articles were about education and training rather than clinical applications suggests that several issues need to be addressed before surgical simulation can be applied clinically. In addition, 10 of the 28 articles were from Japanese groups. In general, the 28 articles demonstrated clinical benefits of virtual surgical simulation. Simulation was particularly useful for better understanding complicated spatial relations of anatomical landmarks and for examining surgical approaches. In some studies, virtual reality models were used with either a surgical navigation system or augmented reality technology, which projects virtual reality images onto the operating field. Reported problems were difficulties in standardized, objective evaluation of surgical simulation systems; inability to respond to tissue deformation caused by surgical maneuvers; absence of system functionality to reflect tissue features (e.g., hardness and adhesion); and many problems with image processing. The descriptions of image processing tended to be insufficient, indicating that the level of evidence, risk of bias, precision, and reproducibility need to be addressed for further advances and ultimately for full clinical application.
Open multi-agent control architecture to support virtual-reality-based man-machine interfaces
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel
2001-10-01
Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task-deduction component and automatic action-planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems that are controlled by Virtual Reality based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations which facilitate the user's job of commanding and supervising a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate, in real time, information from sensors at different levels of abstraction helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture is described comprehensively, its main building blocks are discussed, and one realization built on an open-source real-time operating system is presented. The software design and the features of the architecture which make it generally applicable to the distributed control of automation agents in real-world applications are explained. Furthermore, its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is described as one example.
NASA Technical Reports Server (NTRS)
Schulte, Erin
2017-01-01
As augmented and virtual reality grow in popularity, and more researchers focus on their development, other fields of technology have grown in the hope of integrating with the up-and-coming hardware currently on the market. Namely, there has been a focus on how to make an intuitive, hands-free human-computer interaction (HCI) system utilizing AR and VR that allows users to control their technology with little to no physical interaction with hardware. Computer vision, which is utilized in devices such as the Microsoft Kinect, webcams, and other similar hardware, has shown potential in assisting with the development of an HCI system that requires next to no human interaction with computing hardware and software. Object and facial recognition are two subsets of computer vision, both of which can be applied to HCI systems in the fields of medicine, security, industrial development, and other similar areas.
Virtual reality applications for motor rehabilitation after stroke.
Sisto, Sue Ann; Forrest, Gail F; Glendinning, Diana
2002-01-01
Hemiparesis is the primary physical impairment underlying functional disability after stroke. A goal of rehabilitation is to enhance motor skill acquisition, which is a direct result of practice. However, frequency and duration of practice are limited in rehabilitation. Virtual reality (VR) is a computer technology that simulates real-life learning while providing augmented feedback and increased frequency, duration, and intensity of practiced tasks. The rate and extent of relearning of motor tasks could affect the duration, effectiveness, and cost of patient care. The purpose of this article is to review the use of VR training for motor rehabilitation after stroke.
Shema-Shiratzky, Shirley; Brozgol, Marina; Cornejo-Thumm, Pablo; Geva-Dayan, Karen; Rotstein, Michael; Leitner, Yael; Hausdorff, Jeffrey M; Mirelman, Anat
2018-05-17
To examine the feasibility and efficacy of combined motor-cognitive training using virtual reality to enhance behavior, cognitive function and dual-tasking in children with Attention-Deficit/Hyperactivity Disorder (ADHD). Fourteen non-medicated school-aged children with ADHD received 18 training sessions during 6 weeks. Training included walking on a treadmill while negotiating virtual obstacles. Behavioral symptoms, cognition and gait were tested before and after the training and at 6-week follow-up. Based on parental report, there was a significant improvement in children's social problems and psychosomatic behavior after the training. Executive function and memory improved post-training while attention was unchanged. Gait regularity significantly increased during dual-task walking. Long-term training effects were maintained in memory and executive function. Treadmill training augmented with virtual reality is feasible and may be an effective treatment to enhance behavior, cognitive function and dual-tasking in children with ADHD.
Using Augmented Reality and Virtual Environments in Historic Places to Scaffold Historical Empathy
ERIC Educational Resources Information Center
Sweeney, Sara K.; Newbill, Phyllis; Ogle, Todd; Terry, Krista
2018-01-01
The authors explore how 3D visualizations of historical sites can be used as pedagogical tools to support historical empathy. They provide three visualizations created by a team at Virginia Tech as examples. They discuss virtual environments and how the digital restoration process is applied. They also define historical empathy, explain why it is…
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Fisher, J Brian; Porter, Susan M
2002-01-01
This paper describes an application of a display approach which uses chromakey techniques to composite real and computer-generated images allowing a user to see his hands and medical instruments collocated with the display of virtual objects during a medical training simulation. Haptic feedback is provided through the use of a PHANTOM force feedback device in addition to tactile augmentation, which allows the user to touch virtual objects by introducing corresponding real objects in the workspace. A simplified catheter introducer insertion simulation was developed to demonstrate the capabilities of this approach.
Virtual reality in surgery and medicine.
Chinnock, C
1994-01-01
This report documents the state of development of enhanced and virtual reality-based systems in medicine. Virtual reality systems seek to simulate a surgical procedure in a computer-generated world in order to improve training. Enhanced reality systems seek to augment or enhance reality by providing improved imaging alternatives for specific patient data. Virtual reality represents a paradigm shift in the way we teach and evaluate the skills of medical personnel. Driving the development of virtual reality-based simulators is laparoscopic abdominal surgery, where there is a perceived need for better training techniques; within a year, systems will be fielded for second-year residency students. Further refinements over perhaps the next five years should allow surgeons to evaluate and practice new techniques in a simulator before using them on patients. Technical developments are rapidly improving the realism of these machines to an amazing degree, as well as bringing the price down to affordable levels. In the next five years, many new anatomical models, procedures, and skills are likely to become available on simulators. Enhanced reality systems are generally being developed to improve visualization of specific patient data. Three-dimensional (3-D) stereovision systems for endoscopic applications, head-mounted displays, and stereotactic image navigation systems are being fielded now, with neurosurgery and laparoscopic surgery being major driving influences. Over perhaps the next five years, enhanced and virtual reality systems are likely to merge. This will permit patient-specific images to be used on virtual reality simulators or computer-generated landscapes to be input into surgical visualization instruments. Percolating all around these activities are developments in robotics and telesurgery. An advanced information infrastructure eventually will permit remote physicians to share video, audio, medical records, and imaging data with local physicians in real time. Surgical robots are likely to be deployed for specific tasks in the operating room (OR) and to support telesurgery applications. Technical developments in robotics and motion control are key components of many virtual reality systems. Since almost all of the virtual reality and enhanced reality systems will be digitally based, they are also capable of being put "on-line" for tele-training, consulting, and even surgery. Advancements in virtual and enhanced reality systems will be driven in part by consumer applications of this technology. Many of the companies that will supply systems for medical applications are also working on commercial products. A big consumer hit can benefit the entire industry by increasing volumes and bringing down costs.(ABSTRACT TRUNCATED AT 400 WORDS)
Gibby, Jacob T; Swenson, Samuel A; Cvetko, Steve; Rao, Raj; Javan, Ramin
2018-06-22
Augmented reality has potential to enhance surgical navigation and visualization. We determined whether head-mounted display augmented reality (HMD-AR) with superimposed computed tomography (CT) data could allow the wearer to percutaneously guide pedicle screw placement in an opaque lumbar model with no real-time fluoroscopic guidance. CT imaging was obtained of a phantom composed of L1-L3 Sawbones vertebrae in opaque silicone. Preprocedural planning was performed by creating virtual trajectories of appropriate angle and depth for an ideal approach into the pedicle, and these data were integrated into the Microsoft HoloLens using the Novarad OpenSight application, allowing the user to view the virtual trajectory guides and CT images superimposed on the phantom in two and three dimensions. Spinal needles were inserted following the virtual trajectories to the point of contact with bone. Repeat CT revealed the actual needle trajectories, allowing comparison with the ideal preprocedural paths. Registration of AR to the phantom showed a roughly circular deviation with a maximum average radius of 2.5 mm. Users took an average of 200 s to place a needle. Extrapolation of the needle trajectories into the pedicle showed that of 36 needles placed, 35 (97%) would have remained within the pedicles. Needles were placed at a mean distance of 4.69 mm in the mediolateral direction and 4.48 mm in the craniocaudal direction from the pedicle bone edge. To our knowledge, this is the first peer-reviewed report and evaluation of HMD-AR with superimposed 3D guidance utilizing CT for spinal pedicle guide placement for the purpose of cannulation without the use of fluoroscopy.
Systematic review on the effectiveness of augmented reality applications in medical training.
Barsom, E Z; Graafland, M; Schijven, M P
2016-10-01
Computer-based applications are increasingly used to support the training of medical professionals. Augmented reality applications (ARAs) render an interactive virtual layer on top of reality. The use of ARAs is of real interest to medical education because they blend digital elements with the physical learning environment, resulting in new educational opportunities. The aim of this systematic review is to investigate to what extent augmented reality applications are currently used to validly support the training of medical professionals. PubMed, Embase, INSPEC and PsychInfo were searched using predefined inclusion criteria for relevant articles up to August 2015. All study types were considered eligible. Articles concerning AR applications used to train or educate medical professionals were evaluated. Twenty-seven studies were found relevant, describing a total of seven augmented reality applications. Applications were assigned to three categories: the first directed toward laparoscopic surgical training, the second toward mixed reality training of neurosurgical procedures, and the third toward training in echocardiography. Statistical pooling of data could not be performed due to heterogeneity of study designs. Face, construct and concurrent validity were shown for two applications directed at laparoscopic training, face and construct validity for neurosurgical procedures, and face, content and construct validity in echocardiography training. None of the ARAs described in the literature completed a full validation process for their purpose of use. Augmented reality applications that support blended learning in medical training have gained public and scientific interest. In order to be of value, applications must be able to transfer information to the user. Although promising, the literature to date does not yet provide such evidence.
Besharati Tabrizi, Leila; Mahvash, Mehran
2015-07-01
An augmented reality system has been developed for image-guided neurosurgery to project images with regions of interest onto the patient's head, skull, or brain surface in real time. The aim of this study was to evaluate system accuracy and to perform the first intraoperative application. Images of segmented brain tumors in different localizations and sizes were created in 10 cases and were projected to a head phantom using a video projector. Registration was performed using 5 fiducial markers. After each registration, the distance of the 5 fiducial markers from the visualized tumor borders was measured on the virtual image and on the phantom. The difference was considered a projection error. Moreover, the image projection technique was intraoperatively applied in 5 patients and was compared with a standard navigation system. Augmented reality visualization of the tumors succeeded in all cases. The mean time for registration was 3.8 minutes (range 2-7 minutes). The mean projection error was 0.8 ± 0.25 mm. There were no significant differences in accuracy according to the localization and size of the tumor. Clinical feasibility and reliability of the augmented reality system could be proved intraoperatively in 5 patients (projection error 1.2 ± 0.54 mm). The augmented reality system is accurate and reliable for the intraoperative projection of images to the head, skull, and brain surface. The ergonomic advantage of this technique improves the planning of neurosurgical procedures and enables the surgeon to use direct visualization for image-guided neurosurgery.
Mobile viewer system for virtual 3D space using infrared LED point markers and camera
NASA Astrophysics Data System (ADS)
Sakamoto, Kunio; Taneji, Shoto
2006-09-01
The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position-detecting system for a see-through 3D viewer. A 3D display system is a useful technology for virtual reality, mixed reality and augmented reality. We have been researching spatial imaging and interaction systems, and have previously proposed 3D displays using a slit as a parallax barrier, a lenticular screen and holographic optical elements (HOEs) for displaying active images 1)2)3)4). The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world when watching the screen of a see-through 3D viewer. The goal of our research is to build a display system in which, when users see the real world through the mobile viewer, they are presented with virtual 3D images floating in the air, and they can touch and interact with these floating images, for example so that children can model virtual clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by an improved parallax-barrier 3D display. Here the authors discuss the measuring method for the mobile viewer using infrared LED point markers and a camera in the 3D workspace (augmented reality world). The authors present the geometric analysis of the proposed measuring method, which is the simplest such method in that it uses a single camera rather than a stereo camera, and the results of the viewer system.
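The abstract describes recovering a viewer's pose from infrared LED point markers seen by a single camera. As a rough illustration of that kind of single-camera pose estimation (not the authors' own geometric derivation), the sketch below uses OpenCV's solvePnP; the marker layout, camera intrinsics, and detected image points are invented placeholders.

```python
# Hypothetical sketch: estimate a viewer's pose from known 3D marker positions
# and their detected 2D image locations, using a single calibrated camera.
import numpy as np
import cv2

# Known 3D positions of the infrared LED markers in the workspace frame (metres).
# These values are illustrative placeholders, not the paper's actual layout.
marker_points_3d = np.array([
    [0.00, 0.00, 0.0],
    [0.10, 0.00, 0.0],
    [0.10, 0.10, 0.0],
    [0.00, 0.10, 0.0],
], dtype=np.float64)

# 2D pixel coordinates of the same markers as detected in the camera image.
marker_points_2d = np.array([
    [320.0, 240.0],
    [420.0, 238.0],
    [422.0, 338.0],
    [318.0, 340.0],
], dtype=np.float64)

# Assumed pinhole intrinsics (focal length in pixels, principal point); no distortion.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# Solve the Perspective-n-Point problem for the camera pose relative to the markers.
ok, rvec, tvec = cv2.solvePnP(marker_points_3d, marker_points_2d, K, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)        # rotation taking workspace points into the camera frame
camera_position = -R.T @ tvec     # camera (viewer) position expressed in the workspace frame
print(ok, camera_position.ravel())
```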
Rohmer, Kai; Jendersie, Johannes; Grosch, Thorsten
2017-11-01
Augmented Reality offers many applications today, especially on mobile devices. Due to the lack of mobile hardware for illumination measurements, photorealistic rendering with consistent appearance of virtual objects is still an area of active research. In this paper, we present a full two-stage pipeline for environment acquisition and augmentation of live camera images using a mobile device with a depth sensor. We show how to directly work on a recorded 3D point cloud of the real environment containing high dynamic range color values. For unknown and automatically changing camera settings, a color compensation method is introduced. Based on this, we show photorealistic augmentations using variants of differential light simulation techniques. The presented methods are tailored for mobile devices and run at interactive frame rates. However, our methods are scalable to trade performance for quality and can produce quality renderings on desktop hardware.
Immersive realities: articulating the shift from VR to mobile AR through artistic practice
NASA Astrophysics Data System (ADS)
Margolis, Todd; Cornish, Tracy; Berry, Rodney; DeFanti, Thomas A.
2012-03-01
Our contemporary imaginings of technological engagement with digital environments have transitioned from flying through Virtual Reality to mobile interactions with the physical world through personal media devices. Experiences technologically mediated through social interactivity within physical environments are now being preferred over isolated environments such as CAVEs or HMDs. Examples of this trend can be seen in early tele-collaborative artworks which strove to use advanced networking to join multiple participants in shared virtual environments. Recent developments in mobile AR allow untethered access to such shared realities in places far removed from labs and home entertainment environments, and without the bulky and expensive technologies attached to our bodies that accompany most VR. This paper addresses the emerging trend favoring socially immersive artworks via mobile Augmented Reality rather than sensorially immersive Virtual Reality installations. With particular focus on AR as a mobile, locative technology, we discuss how concepts of immersion and interactivity are evolving with this new medium. Immersion in the context of mobile AR can be redefined to describe socially interactive experiences. Having distinctly different sensory, spatial and situational properties, mobile AR offers a new form for remixing elements from traditional virtual reality with physically based social experiences. This type of immersion offers a wide array of potential for mobile AR art forms. We are beginning to see examples of how artists can use mobile AR to create socially immersive and interactive experiences.
Yang, Yea-Ru; Tsai, Meng-Pin; Chuang, Tien-Yow; Sung, Wen-Hsu; Wang, Ray-Yau
2008-08-01
This is a single blind randomized controlled trial to examine the effect of virtual reality-based training on the community ambulation in individuals with stroke. Twenty subjects with stroke were assigned randomly to either the control group (n=9) or the experimental group (n=11). Subjects in the control group received the treadmill training. Subjects in the experimental group underwent the virtual reality-based treadmill training. Walking speed, community walking time, walking ability questionnaire (WAQ), and activities-specific balance confidence (ABC) scale were evaluated. Subjects in the experimental group improved significantly in walking speed, community walking time, and WAQ score at posttraining and 1-month follow-up periods. Their ABC score also significantly increased at posttraining but did not maintain at follow-up period. Regarding the between-group comparisons, the experimental group improved significantly more than control group in walking speed (P=0.03) and community walking time (P=0.04) at posttraining period and in WAQ score (P=0.03) at follow-up period. Our results support the perceived benefits of gait training programs that incorporate virtual reality to augment the community ambulation of individuals with stroke.
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.
Jiang, Taoran; Zhu, Ming; Zan, Tao; Gu, Bin; Li, Qingfeng
2017-08-01
In perforator flap transplantation, dissection of the perforator is an important but difficult procedure because of the high variability in vascular anatomy. Preoperative imaging techniques can provide substantial information about vascular anatomy; however, they cannot provide direct guidance for surgeons during the operation. In this study, a navigation system (NS) was established to overlay a vascular map onto the surgical site and thereby provide direct guidance for perforator flap transplantation. The NS was established based on computed tomographic angiography and augmented reality techniques. A virtual vascular map was reconstructed from computed tomographic angiography data and projected onto real patient images using ARToolKit software. Additionally, a screw-fixation marker holder was created to facilitate registration. With the use of a tracking and display system, we tested the NS on an animal model and measured the system error on a rapid prototyping model. NS assistance allowed correct identification, as well as safe and precise dissection, of the perforator. The mean system error was 3.474 ± 1.546 mm. An augmented reality-based NS can provide precise navigation information by directly displaying a 3-dimensional, individual anatomical virtual model onto the operative field in real time. It will allow rapid identification and safe dissection of a perforator in free flap transplantation surgery.
Hybrid diffractive-refractive optical system design of head-mounted display for augmented reality
NASA Astrophysics Data System (ADS)
Zhang, Huijuan
2005-02-01
An optical see-through head-mounted display for augmented reality is designed in this paper. Considering factors such as optical performance, the energy utilization ratios for the real world and the virtual world, and the comfort of the user wearing the device, an optical see-through structure is adopted. Because of its particular negative dispersion and its ability to realize arbitrary phase modulation, a diffractive surface helps the optical system reduce weight and simplify its structure, so a diffractive surface is introduced into our optical system. The optical system is designed with a 25 mm eye relief, a 12 mm exit pupil and a 20° (H) × 15.4° (V) field of view. The energy utilization ratios for the real world and the virtual world are 1/4 and 1/2, respectively. The angular resolution of the display is 0.27 mrad, which is below the resolution limit of the human eye. The diameter of the system is less than 46 mm, and the design is applied binocularly. This diffractive-refractive optical system for a see-through head-mounted display not only satisfies user-factor demands on the structure, but also achieves high resolution with very small chromatic aberration and distortion, satisfying the needs of augmented reality. Finally, the parameters of the diffractive surface are discussed.
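As a quick sanity check on the stated angular resolution (this calculation is ours, not the paper's), the commonly quoted resolution limit of the human eye is about one arcminute:

\[
\theta_{\mathrm{eye}} \approx 1' = \frac{\pi}{180 \times 60}\,\mathrm{rad} \approx 0.291\,\mathrm{mrad} \;>\; 0.27\,\mathrm{mrad} = \theta_{\mathrm{display}},
\]

so individual display pixels fall just below what the eye can resolve.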
On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial.
Andress, Sebastian; Johnson, Alex; Unberath, Mathias; Winkler, Alexander Felix; Yu, Kevin; Fotouhi, Javad; Weidert, Simon; Osgood, Greg; Navab, Nassir
2018-04-01
Fluoroscopic x-ray guidance is a cornerstone for percutaneous orthopedic surgical procedures. However, two-dimensional (2-D) observations of the three-dimensional (3-D) anatomy suffer from the effects of projective simplification. Consequently, many x-ray images from various orientations need to be acquired for the surgeon to accurately assess the spatial relations between the patient's anatomy and the surgical tools. We present an on-the-fly surgical support system that provides guidance using augmented reality and can be used in quasi-unprepared operating rooms. The proposed system builds upon a multimodality marker and a simultaneous localization and mapping technique to co-calibrate an optical see-through head-mounted display to a C-arm fluoroscopy system. Then, annotations on the 2-D x-ray images can be rendered as virtual objects in 3-D, providing surgical guidance. We quantitatively evaluate the components of the proposed system and, finally, design a feasibility study on a semianthropomorphic phantom. The accuracy of our system was comparable to the traditional image-guided technique while substantially reducing the number of acquired x-ray images as well as procedure time. Our promising results encourage further research on the interaction between virtual and real objects, which we believe will directly benefit the proposed method. Further, we would like to explore the capabilities of our on-the-fly augmented reality support system in a larger study directed toward common orthopedic interventions.
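The key step described above is expressing an annotation made in the C-arm's coordinate frame in the frame of the head-mounted display via the shared marker. A minimal sketch of that kind of transform chaining is given below; the 4x4 matrices, translations, and frame names are placeholder assumptions, not the authors' calibration results or notation.

```python
# Hypothetical sketch: carry a 3D annotation from the C-arm frame into the
# HMD frame through a shared marker, by composing homogeneous transforms.
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Placeholder calibration results (identity rotations, made-up translations, metres).
T_marker_from_carm = make_transform(np.eye(3), np.array([0.30, 0.00, 0.50]))
T_hmd_from_marker = make_transform(np.eye(3), np.array([-0.10, 0.05, 1.20]))

# Composed transform: C-arm frame -> marker frame -> HMD frame.
T_hmd_from_carm = T_hmd_from_marker @ T_marker_from_carm

# An annotation point expressed in the C-arm frame (homogeneous coordinates).
p_carm = np.array([0.02, -0.01, 0.40, 1.0])
p_hmd = T_hmd_from_carm @ p_carm   # where the virtual object is rendered for the wearer
print(p_hmd[:3])
```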
Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua
2016-05-30
Augmented reality systems can provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. The proposed system is realized with a digital projector, and a general back-projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. A corresponding calibration method is also designed to obtain the parameters of the projector. To validate the proposed back-projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projection indication accuracy of the system is verified with a subpixel pattern-projection technique.
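A guidance projector can be modelled like an inverse pinhole camera: to indicate a 3D point on the workpiece, the point is mapped through the projector's extrinsic and intrinsic parameters to find the pixel that must be lit. The sketch below illustrates only that generic idea; the paper's actual back-projection model and calibration are more involved, and every parameter value here is invented.

```python
# Hypothetical sketch: find the projector pixel that illuminates a given 3D point,
# treating the projector as an inverse pinhole camera with known calibration.
import numpy as np

# Assumed projector intrinsics (focal lengths and principal point, in pixels).
K = np.array([[1400.0, 0.0, 960.0],
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])

# Assumed extrinsics: rotation and translation taking world points into the
# projector coordinate frame (placeholder values standing in for a calibration).
R = np.eye(3)
t = np.array([0.0, 0.0, 1.5])   # projector 1.5 m from the world origin

def project_to_pixel(point_world):
    """Map a 3D world point (metres) to the projector pixel that should be lit."""
    p_proj = R @ point_world + t          # world -> projector frame
    uvw = K @ p_proj                      # perspective projection
    return uvw[:2] / uvw[2]               # normalize to pixel coordinates

# Indicate a point taken from the digitized 3D model of the workpiece.
pixel = project_to_pixel(np.array([0.10, -0.05, 0.00]))
print(pixel)   # (u, v) pixel at which to draw the guidance mark
```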
The Virtual Tablet: Virtual Reality as a Control System
NASA Technical Reports Server (NTRS)
Chronister, Andrew
2016-01-01
In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
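To give a concrete flavour of the fusion step described above (combining marker positions from the motion-capture system with orientation from the onboard sensor package into one pose for the virtual surrogate), here is a rough sketch; the quaternion convention, marker layout, and coordinate frames are assumptions, not details taken from the project itself.

```python
# Hypothetical sketch: combine motion-capture marker positions (for translation)
# with an IMU quaternion (for orientation) into a single 4x4 pose used to place
# the virtual surrogate of the handheld tablet.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Placeholder marker positions reported by the motion-capture system (metres).
markers = np.array([[0.50, 1.20, 0.30],
                    [0.62, 1.21, 0.30],
                    [0.56, 1.30, 0.31]])
position = markers.mean(axis=0)          # use the marker centroid as the tablet position

# Placeholder orientation from the sensor package (unit quaternion, w-first).
orientation = np.array([0.996, 0.05, 0.05, 0.05])
orientation = orientation / np.linalg.norm(orientation)

# Assemble the homogeneous pose used to place the virtual analogue in the scene.
pose = np.eye(4)
pose[:3, :3] = quat_to_matrix(orientation)
pose[:3, 3] = position
print(pose)
```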
NASA Astrophysics Data System (ADS)
Rahman, Hameedur; Arshad, Haslina; Mahmud, Rozi; Mahayuddin, Zainal Rasyid
2017-10-01
The number of breast cancer patients who require breast biopsy has increased over the past years. Augmented Reality guided core biopsy of the breast has become the method of choice for researchers. However, such cancer visualization has so far been limited to superimposing the 3D imaging data only. In this paper, we introduce an Augmented Reality visualization framework that enables image guidance for breast cancer biopsy by using an X-ray vision technique on a mobile display. The framework consists of four phases: it first acquires images from CT/MRI and processes the medical images into 3D slices; second, it refines these 3D grayscale slices into a 3D breast tumor model using a 3D model reconstruction technique; third, during visualization processing, the virtual 3D breast tumor model is enhanced using an X-ray vision technique to see through the skin of the phantom; and finally, the composition is displayed on a handheld device to optimize the accuracy of the visualization in six degrees of freedom. The framework is perceived as an improved visualization experience because the Augmented Reality X-ray vision allows direct understanding of the breast tumor beyond the visible surface and direct guidance toward accurate biopsy targets.
Chen, Xiaojun; Xu, Lu; Wang, Yiping; Wang, Huixiang; Wang, Fang; Zeng, Xiangsen; Wang, Qiugen; Egger, Jan
2015-06-01
The surgical navigation system has experienced tremendous development over the past decades, minimizing the risks and improving the precision of surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In an AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) is developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of surgery. With the use of this system, including the calibration of instruments, registration, and the calibration of the HMD, the 3D virtual critical anatomical structures in the head-mounted display are aligned with the actual structures of the patient in the real-world scenario during the intra-operative motion tracking process. The accuracy verification experiment demonstrated that the mean distance and angular errors were 0.809 ± 0.05 mm and 1.038° ± 0.05°, respectively, which is sufficient to meet clinical requirements. Copyright © 2015 Elsevier Inc. All rights reserved.
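Registration of virtual anatomy to the patient in systems like this typically reduces to estimating a rigid transform between corresponding fiducial points. The sketch below shows a generic SVD-based (Kabsch) solution to that sub-problem; it is only an illustration with made-up point sets, not the calibration procedure used in the AR-SNS.

```python
# Hypothetical sketch: rigid registration of virtual-model fiducials to their
# measured positions on the patient, using the SVD-based Kabsch method.
import numpy as np

def rigid_register(source, target):
    """Return R, t minimizing ||R @ source_i + t - target_i|| over corresponding points."""
    src_c = source - source.mean(axis=0)
    tgt_c = target - target.mean(axis=0)
    H = src_c.T @ tgt_c
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = target.mean(axis=0) - R @ source.mean(axis=0)
    return R, t

# Made-up fiducial coordinates in the virtual model and as measured on the patient (mm).
model_fiducials = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], dtype=float)
rotation_90z = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
patient_fiducials = model_fiducials @ rotation_90z.T + np.array([10.0, 20.0, 5.0])

R, t = rigid_register(model_fiducials, patient_fiducials)
residual = np.linalg.norm((model_fiducials @ R.T + t) - patient_fiducials, axis=1)
print(R, t, residual.mean())   # the mean residual plays the role of a registration error
```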
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow large-scale data to be accessed and visualized on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. The simulation system provides a visually striking platform with realistic terrain and weather information, and water simulation. The web-based simulation system provides an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.
The effectiveness of virtual and augmented reality in health sciences and medical anatomy.
Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan
2017-11-01
Although cadavers constitute the gold standard for teaching anatomy to medical and health science students, there are substantial financial, ethical, and supervisory constraints on their use. In addition, although anatomy remains one of the fundamental areas of medical education, universities have decreased the hours allocated to teaching gross anatomy in favor of applied clinical work. The release of virtual reality (VR) and augmented reality (AR) devices allows learning to occur through hands-on immersive experiences. The aim of this research was to assess whether learning structural anatomy utilizing VR or AR is as effective as tablet-based (TB) applications, and whether these modes allow enhanced student learning, engagement and performance. Participants (n = 59) were randomly allocated to one of the three learning modes: VR, AR, or TB, and completed a lesson on skull anatomy, after which they completed an anatomical knowledge assessment. Student perceptions of each learning mode and any adverse effects experienced were recorded. No significant differences were found between mean assessment scores in VR, AR, or TB. During the lessons, however, VR participants were more likely to exhibit adverse effects such as headaches (25% in VR, P < 0.05), dizziness (40% in VR, P < 0.001), or blurred vision (35% in VR, P < 0.01). Both VR and AR are as valuable for teaching anatomy as tablet devices, but also promote intrinsic benefits such as increased learner immersion and engagement. These outcomes show great promise for the effective use of virtual and augmented reality as means to supplement lesson content in anatomical education. Anat Sci Educ 10: 549-559. © 2017 American Association of Anatomists.
Evaluating the use of augmented reality to support undergraduate student learning in geomorphology
NASA Astrophysics Data System (ADS)
Ockelford, A.; Bullard, J. E.; Burton, E.; Hackney, C. R.
2016-12-01
Augmented Reality (AR) supports the understanding of complex phenomena by providing unique visual and interactive experiences that combine real and virtual information and help communicate abstract problems to learners. With AR, designers can superimpose virtual graphics over real objects, allowing users to interact with digital content through physical manipulation. One of the most significant pedagogic features of AR is that it provides an essentially student-centred and flexible space in which students can learn. By actively engaging participants using a design-thinking approach, this technology has the potential to provide a more productive and engaging learning environment than real or virtual learning environments alone. AR is increasingly being used in support of undergraduate learning and public engagement activities across engineering, medical and humanities disciplines, but it is not widely used across the geoscience disciplines despite the obvious applicability. This paper presents preliminary results from a multi-institutional project which seeks to evaluate the benefits and challenges of using an augmented reality sandbox to support undergraduate learning in geomorphology. The sandbox enables users to create and visualise topography. As the sand is sculpted, contours are projected onto the miniature landscape. By hovering a hand over the box, users can make it 'rain' over the landscape, and the water 'flows' down into rivers and valleys. At undergraduate level, the sandbox is an ideal focus for problem-solving exercises, for example exploring how geomorphology controls hydrological processes, how such processes can be altered, and the subsequent impacts of the changes for environmental risk. It is particularly valuable for students who favour a visual or kinesthetic learning style. Results presented in this paper discuss how the sandbox provides a complex interactive environment that encourages communication, collaboration and co-design.
Vemuri, Anant S; Wu, Jungle Chi-Hsiang; Liu, Kai-Che; Wu, Hurng-Sheng
2012-12-01
Surgical procedures have undergone considerable advancement during the last few decades. More recently, the availability of some imaging methods intraoperatively has added a new dimension to minimally invasive techniques. Augmented reality in surgery has been a topic of intense interest and research. Augmented reality involves the use of computer vision algorithms on video from endoscopic cameras, or from cameras mounted in the operating room, to provide the surgeon with additional information that he or she otherwise would have to recognize intuitively. One of the techniques combines a virtual preoperative model of the patient with the endoscope camera, using natural or artificial landmarks, to provide an augmented reality view in the operating room. The authors' approach is to provide this with the fewest possible changes to the operating room. A software architecture is presented that provides interactive adjustment of the registration between a three-dimensional (3D) model and the endoscope video. Augmented reality was used to perform 12 surgeries, including adrenalectomy, ureteropelvic junction obstruction, retrocaval ureter, and pancreas procedures. The general feedback from the surgeons has been very positive, not only in terms of deciding the positions of insertion points but also in being aware of even small changes in anatomy. The approach involves providing a deformable 3D model architecture and its application to the operating room. A 3D model with a deformable structure is needed to show the shape change of soft tissue during the surgery. The software architecture provides interactive adjustment of the registration between the 3D model and the endoscope video, with adjustability of every 3D model.
NASA Astrophysics Data System (ADS)
Abercrombie, S. P.; Menzies, A.; Goddard, C.
2017-12-01
Virtual and augmented reality enable scientists to visualize environments that are very difficult, or even impossible to visit, such as the surface of Mars. A useful immersive visualization begins with a high quality reconstruction of the environment under study. This presentation will discuss a photogrammetry pipeline developed at the Jet Propulsion Laboratory to reconstruct 3D models of the surface of Mars using stereo images sent back to Earth by the Curiosity Mars rover. The resulting models are used to support a virtual reality tool (OnSight) that allows scientists and engineers to visualize the surface of Mars as if they were standing on the red planet. Images of Mars present challenges to existing scene reconstruction solutions. Surface images of Mars are sparse with minimal overlap, and are often taken from extremely different viewpoints. In addition, the specialized cameras used by Mars rovers are significantly different than consumer cameras, and GPS localization data is not available on Mars. This presentation will discuss scene reconstruction with an emphasis on coping with limited input data, and on creating models suitable for rendering in virtual reality at high frame rate.
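As a very reduced illustration of the first step of such a pipeline (not JPL's implementation), the sketch below computes a disparity map from a rectified stereo pair with OpenCV and converts it to 3D points using the usual pinhole relations; the image files, focal length, and baseline are placeholder assumptions.

```python
# Hypothetical sketch: dense 3D points from a rectified stereo pair, the kind of
# intermediate product a photogrammetry pipeline would later mesh and texture.
import numpy as np
import cv2

# Placeholder rectified left/right images and calibration values.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
fx, cx, cy = 1200.0, 640.0, 480.0     # focal length and principal point (pixels), assumed
baseline = 0.42                        # stereo baseline (metres), invented value

# Semi-global block matching to get a disparity map (returned in 1/16-pixel units).
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Triangulate: Z = f * B / d, then back-project each pixel through the pinhole model.
valid = disparity > 0
v, u = np.nonzero(valid)
Z = fx * baseline / disparity[valid]
X = (u - cx) * Z / fx
Y = (v - cy) * Z / fx
points = np.column_stack([X, Y, Z])    # one 3D point per valid pixel, in the camera frame
print(points.shape)
```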
Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin
2017-03-27
Augmented reality (AR), which uses computer-aided projection of information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges regarding the optical system in an AR system: registration, vision correction, and readability under strong ambient light. Here, we solve these three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated into an optical see-through AR system. One of the LC lenses is used to electrically adjust the position of the projected virtual image, the so-called registration. The other LC lens, with a larger aperture and polarization-independent characteristics, is in charge of vision correction, such as for myopia and presbyopia. The linearity of the lens powers of the two LC lenses is also discussed. The readability of virtual images under strong ambient light is addressed by the electrically switchable transmittance of the LC attenuator, which originates from light scattering and light absorption. The concept demonstrated in this paper could be further extended to other electro-optical devices as long as the devices exhibit the capability of phase modulation and amplitude modulation.
Holographic and light-field imaging for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Hong, Jong-Young; Jang, Changwon; Jeong, Jinsoo; Lee, Chang-Kun
2017-02-01
We discuss the recent state of augmented reality (AR) display technology. To realize AR, various see-through three-dimensional (3D) display techniques have been reported. We describe AR displays with 3D functionality, such as light-field displays and holography. See-through light-field displays can be categorized by the optical elements used to provide the see-through property: elements that control the path of the light fields and elements that generate a see-through light field. Holographic displays are also good candidates for AR displays because they can reconstruct wavefront information and provide realistic virtual imagery. We introduce see-through holographic displays using various optical techniques.
Virtual Reality-Based Center of Mass-Assisted Personalized Balance Training System.
Kumar, Deepesh; González, Alejandro; Das, Abhijit; Dutta, Anirban; Fraisse, Philippe; Hayashibe, Mitsuhiro; Lahiri, Uttama
2017-01-01
Poststroke hemiplegic patients often show altered weight distribution with balance disorders, increasing their risk of falls. Conventional balance training, though powerful, suffers from a scarcity of trained therapists, frequent visits to clinics to get therapy, one-on-one therapy sessions, and the monotony of repetitive exercise tasks. Thus, technology-assisted balance rehabilitation can be an alternative solution. Here, we chose virtual reality as a technology-based platform to develop motivating balance tasks. This platform was augmented with off-the-shelf sensors, such as the Nintendo Wii balance board and Kinect, to estimate one's center of mass (CoM). The virtual reality-based CoM-assisted balance task (Virtual CoMBaT) was designed to be adaptive to one's individualized weight-shifting capability, quantified through CoM displacement. Participants were asked to interact with Virtual CoMBaT, which offered tasks of varying challenge levels while requiring adherence to the ankle strategy for weight shifting. To facilitate the patients' use of the ankle strategy during weight shifting, we designed a heel lift detection module. A usability study was carried out with 12 hemiplegic patients. Results indicate the potential of our system to contribute to improving one's overall performance in balance-related tasks belonging to different difficulty levels.
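Since the system estimates CoM displacement with a Nintendo Wii balance board, the board's centre of pressure, computed from its four corner load cells, is a natural intermediate quantity. A minimal sketch follows; the sensor spacing values and function name are assumptions, not the authors' implementation:

```python
def wii_center_of_pressure(tl, tr, bl, br, sensor_dx=433.0, sensor_dy=228.0):
    # Centre of pressure (mm) from the four corner load-cell readings (kg):
    # tl/tr/bl/br = top-left, top-right, bottom-left, bottom-right.
    total = tl + tr + bl + br
    if total <= 0:
        return 0.0, 0.0
    x = (sensor_dx / 2.0) * ((tr + br) - (tl + bl)) / total
    y = (sensor_dy / 2.0) * ((tl + tr) - (bl + br)) / total
    return x, y
```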
Ketelhut, Diane Jass; Niemi, Steven M
2007-01-01
This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods, so as to reach more students than current methods alone can serve. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.
Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J.; Latorre, José M.; Rodriguez-Jimenez, Roberto
2017-01-01
This perspective paper looks to the future of alternative treatments that take a social and cognitive approach alongside pharmacological therapy for auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH, the perception of voices in the absence of auditory stimulation, is a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain-computer interfaces (BCI) are technologies increasingly used in medical and psychological applications. Our position is that their combined use in computer-based therapies offers as-yet-unforeseen possibilities for the treatment of physical and mental disabilities. The paper therefore anticipates that researchers and clinicians will follow a pathway toward human-avatar symbiosis for AVH by taking full advantage of new technologies. This outlook requires addressing challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and in the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis. PMID:29209193
[Registration technology for mandibular angle osteotomy based on augmented reality].
Zhu, Ming; Chai, Gang; Zhang, Yan; Ma, Xiao-Fei; Yu, Zhe-Yuan; Zhu, Yi-Jia
2010-12-01
To establish an effective path for registering the operative plan to a real model of the mandible made by rapid prototyping (RP) technology. Computed tomography (CT) was performed on 20 patients to create 3D images, with which the computer-aided operation planning information could be merged. A dental cast was then used to carry the signal that could be recognized by the software. The dental cast was converted to 3D data with a laser scanner, and a program running on a personal computer, Rapidform, matched the dental cast to the mandible image to generate the virtual image. Registration was then achieved with a video monitoring system. Using this technology, both the virtual image of the mandible and the cutting planes could be overlaid on the real RP model of the mandible. This study found an effective way to perform registration using a dental cast, which may be a powerful option for registration in augmented reality. Supported by the Program for Innovation Research Team of Shanghai Municipal Education Commission.
On the use of virtual and augmented reality for upper limb prostheses training and simulation.
Lamounier, Edgard; Lopes, Kenedy; Cardoso, Alexandre; Andrade, Adriano; Soares, Alcimar
2010-01-01
Accidents happen, and unfortunately people may lose parts of their bodies. Studies have shown that, in such cases, most individuals suffer both physically and psychologically. For this reason, actions to restore the patient's freedom and mobility are imperative. Traditional solutions require ways to adapt the individual to prosthetic devices, an idea that also applies to patients with congenital limitations. However, one of the major difficulties faced by those fitted with these devices is the great mental effort needed during the first stages of training. As a result, a meaningful number of patients give up using these devices very soon. This article therefore reports on a solution designed by the authors to help patients during the learning phases without actually having to wear the prosthesis. The solution uses Virtual Reality (VR) and Augmented Reality (AR) techniques to mimic the prosthesis's natural counterparts. It is thus expected that problems such as weight, heat, and pain will not add to an already hard task.
FlyAR: augmented reality supported micro aerial vehicle navigation.
Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard
2014-04-01
Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context, automatic flight path planning and autonomous flying are often applied but so far cannot fully replace the human in the loop supervising the flight on-site to ensure that there are no collisions with obstacles. Unfortunately, this workflow raises several issues, such as the need to mentally transfer the aerial vehicle's position between 2D map positions and the physical environment, and the difficult depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality-supported navigation and flight planning of micro aerial vehicles by augmenting the user's view with information relevant for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints that support the user in understanding the spatial relationship of virtual waypoints in the physical world, and we investigate the effect of these visualization techniques on spatial understanding.
Fast Markerless Tracking for Augmented Reality in Planar Environment
NASA Astrophysics Data System (ADS)
Basori, Ahmad Hoirul; Afif, Fadhil Noer; Almazyad, Abdulaziz S.; AbuJabal, Hamza Ali S.; Rehman, Amjad; Alkawaz, Mohammed Hazim
2015-12-01
Markerless tracking for augmented reality should not only be accurate but also fast enough to provide seamless synchronization between real and virtual elements. Currently reported methods show that vision-based tracking is accurate but requires high computational power. This paper proposes a real-time hybrid method for tracking unknown environments in markerless augmented reality. The proposed method combines a vision-based approach with accelerometer and gyroscope sensors as a camera pose predictor. To align the augmentation with camera motion, the tracking method substitutes feature-based camera estimation with a combination of inertial sensors and a complementary filter to provide a more dynamic response. The proposed method tracked unknown environments with faster processing times than available feature-based approaches. Moreover, it can sustain its estimate in situations where feature-based tracking loses track. The sensor-fusion tracking performed the task at about 22.97 FPS, up to five times faster than the feature-based tracking method used for comparison. Therefore, the proposed method can be used to track unknown environments without depending on the number of features in the scene, while requiring lower computational cost.
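A complementary filter of the kind mentioned blends short-term gyroscope integration with a long-term accelerometer-derived angle. A minimal single-axis sketch (the blending constant is an assumption, not a value from the paper):

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    # Gyroscope integration is accurate over short intervals but drifts;
    # the accelerometer angle is noisy but drift-free. Blend the two.
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```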
NASA Astrophysics Data System (ADS)
Damayanti, Latifah Adelina; Ikhsan, Jaslin
2017-05-01
The integration of information technology in education is increasingly carried out through learning media. Three-dimensional (3D) molecular modeling was implemented in Augmented Reality as a tangible example of modern technology utilization. With augmented reality, a three-dimensional virtual object is projected in real time into the real environment. This paper reviews the use of a chemistry learning supplement book on aldehydes and ketones equipped with three-dimensional molecular models that students can inspect from various viewpoints. To play the 3D illustrations printed in the book, smartphones with open-source Augmented Reality software can be used. The aims of this research were to develop the monograph of aldehydes and ketones with 3D illustrations, to determine the specification of the monograph, and to determine its quality. The quality of the monograph was evaluated by experienced chemistry teachers on five aspects (contents/materials, presentation, language and images, graphics, and software engineering), with the result that the book is of very good quality for use as a chemistry learning supplement.
Wagner, A; Ploder, O; Enislidis, G; Truppe, M; Ewers, R
1996-04-01
Interventional video tomography (IVT), a new imaging modality, achieves virtual visualization of anatomic structures in three dimensions for intraoperative stereotactic navigation. Partial immersion into a virtual data space, which is orthotopically coregistered to the surgical field, enhances, by means of a see-through head-mounted display (HMD), the surgeon's visual perception and technique by providing visual access to nonvisual data of anatomy, physiology, and function. The presented cases document the potential of augmented reality environments in maxillofacial surgery.
NASA's Hybrid Reality Lab: One Giant Leap for Full Dive
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2017-01-01
This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; the integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.
Lin, Yen-Kun; Yau, Hong-Tzong; Wang, I-Chung; Zheng, Cheng; Chung, Kwok-Hung
2015-06-01
A stereoscopic visualization concept combined with head-mounted displays may increase the accuracy of computer-aided implant surgery. The aim of this study was to develop an augmented reality-based dental implant placement system and evaluate the accuracy of the virtually planned versus the actually prepared implant sites created in vitro. Four fully edentulous mandibular and four partially edentulous maxillary duplicated casts were used. Six implants were planned in the mandibular and four in the maxillary casts. A total of 40 osteotomy sites were prepared in the casts using a stereolithographic template integrated with augmented reality-based surgical simulation. During the surgery, the dentist could be guided accurately through a head-mounted display by superimposing the virtual auxiliary line and the drill stop. The deviation between planned and prepared positions of the implants was measured on postoperative computed tomography scan images. Mean and standard deviation of the discrepancy between planned and prepared sites at the entry point, apex, angle, depth, and lateral locations were 0.50 ± 0.33 mm, 0.96 ± 0.36 mm, 2.70 ± 1.55°, 0.33 ± 0.27 mm, and 0.86 ± 0.34 mm, respectively, for the fully edentulous mandible, and 0.46 ± 0.20 mm, 1.23 ± 0.42 mm, 3.33 ± 1.42°, 0.48 ± 0.37 mm, and 1.1 ± 0.39 mm, respectively, for the partially edentulous maxilla. There was a statistically significant difference in the apical deviation between maxilla and mandible in this surgical simulation (p < .05). Deviation of implant placement from the planned position was significantly reduced by integrating the surgical template with augmented reality technology. © 2013 Wiley Periodicals, Inc.
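The reported entry, apex, and angular deviations can be derived from the planned and prepared implant positions. A minimal sketch follows; it is illustrative, not the authors' measurement software, and the function and variable names are hypothetical:

```python
import numpy as np

def implant_deviations(entry_planned, apex_planned, entry_actual, apex_actual):
    # All arguments are 3D points in mm. Returns entry deviation (mm),
    # apex deviation (mm), and angular deviation (degrees) between axes.
    ep, ap = np.asarray(entry_planned), np.asarray(apex_planned)
    ea, aa = np.asarray(entry_actual), np.asarray(apex_actual)
    entry_dev = np.linalg.norm(ea - ep)
    apex_dev = np.linalg.norm(aa - ap)
    axis_p, axis_a = ap - ep, aa - ea
    cosang = axis_p @ axis_a / (np.linalg.norm(axis_p) * np.linalg.norm(axis_a))
    angle_deg = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return entry_dev, apex_dev, angle_deg
```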
The segmentation of the HMD market: optics for smart glasses, smart eyewear, AR and VR headsets
NASA Astrophysics Data System (ADS)
Kress, Bernard; Saeedi, Ehsan; Brac-de-la-Perriere, Vincent
2014-09-01
This paper reviews the various optical technologies that have been developed to implement HMDs (Head Mounted Displays), as AR (Augmented Reality) devices, VR (Virtual Reality) devices, and more recently as smart glasses, smart eyewear, or connected glasses. We review the typical requirements and optical performance of such devices, categorize them into distinct groups suited for different (and constantly evolving) market segments, and analyze this market segmentation.
Combining 3D structure of real video and synthetic objects
NASA Astrophysics Data System (ADS)
Kim, Man-Bae; Song, Mun-Sup; Kim, Do-Kyoon
1998-04-01
This paper presents a new approach to combining real video and synthetic objects. The purpose of this work is to use the proposed technology in fields such as advanced animation, virtual reality, and games. Computer graphics has long been used in these fields. Recently, some applications have added real video to graphic scenes to augment the realism that computer graphics lacks. This approach, called augmented or mixed reality, can produce a more realistic environment than the use of computer graphics alone. Our approach differs from virtual reality and augmented reality in that computer-generated graphic objects are combined with a 3D structure extracted from monocular image sequences. Extracting the 3D structure requires estimating 3D depth and then constructing a height map; graphic objects are then combined with the height map. Our proposed approach is carried out in the following steps: (1) We derive the 3D structure from test image sequences by estimating depth and constructing a height map; owing to the content of the test sequences, the height map represents the 3D structure. (2) The height map is modeled by Delaunay triangulation or a Bezier surface, and each planar surface is texture-mapped. (3) Finally, graphic objects are combined with the height map; because the 3D structure of the height map is already known, this step is straightforward. Following this procedure, we produced an animation video demonstrating the combination of the 3D structure and graphic models. Users can navigate the realistic 3D world whose associated image is rendered on the display monitor.
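Step (2), modelling the height map by Delaunay triangulation, can be sketched as follows (illustrative only; the height values here are random placeholders standing in for an estimated depth/height map):

```python
import numpy as np
from scipy.spatial import Delaunay

h, w = 64, 64
heights = np.random.rand(h, w)                      # placeholder height map
ys, xs = np.mgrid[0:h, 0:w]
points2d = np.column_stack([xs.ravel(), ys.ravel()])
tri = Delaunay(points2d)                            # triangulate in the image plane
vertices3d = np.column_stack([points2d, heights.ravel()])
# tri.simplices indexes vertices3d: each row is one triangle of the terrain
# mesh onto which textures and synthetic objects can be placed.
```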
Augmented reality-assisted bypass surgery: embracing minimal invasiveness.
Cabrilo, Ivan; Schaller, Karl; Bijlenga, Philippe
2015-04-01
The overlay of virtual images on the surgical field, defined as augmented reality, has been used for image guidance during various neurosurgical procedures. Although this technology could conceivably address certain inherent problems of extracranial-to-intracranial bypass procedures, this potential has not been explored to date. We evaluate the usefulness of an augmented reality-based setup, which could help in harvesting donor vessels through their precise localization in real-time, in performing tailored craniotomies, and in identifying preoperatively selected recipient vessels for the purpose of anastomosis. Our method was applied to 3 patients with Moya-Moya disease who underwent superficial temporal artery-to-middle cerebral artery anastomoses and 1 patient who underwent an occipital artery-to-posteroinferior cerebellar artery bypass because of a dissecting aneurysm of the vertebral artery. Patients' heads, skulls, and extracranial and intracranial vessels were segmented preoperatively from 3-dimensional image data sets (3-dimensional digital subtraction angiography, angio-magnetic resonance imaging, angio-computed tomography), and injected intraoperatively into the operating microscope's eyepiece for image guidance. In each case, the described setup helped in precisely localizing donor and recipient vessels and in tailoring craniotomies to the injected images. The presented system based on augmented reality can optimize the workflow of extracranial-to-intracranial bypass procedures by providing essential anatomical information, entirely integrated to the surgical field, and help to perform minimally invasive procedures. Copyright © 2015 Elsevier Inc. All rights reserved.
van Duren, B H; Sugand, K; Wescott, R; Carrington, R; Hart, A
2018-05-01
Hip fractures contribute to a significant clinical burden globally, with over 1.6 million cases per annum and up to a 30% mortality rate within the first year. Insertion of a dynamic hip screw (DHS) is a frequently performed procedure to treat extracapsular neck of femur fractures. Poorly performed DHS fixation of these fractures can result in poor mobilisation, chronic pain, and an increased cut-out rate requiring revision surgery. A realistic, affordable, and portable fluoroscopic simulation system can improve performance metrics in trainees, including the tip-apex distance (the only clinically validated outcome), and improve outcomes. We developed a digital fluoroscopic imaging simulator using orthogonal cameras to track coloured markers attached to the guide-wire, creating a virtual overlay on fluoroscopic images of the hip. To test the accuracy with which the augmented reality system could track a guide-wire, a standard workshop femur was used to calibrate the system with a positional marker fixed to indicate the apex; this allowed the guide-wire tip-apex distance (TAD) calculated by the system to be compared with that measured physically. Tests were undertaken to determine (1) how well the apex could be targeted, (2) the accuracy of the calculated TAD, and (3) the number of iterations through the algorithm giving the optimal accuracy-time relationship. The calculated TAD had an average root mean square error of 4.2 mm. The accuracy of the algorithm increased with the number of iterations up to 20, beyond which the error asymptotically converged to 2 mm. This work demonstrates a novel augmented reality simulation of guide-wire insertion in DHS surgery, which to our knowledge has not previously been achieved. In contrast to virtual reality, augmented reality is able to simulate fluoroscopy while allowing the trainee to interact with real instrumentation and perform the procedure on workshop bone models. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.
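Under strong simplifying assumptions (two calibrated, axis-aligned orthogonal views sharing a common x axis), the guide-wire tip can be triangulated and the tip-apex distance computed as in the sketch below. This is an illustration, not the authors' algorithm; the names and the idealised camera geometry are assumptions:

```python
import numpy as np

def tip_from_orthogonal_views(xy_front, xz_top):
    # Front view gives (x, y); top view gives (x, z); average the shared x.
    x = 0.5 * (xy_front[0] + xz_top[0])
    return np.array([x, xy_front[1], xz_top[1]])

def tip_apex_distance(tip_xyz, apex_xyz):
    # Euclidean distance between the reconstructed tip and the apex marker.
    return float(np.linalg.norm(np.asarray(tip_xyz) - np.asarray(apex_xyz)))
```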
Virtual reality for health care: a survey.
Moline, J
1997-01-01
This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.
Lin, Chien-Yu; Chang, Yu-Ming
2015-02-01
This study uses a body-motion interactive game developed in Scratch 2.0 to enhance the body strength of children with disabilities. Scratch 2.0, using an augmented-reality function on a program platform, creates real-world and virtual-reality displays at the same time. This study uses a webcam integration that tracks movements and allows participants to interact physically with the project, to enhance the motivation of children with developmental disabilities to perform physical activities. This study follows a single-case research design with an ABAB structure, in which A is the baseline and B is the intervention. The experimental period was 2 months. The experimental results demonstrated that the scores of the 3 children with developmental disabilities increased considerably during the intervention phases. The developmental applications of these results are also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Simulation and augmented reality in endovascular neurosurgery: lessons from aviation.
Mitha, Alim P; Almekhlafi, Mohammed A; Janjua, Major Jameel J; Albuquerque, Felipe C; McDougall, Cameron G
2013-01-01
Endovascular neurosurgery is a discipline strongly dependent on imaging. Therefore, technology that improves how much useful information we can garner from a single image has the potential to dramatically assist decision making during endovascular procedures. Furthermore, education in an image-enhanced environment, especially with the incorporation of simulation, can improve the safety of the procedures and give interventionalists and trainees the opportunity to study or perform simulated procedures before the intervention, much like what is practiced in the field of aviation. Here, we examine the use of simulators in the training of fighter pilots and discuss how similar benefits can compensate for current deficiencies in endovascular training. We describe the types of simulation used for endovascular procedures, including virtual reality, and discuss the relevant data on its utility in training. Finally, the benefit of augmented reality during endovascular procedures is discussed, along with future computerized image enhancement techniques.
Towards multi-platform software architecture for Collaborative Teleoperation
NASA Astrophysics Data System (ADS)
Domingues, Christophe; Otmane, Samir; Davesne, Frederic; Mallem, Malik
2009-03-01
Augmented Reality (AR) can provide a Human Operator (HO) with real help in achieving complex tasks, such as remote control of robots and cooperative teleassistance. Using appropriate augmentations, the HO can interact faster, more safely, and more easily with the remote real world. In this paper, we present an extension of an existing distributed software and network architecture for collaborative teleoperation based on networked human-scaled mixed reality and a mobile platform. The first teleoperation system was composed of a VR application and a Web application. However, the two systems could not be used together, and it was impossible to control a distant robot simultaneously. Our goal is to update the teleoperation system to permit heterogeneous collaborative teleoperation between the two platforms. An important feature of this interface is the use of different Virtual Reality platforms and different mobile platforms to control one or many robots.
Choi, Hyunseok; Cho, Byunghyun; Masamune, Ken; Hashizume, Makoto; Hong, Jaesung
2016-03-01
Depth perception is a major issue in augmented reality (AR)-based surgical navigation. We propose an AR and virtual reality (VR) switchable visualization system with distance information, and evaluate its performance in a surgical navigation set-up. To improve depth perception, seamless switching from AR to VR was implemented. In addition, the minimum distance between the tip of the surgical tool and the nearest organ was provided in real time. To evaluate the proposed techniques, five physicians and 20 non-medical volunteers participated in experiments. Targeting error, time taken, and numbers of collisions were measured in simulation experiments. There was a statistically significant difference between a simple AR technique and the proposed technique. We confirmed that depth perception in AR could be improved by the proposed seamless switching between AR and VR, and providing an indication of the minimum distance also facilitated the surgical tasks. Copyright © 2015 John Wiley & Sons, Ltd.
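One common way to provide the real-time minimum distance between the tool tip and the nearest organ is a nearest-neighbour query against a precomputed spatial index of the organ surface. The sketch below is an illustration under that assumption, not the authors' implementation; the point cloud is a random placeholder:

```python
import numpy as np
from scipy.spatial import cKDTree

organ_points = np.random.rand(10000, 3) * 100.0   # placeholder organ surface samples (mm)
tree = cKDTree(organ_points)                      # build once, query every frame

def min_tool_to_organ_distance(tool_tip_xyz):
    # Nearest-neighbour distance approximates the point-to-surface distance.
    dist, _ = tree.query(np.asarray(tool_tip_xyz))
    return float(dist)
```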
Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.
Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea
2017-09-29
A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
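The attractive force feedback around the strongest detected source can be modelled, for illustration, as a clamped spring pulling the haptic cursor toward the estimated source location. This is a sketch under assumed gains, not the authors' control law:

```python
import numpy as np

def attractive_force(cursor_xyz, source_xyz, k=0.5, f_max=3.0):
    # Spring-like pull toward the estimated source, clamped to an
    # assumed maximum continuous force for the device (newtons).
    delta = np.asarray(source_xyz) - np.asarray(cursor_xyz)
    force = k * delta
    norm = np.linalg.norm(force)
    return force if norm <= f_max else force * (f_max / norm)
```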
Boosting physics education through mobile augmented reality
NASA Astrophysics Data System (ADS)
Crǎciun, Dana; Bunoiu, Mǎdǎlin
2017-12-01
The integration of collaborative applications based on modern learning technologies and the Internet, of various visualization techniques, and of digital strategies into open, flexible, modern learning environments that facilitate access to resources represents a challenge for physics teachers in Romania in general, and for novice teachers in particular. Although large investments have been made worldwide in educational technologies, their impact on students' learning outcomes is quite modest. In this paper, we describe and analyze various curricular and extracurricular activities specifically designed for and undertaken by pre-service physics teachers. These activities employ new educational technologies, in particular mobile augmented reality (MAR), and are based on modern teaching and learning theories. MAR is the extension of augmented reality to mobile devices: an interactive, real-time combination of real and virtual objects overlaid on the real environment. The results show that pre-service physics teachers are confident in using MAR in their teaching and learning activities, and consider that the activities performed helped them develop the skills needed by science teachers in a technology-based society and reflect on the role of technology in the current Romanian educational context.
Augmented reality in medical education?
Kamphuis, Carolien; Barsom, Esther; Schijven, Marlies; Christoph, Noor
2014-09-01
Learning in the medical domain is to a large extent workplace learning and involves mastery of complex skills that require performance up to professional standards in the work environment. Since training in this real-life context is not always possible for reasons of safety, costs, or didactics, alternative ways are needed to achieve clinical excellence. Educational technology and more specifically augmented reality (AR) has the potential to offer a highly realistic situated learning experience supportive of complex medical learning and transfer. AR is a technology that adds virtual content to the physical real world, thereby augmenting the perception of reality. Three examples of dedicated AR learning environments for the medical domain are described. Five types of research questions are identified that may guide empirical research into the effects of these learning environments. Up to now, empirical research mainly appears to focus on the development, usability and initial implementation of AR for learning. Limited review results reflect the motivational value of AR, its potential for training psychomotor skills and the capacity to visualize the invisible, possibly leading to enhanced conceptual understanding of complex causality.
Virtual Reconstruction of Lost Architectures: from the TLS Survey to AR Visualization
NASA Astrophysics Data System (ADS)
Quattrini, R.; Pierdicca, R.; Frontoni, E.; Barcaglioni, R.
2016-06-01
The exploitation of high-quality 3D models for the dissemination of archaeological heritage is a currently investigated topic, although Mobile Augmented Reality platforms for historical architecture that would allow low-cost pipelines for effective content are not yet available. The paper presents a virtual anastylosis, starting from historical sources and from a 3D model based on a TLS survey. Several efforts and outputs in augmented or immersive environments exploiting this reconstruction are discussed. The work demonstrates the feasibility of a 3D reconstruction approach for complex architectural shapes starting from point clouds, and its AR/VR exploitation, allowing superimposition with the archaeological evidence. The major contributions are the presentation and discussion of a pipeline from the virtual model to its simplification, showing several outcomes and comparing the supported data qualities and the advantages and disadvantages arising from MAR and VR limitations.
When Worlds Collide: An Augmented Reality Check
ERIC Educational Resources Information Center
Villano, Matt
2008-01-01
The technology is simple: Mobile technologies such as handheld computers and global positioning systems work in sync to create an alternate, hybrid world that mixes virtual characters with the actual physical environment. The result is a digital simulation that offers powerful game-playing opportunities and allows students to become more engaged…
Percutaneous spinal fixation simulation with virtual reality and haptics.
Luciano, Cristian J; Banerjee, P Pat; Sorenson, Jeffery M; Foley, Kevin T; Ansari, Sameer A; Rizzi, Silvio; Germanwala, Anand V; Kranzler, Leonard; Chittiboina, Prashant; Roitberg, Ben Z
2013-01-01
In this study, we evaluated the use of a part-task simulator with 3-dimensional and haptic feedback as a training tool for percutaneous spinal needle placement. The goal was to evaluate learning effectiveness, in terms of entry-point/target-point accuracy of percutaneous spinal needle placement, on a high-performance augmented-reality and haptic technology workstation with the ability to control the duration of computer-simulated fluoroscopic exposure, thereby simulating an actual procedure. Sixty-three fellows and residents performed needle placement on the simulator. A virtual needle was percutaneously inserted into a virtual patient's thoracic spine derived from an actual patient computed tomography data set. Ten of 126 needle placement attempts by the 63 participants ended in failure, a failure rate of 7.93%. Across all 126 needle insertions, the average error (15.69 vs 13.91), average fluoroscopy exposure (4.6 vs 3.92), and average individual performance score (32.39 vs 30.71) improved from the first to the second attempt. Performance accuracy yielded P = .04 in a 2-sample t test whose rejected null hypothesis assumed no improvement in performance accuracy from the first to the second attempt in the test session. The experiments thus showed evidence (P = .04) of improvement in performance accuracy from the first to the second percutaneous needle placement attempt. This result, combined with previous learning retention and/or face validity results for using the simulator for open thoracic pedicle screw placement and ventriculostomy catheter placement, supports the efficacy of augmented reality and haptics simulation as a learning tool.
An augmented-reality edge enhancement application for Google Glass.
Hwang, Alex D; Peli, Eli
2014-08-01
Google Glass provides a platform that can easily be extended to include a vision enhancement tool. We have implemented an augmented vision system on Glass, which overlays enhanced edge information on the wearer's real-world view to provide contrast-improved central vision to Glass wearers. The enhanced central vision can be naturally integrated with scanning. Google Glass's camera lens distortions were corrected by image warping. Because the camera and virtual display are horizontally separated by 16 mm, and the camera aiming and virtual display projection angle differ by 10°, the warped camera image had to go through a series of three-dimensional transformations to minimize parallax errors before the final projection to the Glass see-through virtual display. All image processing was implemented to achieve near real-time performance. The impact of the contrast enhancement was measured for three normal-vision subjects, with and without a diffuser film to simulate vision loss. For all three subjects, significantly improved contrast sensitivity was achieved when the subjects used the edge enhancement with a diffuser film. The performance boost is limited by the Glass camera's performance, which the authors assume accounts for why performance improvements were observed only in the diffuser-filter condition (simulating low vision). Improvements were measured with simulated visual impairments. With the benefit of see-through augmented reality edge enhancement, a natural visual scanning process is possible, suggesting that the device may provide better visual function in a cosmetically and ergonomically attractive format for patients with macular degeneration.
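As an illustration of the kind of edge-enhancement overlay described (not the authors' Glass implementation; the Canny thresholds and blending weight are assumptions):

```python
import cv2

def edge_enhanced_overlay(frame_bgr, strength=1.0):
    # Detect edges and add them back onto the frame so that local
    # contrast at contours is boosted in the displayed image.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)
    overlay = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
    return cv2.addWeighted(frame_bgr, 1.0, overlay, strength, 0)
```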
Near-to-eye electroholography via guided-wave acousto-optics for augmented reality
NASA Astrophysics Data System (ADS)
Jolly, Sundeep; Savidis, Nickolaos; Datta, Bianca; Smalley, Daniel; Bove, V. Michael
2017-03-01
Near-to-eye holographic displays act to directly project wavefronts into a viewer's eye in order to recreate 3-D scenes for augmented or virtual reality applications. Recently, several solutions for near-to-eye electroholography have been proposed based on digital spatial light modulators in conjunction with supporting optics, such as holographic waveguides for light delivery; however, such schemes are limited by the inherent low space-bandwidth product available with current digital SLMs. In this paper, we depict a fully monolithic, integrated optical platform for transparent near-to-eye holographic display requiring no supporting optics. Our solution employs a guided-wave acousto-optic spatial light modulator implemented in lithium niobate in conjunction with an integrated Bragg-regime reflection volume hologram.
Virtual rehabilitation--benefits and challenges.
Burdea, G C
2003-01-01
To discuss the advantages and disadvantages of rehabilitation applications of virtual reality. VR can be used as an enhancement to conventional therapy for patients with conditions ranging from musculoskeletal problems, to stroke-induced paralysis, to cognitive deficits. This approach is called "VR-augmented rehabilitation." Alternatively, VR can replace conventional interventions altogether, in which case the rehabilitation is "VR-based." If the intervention is done at a distance, it is called "telerehabilitation." Simulation exercises for post-stroke patients have been developed using a "teacher object" approach or a video game approach. Simulations for musculoskeletal patients use virtual replicas of rehabilitation devices (such as a rubber ball, power putty, or peg board). Phobia-inducing virtual environments are prescribed for patients with cognitive deficits. VR-augmented rehabilitation has been shown to be effective for stroke patients in the chronic phase of the disease. VR-based rehabilitation has improved patients with fear of flying, Vietnam syndrome, and fear of heights, as well as chronic stroke patients. Telerehabilitation interventions using VR have improved musculoskeletal and post-stroke patients; however, less data are available at this time. Virtual reality presents significant advantages when applied to the rehabilitation of patients with varied conditions. These advantages include patient motivation, adaptability and variability based on patient baseline, transparent data storage, online remote data access, economy of scale, and reduced medical costs. Challenges in VR use for rehabilitation relate to therapists' lack of computer skills, lack of support infrastructure, initially expensive equipment, inadequate communication infrastructure (for telerehabilitation in rural areas), and patient safety concerns.
Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback
NASA Astrophysics Data System (ADS)
Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve
2011-03-01
Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.
Avatar - a multi-sensory system for real time body position monitoring.
Jovanov, E; Hanish, N; Courson, V; Stidham, J; Stinson, H; Webb, C; Denny, K
2009-01-01
Virtual reality and computer-assisted physical rehabilitation applications require unobtrusive and inexpensive real-time monitoring systems. Existing systems are usually complex and expensive and based on infrared monitoring. In this paper, we propose Avatar, a hybrid system consisting of off-the-shelf components and sensors. Absolute positioning of a few reference points is determined using infrared diodes on the subject's body and a set of Wii Remotes as optical sensors. Individual body segments are monitored by intelligent inertial sensor nodes (iSense). A network of inertial nodes is controlled by a master node that serves as a gateway for communication with a capture device. Each sensor features a 3D accelerometer and a 2-axis gyroscope. The Avatar system is used to control avatars in Virtual Reality applications, but could also be used in a variety of augmented reality, gaming, and computer-assisted physical rehabilitation applications.
Miragall, Marta; Baños, Rosa M.; Cebolla, Ausiàs; Botella, Cristina
2015-01-01
This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adapted to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, mean age = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman's correlations, and multiple regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency, and adequate convergent validity. "Not changed" patients scored lower on the WAI-VAR than "improved" and "recovered" patients. The correlation between the WAI-VAR and CSC was moderate. The best-fitting model for predicting CSC was a linear combination of the TA with the therapist (WAI-S) and the TA with VR and AR (WAI-VAR), because the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in therapy. PMID:26500589
Human responses to augmented virtual scaffolding models.
Hsiao, Hongwei; Simeonov, Peter; Dotson, Brian; Ammons, Douglas; Kau, Tsui-Ying; Chiou, Sharon
2005-08-15
This study investigated the effect of adding real planks, in virtual scaffolding models of elevation, on human performance in a surround-screen virtual reality (SSVR) system. Twenty-four construction workers and 24 inexperienced controls performed walking tasks on real and virtual planks at three virtual heights (0, 6 m, 12 m) and two scaffolding-platform-width conditions (30, 60 cm). Gait patterns, walking-instability measurements, and cardiovascular reactivity were assessed. The results showed differences in human responses to real vs. virtual planks in walking patterns, instability score, and heart-rate inter-beat intervals; it appeared that adding real planks to the SSVR virtual scaffolding model enhanced the quality of SSVR as a human-environment interface research tool. In addition, there were significant differences in performance between construction workers and the control group: the inexperienced participants were more unstable than the construction workers. Both groups increased their stride length with repetitions of the task, indicating a possibly confidence- or habit-related learning effect. The practical implications of this study lie in the adoption of augmented virtual models of elevated construction environments for injury prevention research, and in the development of programmes for balance-control training to reduce the risk of falls at elevation before workers enter a construction job.
ProMIS augmented reality training of laparoscopic procedures face validity.
Botden, Sanne M B I; Buzink, Sonja N; Schijven, Marlies P; Jakimowicz, Jack J
2008-01-01
Conventional video trainers lack the ability to assess the trainee objectively, but offer modalities that are often missing in virtual reality simulation, such as realistic haptic feedback. The ProMIS augmented reality laparoscopic simulator retains the benefits of a traditional box trainer, by using original laparoscopic instruments and tactile tasks, but additionally generates objective measures of performance. Fifty-five participants performed a "basic skills" and a "suturing and knot-tying" task on ProMIS, after which they filled out a questionnaire regarding the realism, haptics, and didactic value of the simulator on a 5-point Likert scale. The participants were assigned to 2 experience groups: "experienced" (>50 procedures and >5 sutures; N = 27) and "moderately experienced" (<50 procedures and <5 sutures; N = 28). The general consensus among all participants, particularly the experienced, was that ProMIS is a useful tool for training (mean: 4.67, SD: 0.48). It was considered very realistic (mean: 4.44, SD: 0.66), with good haptics (mean: 4.10, SD: 0.97) and didactic value (mean: 4.10, SD: 0.65). This study established the face validity of the ProMIS augmented reality simulator for "basic skills" and "suturing and knot-tying" tasks. ProMIS was considered a good tool for training laparoscopic skills for surgical residents and surgeons.
Hoermann, Simon; Ferreira Dos Santos, Luara; Morkisch, Nadine; Jettkowski, Katrin; Sillis, Moran; Devan, Hemakumar; Kanagasabai, Parimala S; Schmidt, Henning; Krüger, Jörg; Dohle, Christian; Regenbrecht, Holger; Hale, Leigh; Cutfield, Nicholas J
2017-07-01
New rehabilitation strategies for post-stroke upper limb rehabilitation employing visual stimulation show promising results; however, cost-efficient and clinically feasible ways to provide these interventions are still lacking. An integral step is to translate recent technological advances, such as those in virtual and augmented reality, into therapeutic practice to improve outcomes for patients. This requires research on the adaptation of the technology for clinical use as well as on appropriate guidelines and protocols for sustainable integration into therapeutic routines. Here, we present and evaluate a novel and affordable augmented reality system (Augmented Reflection Technology, ART) in combination with a validated mirror therapy protocol for upper limb rehabilitation after stroke. We evaluated components of the therapeutic intervention, from the patients' and the therapists' points of view, in a clinical feasibility study at a rehabilitation centre. We also assessed the integration of ART as an adjunct therapy for the clinical rehabilitation of subacute patients at two different hospitals. The results showed that the combined application of the Berlin Protocol for Mirror Therapy together with ART was feasible for clinical use. This combination was integrated into the therapeutic plan of subacute stroke patients at the two clinical locations where the second part of this research was conducted. Our findings pave the way for using technology to provide mirror therapy in clinical settings and show potential for more effective use of inpatient time and enhanced recoveries for patients. Implications for rehabilitation: computerised mirror therapy is feasible for clinical use; Augmented Reflection Technology can be integrated as an adjunctive therapeutic intervention for subacute stroke patients in an inpatient setting; virtual rehabilitation devices such as Augmented Reflection Technology have considerable potential to enhance stroke rehabilitation.
Feasibility of virtual reality augmented cycling for health promotion of people poststroke.
Deutsch, Judith E; Myslinski, Mary Jane; Kafri, Michal; Ranky, Richard; Sivak, Mark; Mavroidis, Constantinos; Lewis, Jeffrey A
2013-09-01
A virtual reality (VR) augmented cycling kit (VRACK) was developed to address motor control and fitness deficits of individuals with chronic stroke. In this article, we report on the safety, feasibility, and efficacy of using the VR augmented cycling kit to improve cardiorespiratory (CR) fitness of individuals in the chronic phase poststroke. Four individuals with chronic stroke (47-65 years old and ≥3 years poststroke), with residual lower extremity impairments (Fugl-Meyer 24-26/34), who were limited community ambulators (gait speed range 0.56-1.1 m/s) participated in this study. Safety was defined as the absence of adverse events. Feasibility was measured using attendance, total exercise time, and "involvement" measured with the presence questionnaire (PQ). Efficacy of CR fitness was evaluated using a submaximal bicycle ergometer test before and after an 8-week training program. The intervention was safe and feasible with participants having 1 adverse event, 100% adherence, achieving between 90 and 125 minutes of cycling each week, and a mean PQ score of 39 (SD 3.3). There was a statistically significant (13%; P = 0.035) improvement in peak VO2, with a range of 6% to 24.5%. For these individuals, poststroke, VR augmented cycling, using their heart rate to set their avatar's speed, fostered training of sufficient duration and intensity to promote CR fitness. In addition, there was a transfer of training from the bicycle to walking endurance. VR augmented cycling may be an addition to the therapist's tools for concurrent training of mobility and health promotion of individuals poststroke.
NASA Astrophysics Data System (ADS)
Canciani, M.; Conigliaro, E.; Del Grasso, M.; Papalini, P.; Saccone, M.
2016-06-01
The development of close-range photogrammetry has opened many new possibilities for the study of cultural heritage. 3D data acquired with conventional, low-cost cameras can be used to document and investigate the full appearance, materials, and conservation status of an object, to support the restoration process, and to identify intervention priorities. At the same time, although a 3D survey collects a great deal of three-dimensional data for analysis by researchers, there are very few options for 3D output. Augmented reality is one such output: a very low-cost technology that yields a very interesting result. Using simple mobile technology (for iPad and Android tablets) and shareware software (in the case presented, "Augment"), it is possible to share and visualize a large number of 3D models on one's own device. The case study presented is part of an architecture graduate thesis carried out at the Department of Architecture of Roma Tre University in Rome. We developed a photogrammetric survey to study the Aurelian Wall at Castra Praetoria in Rome. The survey of 8000 square meters of surface made it possible to identify the stratigraphy and construction phases of a complex portion of the Aurelian Wall, especially around the northern door of the Castra. During this study, the data from the 3D survey (photogrammetric and topographic) were stored and used to create a reverse 3D model, or virtual reconstruction, of the northern door of the Castra. This virtual reconstruction shows the door in the Tiberian period; today it is completely hidden by a curtain wall, but small yet significant architectural details reveal its original features. The 3D model of the ancient walls was mapped with the exact type of bricks and mortar, and oriented and scaled according to the existing structure for use in augmented reality. Finally, two kinds of application were developed: one on site, where the virtual reconstruction is superimposed on the existing walls using image recognition, and one off-site, built around a poster, so that the results could also be shown during the graduation day.
NASA Astrophysics Data System (ADS)
Cheok, Adrian David
This chapter details the Human Pacman system to illuminate entertainment computing that ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on the infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy combined with real human-social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area, allowing natural wide-area human physical movement. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer-graphics fantasy-reality through the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that is anchored in physicality, mobility, social interaction, and ubiquitous computing.
Computer-Based Technologies in Dentistry: Types and Applications
Albuha Al-Mussawi, Raja’a M.; Farid, Farzaneh
2016-01-01
During dental education, dental students learn how to examine patients, make diagnoses, plan treatment, and perform dental procedures correctly and efficiently. However, progress in computer-based technologies, including virtual reality (VR) simulators, augmented reality (AR), and computer-aided design/computer-aided manufacturing (CAD/CAM) systems, has resulted in new modalities for the instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective, and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and overlay informative 3D images of invisible regions on it to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their applications in dentistry, and their potential and limitations in promoting dental education, training, and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. PMID:28392819
ERIC Educational Resources Information Center
Brown, Abbie Howard
1999-01-01
Describes and discusses how simulation activities can be used in teacher education to augment the traditional field-experience approach, focusing on artificial intelligence, virtual reality, and intelligent tutoring systems. Includes an overview of simulation as a teaching and learning strategy and specific examples of high-technology simulations…
The Effectiveness of Virtual and Augmented Reality in Health Sciences and Medical Anatomy
ERIC Educational Resources Information Center
Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan
2017-01-01
Although cadavers constitute the gold standard for teaching anatomy to medical and health science students, there are substantial financial, ethical, and supervisory constraints on their use. In addition, although anatomy remains one of the fundamental areas of medical education, universities have decreased the hours allocated to teaching gross…
A Review on Making Things See: Augmented Reality for Futuristic Virtual Educator
ERIC Educational Resources Information Center
Iqbal, Javid; Sidhu, Manjit Singh
2017-01-01
In the past few years many choreographers have focused upon implementation of computer technology to enhance their artistic skills. Computer vision technology presents new methods for learning, instructing, developing, and assessing physical movements as well as provides scope to expand dance resources and rediscover the learning process. This…
ERIC Educational Resources Information Center
Duncan, Mike R.; Birrell, Bob; Williams, Toni
2005-01-01
Virtual Reality (VR) is primarily a visual technology. Elements such as haptics (touch feedback) and sound can augment an experience, but the visual cues are the prime driver of what an audience will experience from a VR presentation. At its inception in 2001 the Centre for Advanced Visualization (CFAV) at Niagara College of Arts and Technology…
The Role of Immersive Media in Online Education
ERIC Educational Resources Information Center
Bronack, Stephen C.
2011-01-01
An increasing number of educators are integrating immersive media into core course offerings. Virtual worlds, serious games, simulations, and augmented reality are enabling students and instructors to connect with content and with one another in novel ways. As a result, many are investigating the new affordances these media provide and the impact…
A spatially augmented reality sketching interface for architectural daylighting design.
Sheng, Yu; Yapo, Theodore C; Young, Christopher; Cutler, Barbara
2011-01-01
We present an application of interactive global illumination and spatially augmented reality to architectural daylight modeling that allows designers to explore alternative designs and new technologies for improving the sustainability of their buildings. Images of a model in the real world, captured by a camera above the scene, are processed to construct a virtual 3D model. To achieve interactive rendering rates, we use a hybrid rendering technique, leveraging radiosity to simulate the interreflectance between diffuse patches, and shadow volumes to generate per-pixel direct illumination. The rendered images are then projected onto the real model by four calibrated projectors to help users study the daylighting illumination. The virtual heliodon is a physical design environment in which multiple designers, a designer and a client, or a teacher and students can gather to experience animated visualizations of the natural illumination within a proposed design by controlling the time of day, season, and climate. Furthermore, participants may interactively redesign the geometry and materials of the space by manipulating physical design elements and see the updated lighting simulation. © 2011 IEEE. Published by the IEEE Computer Society.
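A minimal sketch of the hybrid shading idea described above, assuming a single sun-like light source: per-pixel direct lighting is gated by a shadow test (standing in for the shadow-volume result), and a precomputed diffuse interreflection term (standing in for the radiosity solution) is added on top. All names and values are illustrative, not the authors' implementation.

```python
import numpy as np

def shade_pixel(n, light_dir, light_rgb, albedo, in_shadow, indirect_rgb):
    """Combine per-pixel direct lighting (gated by a shadow test) with a
    precomputed diffuse interreflection (radiosity) term.

    n            -- unit surface normal at the pixel
    light_dir    -- unit vector from the surface point toward the light
    light_rgb    -- direct light intensity (RGB)
    albedo       -- diffuse surface reflectance (RGB)
    in_shadow    -- True if the shadow test reports the point as occluded
    indirect_rgb -- radiosity solution sampled at the enclosing patch (RGB)
    """
    n_dot_l = max(0.0, float(np.dot(n, light_dir)))
    visibility = 0.0 if in_shadow else 1.0
    direct = visibility * n_dot_l * light_rgb * albedo
    return direct + indirect_rgb  # radiosity term is assumed to include albedo

# toy usage: an unoccluded, upward-facing patch lit from 45 degrees
n = np.array([0.0, 0.0, 1.0])
l = np.array([0.0, 0.7071, 0.7071])
print(shade_pixel(n, l, np.array([1.0, 0.95, 0.9]),
                  np.array([0.6, 0.6, 0.6]), False,
                  np.array([0.05, 0.05, 0.06])))
```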
Designing and researching of the virtual display system based on the prism elements
NASA Astrophysics Data System (ADS)
Vasilev, V. N.; Grimm, V. A.; Romanova, G. E.; Smirnov, S. A.; Bakholdin, A. V.; Grishina, N. Y.
2014-05-01
Problems in designing virtual display systems for augmented reality placed near the observer's eye (so-called head-worn displays) with light-guide prismatic elements are considered. An augmented reality system is a complex consisting of an image generator (most often a microdisplay with an illumination system, if the display is not self-luminous), an objective that forms the display image practically at infinity, and a combiner that splits the light so that the observer can see the information on the microdisplay and the surrounding environment as the background at the same time. This work deals with a system whose combiner is based on a composite structure of prism elements. Three cases of the prism combiner design are considered, and the results of modeling with optical design software are presented. In the model, the question of a large pupil zone is analyzed, and the discontinuous character (mosaic structure) of the angular field in the transmission of information from the microdisplay to the observer's eye through the prismatic structure is discussed.
Explore and experience: mobile augmented reality for medical training.
Albrecht, Urs-Vito; Noll, Christoph; von Jan, Ute
2013-01-01
In medicine, especially in basic education, it may sometimes be inappropriate to integrate real patients into classes due to ethical issues that must be avoided. Nevertheless, the quality of medical education may suffer without the use of real cases. This is especially true of medical specialties such as legal medicine: survivors of a crime are already subjected to procedures that constitute a severe emotional burden and may cause additional distress even without the added presence of students. Using augmented reality-based applications may alleviate this ethical dilemma by giving students the possibility to practice the necessary skills based on virtual but nevertheless almost realistic cases. The app "mARble®" that is presented in this paper follows this approach. The currently available learning module for legal medicine gives users an opportunity to learn about various wound patterns by virtually overlaying them on their own skin and is applicable in different learning settings. Preliminary evaluation results covering learning efficiency and emotional components of the learning process are promising. Content modules for other medical specialties are currently under construction.
Realistic Real-Time Outdoor Rendering in Augmented Reality
Kolivand, Hoshang; Sunar, Mohd Shahrizal
2014-01-01
Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic over the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are related to non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering, taking into account the interaction between sky colours and objects in AR systems with respect to shadows at any specific location, date, and time. This approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480
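As a rough illustration of the first phase (sky colour driven by the sun's position), the sketch below computes an approximate solar elevation from latitude, day of year, and local solar time using standard low-accuracy formulas, then blends a sky colour between a warm horizon tint and a zenith blue. It is a stand-in for the general idea, not the paper's sky model or the Z-GaF pipeline; all constants are assumptions.

```python
import math

def solar_elevation_deg(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation (degrees) from latitude, day of year,
    and local solar time. Standard low-accuracy formulas; ignores
    refraction, longitude, and equation-of-time corrections."""
    decl = math.radians(-23.44) * math.cos(2.0 * math.pi / 365.0 * (day_of_year + 10))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_alt))

def sky_colour(elevation_deg):
    """Very rough sky colour: blend from a warm horizon tint toward deep
    blue as the sun climbs; near-black when the sun is below the horizon."""
    if elevation_deg <= 0:
        return (0.02, 0.02, 0.05)
    t = min(1.0, elevation_deg / 60.0)
    horizon, zenith = (0.95, 0.65, 0.35), (0.25, 0.45, 0.85)
    return tuple(h + t * (z - h) for h, z in zip(horizon, zenith))

# noon at a mid-northern latitude around the June solstice
elev = solar_elevation_deg(45.0, 172, 12.0)
print(round(elev, 1), sky_colour(elev))
```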
Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion
Fang, Wei; Zheng, Lianyu; Deng, Huanjun; Zhang, Hongbo
2017-01-01
In mobile augmented/virtual reality (AR/VR), real-time 6-Degree of Freedom (DoF) motion tracking is essential for the registration between virtual scenes and the real world. However, due to the limited computational capacity of mobile terminals today, the latency between consecutive arriving poses would damage the user experience in mobile AR/VR. Thus, a visual-inertial based real-time motion tracking for mobile AR/VR is proposed in this paper. By means of high frequency and passive outputs from the inertial sensor, the real-time performance of arriving poses for mobile AR/VR is achieved. In addition, to alleviate the jitter phenomenon during the visual-inertial fusion, an adaptive filter framework is established to cope with different motion situations automatically, enabling the real-time 6-DoF motion tracking by balancing the jitter and latency. Besides, the robustness of the traditional visual-only based motion tracking is enhanced, giving rise to a better mobile AR/VR performance when motion blur is encountered. Finally, experiments are carried out to demonstrate the proposed method, and the results show that this work is capable of providing a smooth and robust 6-DoF motion tracking for mobile AR/VR in real-time. PMID:28475145
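To make the jitter/latency trade-off concrete, here is a toy, position-only complementary filter in the spirit of the adaptive fusion described above: IMU samples are integrated at a high rate for low-latency prediction, and each (slower) visual pose fix is blended in with a gain that grows with the estimated motion. This is an illustrative sketch under simplifying assumptions (gravity-compensated accelerations, no orientation estimation), not the authors' filter.

```python
import numpy as np

class AdaptiveVisualInertialFilter:
    """Toy position-only complementary filter: integrate accelerometer data
    at IMU rate, and blend in visual pose fixes with a gain that adapts to
    how fast the device is moving (more smoothing when nearly still to
    suppress jitter, quicker correction when moving to limit drift)."""

    def __init__(self, k_min=0.05, k_max=0.6):
        self.p = np.zeros(3)   # position estimate
        self.v = np.zeros(3)   # velocity estimate
        self.k_min, self.k_max = k_min, k_max

    def predict(self, accel_world, dt):
        # high-rate IMU propagation (gravity assumed already removed)
        self.v += accel_world * dt
        self.p += self.v * dt

    def correct(self, visual_pos):
        # adapt the blend gain to the current motion magnitude
        speed = float(np.linalg.norm(self.v))
        k = self.k_min + (self.k_max - self.k_min) * min(1.0, speed / 0.5)
        residual = visual_pos - self.p
        self.p += k * residual

f = AdaptiveVisualInertialFilter()
for _ in range(20):                      # 20 IMU samples at 200 Hz
    f.predict(np.array([0.1, 0.0, 0.0]), dt=0.005)
f.correct(np.array([0.001, 0.0, 0.0]))   # a visual pose fix arrives
print(f.p)
```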
A Survey of Mobile and Wireless Technologies for Augmented Reality Systems (Preprint)
2008-02-01
Windows XP. A number of researchers have started employing them in AR simulations such as Wagner et al [25], Newman et al [46] and specifically the Sony ...different music clubs and styles of music according to the selection and tastes of the listeners. In the intro sequence the user can select an animated...3-D character (avatar) as his or her virtual persona and visit the different music rooms in the virtual disco. Users can download or stream music in
Augmented reality in neurosurgery: a systematic review.
Meola, Antonio; Cutolo, Fabrizio; Carbone, Marina; Cagnazzo, Federico; Ferrari, Mauro; Ferrari, Vincenzo
2017-10-01
Neuronavigation has become an essential neurosurgical tool in pursuing minimal invasiveness and maximal safety, even though it has several technical limitations. Augmented reality (AR) neuronavigation is a significant advance, providing a real-time updated 3D virtual model of anatomical details overlaid on the real surgical field. Currently, only a few AR systems have been tested in a clinical setting. The aim of this work is to review such devices. We performed a PubMed search of reports restricted to human studies of in vivo applications of AR in any neurosurgical procedure using the search terms "Augmented reality" and "Neurosurgery." Eligibility assessment was performed independently by two reviewers in an unblinded standardized manner. The systems were qualitatively evaluated on the basis of the following: neurosurgical subspecialty of application, pathology of treated lesions and lesion locations, real data source, virtual data source, tracking modality, registration technique, visualization processing, display type, and perception location. Eighteen studies were included during the period 1996 to September 30, 2015. The AR systems were grouped by the real data source: microscope (8), hand- or head-held cameras (4), direct patient view (2), endoscope (1), X-ray fluoroscopy (1), and head-mounted display (1). A total of 195 lesions were treated: 75 (38.46%) were neoplastic, 77 (39.48%) neurovascular, 1 (0.51%) hydrocephalus, and 42 (21.53%) undetermined. Current literature confirms that AR is a reliable and versatile tool when performing minimally invasive approaches in a wide range of neurosurgical diseases, although prospective randomized studies are not yet available and technical improvements are needed.
The Architectonic Experience of Body and Space in Augmented Interiors
Pasqualini, Isabella; Blefari, Maria Laura; Tadi, Tej; Serino, Andrea; Blanke, Olaf
2018-01-01
The environment shapes our experience of space in constant interaction with the body. Architectonic interiors amplify the perception of space through the bodily senses; an effect also known as embodiment. The interaction of the bodily senses with the space surrounding the body can be tested experimentally through the manipulation of multisensory stimulation and measured via a range of behaviors related to bodily self-consciousness. Many studies have used Virtual Reality to show that visuotactile conflicts mediated via a virtual body or avatar can disrupt the unified subjective experience of the body and self. In the full-body illusion paradigm, participants feel as if the avatar was their body (ownership, self-identification) and they shift their center of awareness toward the position of the avatar (self-location). However, the influence of non-bodily spatial cues around the body on embodiment remains unclear, and data about the impact of architectonic space on human perception and self-conscious states are sparse. We placed participants into a Virtual Reality arena, where large and narrow virtual interiors were displayed with and without an avatar. We then applied synchronous or asynchronous visuotactile strokes to the back of the participants and avatar, or, to the front wall of the void interiors. During conditions of illusory self-identification with the avatar, participants reported sensations of containment, drift, and touch with the architectonic environment. The absence of the avatar suppressed such feelings, yet, in the large space, we found an effect of continuity between the physical and the virtual interior depending on the full-body illusion. We discuss subjective feelings evoked by architecture and compare the full-body illusion in augmented interiors to architectonic embodiment. A relevant outcome of this study is the potential to dissociate the egocentric, first-person view from the physical point of view through augmented architectonic space. PMID:29755378
Pose tracking for augmented reality applications in outdoor archaeological sites
NASA Astrophysics Data System (ADS)
Younes, Georges; Asmar, Daniel; Elhajj, Imad; Al-Harithy, Howayda
2017-01-01
In recent years, agencies around the world have invested huge amounts of effort in digitizing many aspects of the world's cultural heritage. Of particular importance is the digitization of outdoor archaeological sites. In the spirit of valorizing this digital information, many groups have developed virtual or augmented reality (AR) computer applications themed around a particular archaeological object. The problem of pose tracking in outdoor AR applications is addressed. Different positional systems are analyzed, resulting in the selection of a monocular camera-based user tracker. The limitations that challenge this technique, from map generation, scale, and anchoring to lighting conditions, are analyzed and systematically addressed. Finally, as a case study, our pose tracking system is implemented within an AR experience in the Byblos Roman theater in Lebanon.
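For readers unfamiliar with monocular camera-based pose tracking, the core step is estimating the camera's position and orientation from correspondences between known 3D points on the site and their 2D detections in the current frame (the Perspective-n-Point problem). The sketch below uses OpenCV's solvePnP on four invented coplanar anchor points with made-up intrinsics; it illustrates the general technique, not this paper's tracker.

```python
import cv2
import numpy as np

# Hypothetical 3D anchor points surveyed on the site model (metres) and
# their detected 2D projections in the current camera frame (pixels).
object_points = np.array([[0.0, 0.0, 0.0],
                          [2.0, 0.0, 0.0],
                          [2.0, 1.5, 0.0],
                          [0.0, 1.5, 0.0]], dtype=np.float64)
image_points = np.array([[320.0, 410.0],
                         [540.0, 405.0],
                         [545.0, 260.0],
                         [325.0, 255.0]], dtype=np.float64)

# Intrinsics from a prior calibration (fx, fy, cx, cy are made-up values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)           # rotation matrix of the camera pose
camera_position = -R.T @ tvec        # camera centre in site coordinates
print(ok, camera_position.ravel())
```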
A see through future: augmented reality and health information systems.
Monkman, Helen; Kushniruk, Andre W
2015-01-01
Augmented Reality (AR) is a method whereby virtual objects are superimposed on the real world. AR technology is becoming increasingly accessible and affordable, and it has many potential health applications. This paper discusses current research on AR health applications such as medical education and medical practice. Some of the potential future uses for this technology (e.g., health information systems, consumer health applications) are also presented. Additionally, there is a discussion outlining some of the usability and human factors challenges associated with AR in healthcare. It is expected that AR will become increasingly prevalent in healthcare; however, further investigation is required to demonstrate that AR applications provide benefits over traditional methods. Moreover, AR applications must be thoroughly tested to ensure they do not introduce new errors into practice that could compromise patient safety.
Visual error augmentation enhances learning in three dimensions.
Sharp, Ian; Huang, Felix; Patton, James
2011-09-02
Because recent preliminary evidence points to the use of error augmentation (EA) for motor learning enhancement, we visually enhanced deviations from a straight-line path while subjects practiced a sensorimotor reversal task, similar to laparoscopic surgery. Our study asked 10 healthy subjects in two groups to perform targeted reaching in a simulated virtual reality environment, where the transformation of the hand position matrix was a complete reversal: rotated 180 degrees about an arbitrary axis (hence 2 of the 3 coordinates are reversed). Our data showed that after 500 practice trials, error-augmented-trained subjects reached the desired targets more quickly and with lower error (differences of 0.4 seconds and 0.5 cm Maximum Perpendicular Trajectory deviation) when compared with the control group. Furthermore, the manner in which subjects practiced was influenced by the error augmentation, resulting in more continuous motions and smaller errors for this group. Even with the extreme sensory discordance of a reversal, these data further support that distorted reality can promote more complete adaptation/learning when compared with regular training. Lastly, upon removal of the flip, all subjects quickly returned to baseline within 6 trials.
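A 180-degree rotation about a single axis is a proper rotation that flips the two coordinates orthogonal to that axis, which is the kind of "complete reversal" mapping described. The sketch below applies such a flip to a hand position and then magnifies the perpendicular deviation of the resulting cursor from the straight start-to-target line, the generic form of visual error augmentation; the gain and geometry are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# 180-degree rotation about the x-axis: a proper rotation (det = +1) that
# reverses two of the three coordinates (y and z).
R_flip = np.diag([1.0, -1.0, -1.0])

def augmented_cursor(hand_pos, start, target, gain=1.5):
    """Map the hand through the reversal, then visually magnify its
    perpendicular deviation from the straight start-target line by `gain`
    (gain and the overall scheme are illustrative, not the study's values)."""
    p = R_flip @ hand_pos
    d = (target - start) / np.linalg.norm(target - start)
    along = start + np.dot(p - start, d) * d      # projection onto the line
    deviation = p - along                          # perpendicular error
    return along + gain * deviation

start, target = np.zeros(3), np.array([0.2, 0.0, 0.0])
hand = np.array([0.1, 0.02, -0.01])
print(augmented_cursor(hand, start, target))
```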
An Augmented-Reality Edge Enhancement Application for Google Glass
Hwang, Alex D.; Peli, Eli
2014-01-01
Purpose: Google Glass provides a platform that can be easily extended to include a vision enhancement tool. We have implemented an augmented vision system on Glass, which overlays enhanced edge information over the wearer's real-world view, to provide contrast-improved central vision to Glass wearers. The enhanced central vision can be naturally integrated with scanning. Methods: Google Glass's camera lens distortions were corrected by image warping. Since the camera and virtual display are horizontally separated by 16 mm, and the camera aiming and virtual display projection angle are off by 10°, the warped camera image had to go through a series of 3D transformations to minimize parallax errors before the final projection to the Glass's see-through virtual display. All image processing was implemented to achieve near real-time performance. The impact of the contrast enhancements was measured for three normal-vision subjects, with and without a diffuser film to simulate vision loss. Results: For all three subjects, significantly improved contrast sensitivity was achieved when the subjects used the edge enhancements with a diffuser film. The performance boost is limited by the Glass camera's performance. The authors assume this accounts for why performance improvements were observed only in the diffuser-filter condition (simulating low vision). Conclusions: Improvements were measured with simulated visual impairments. With the benefit of see-through augmented reality edge enhancement, a natural visual scanning process is possible, which suggests that the device may provide better visual function in a cosmetically and ergonomically attractive format for patients with macular degeneration. PMID:24978871
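The overlay itself can be prototyped with any standard edge detector. The following sketch (using OpenCV's Canny detector with arbitrary thresholds, not the authors' pipeline or their Glass-specific distortion and parallax corrections) paints detected edges in white over a camera frame, the basic form of a contrast-enhancing edge overlay.

```python
import cv2
import numpy as np

def edge_enhanced(frame_bgr, low=50, high=150, edge_color=(255, 255, 255)):
    """Overlay bright edge contours on a camera frame -- a generic stand-in
    for a contrast-enhancing edge overlay (thresholds and colour are
    arbitrary, not the authors' parameters)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low, high)
    out = frame_bgr.copy()
    out[edges > 0] = edge_color      # paint detected edges over the scene
    return out

# usage with a synthetic frame (a dark square on a grey background)
frame = np.full((240, 320, 3), 120, dtype=np.uint8)
cv2.rectangle(frame, (100, 80), (220, 160), (40, 40, 40), thickness=-1)
cv2.imwrite("edge_overlay.png", edge_enhanced(frame))
```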
Villiger, Michael; Bohli, Dominik; Kiper, Daniel; Pyk, Pawel; Spillmann, Jeremy; Meilick, Bruno; Curt, Armin; Hepp-Reymond, Marie-Claude; Hotz-Boendermaker, Sabina; Eng, Kynan
2013-10-01
Neurorehabilitation interventions to improve lower limb function and neuropathic pain have had limited success in people with chronic, incomplete spinal cord injury (iSCI). We hypothesized that intense virtual reality (VR)-augmented training of observed and executed leg movements would improve limb function and neuropathic pain. Patients used a VR system with a first-person view of virtual lower limbs, controlled via movement sensors fitted to the patient's own shoes. Four tasks were used to deliver intensive training of individual muscles (tibialis anterior, quadriceps, leg ad-/abductors). The tasks engaged motivation through feedback of task success. Fourteen chronic iSCI patients were treated over 4 weeks in 16 to 20 sessions of 45 minutes. Outcome measures were 10 Meter Walking Test, Berg Balance Scale, Lower Extremity Motor Score, Spinal Cord Independence Measure, Locomotion and Neuropathic Pain Scale (NPS), obtained at the start and at 4 to 6 weeks before intervention. In addition to positive changes reported by the patients (Patients' Global Impression of Change), measures of walking capacity, balance, and strength revealed improvements in lower limb function. Intensity and unpleasantness of neuropathic pain in half of the affected participants were reduced on the NPS test. Overall findings remained stable 12 to 16 weeks after termination of the training. In a pretest/posttest, uncontrolled design, VR-augmented training was associated with improvements in motor function and neuropathic pain in persons with chronic iSCI, several of which reached the level of a minimal clinically important change. A controlled trial is needed to compare this intervention to active training alone or in combination.
Yudkowsky, Rachel; Luciano, Cristian; Banerjee, Pat; Schwartz, Alan; Alaraj, Ali; Lemole, G Michael; Charbel, Fady; Smith, Kelly; Rizzi, Silvio; Byrne, Richard; Bendok, Bernard; Frim, David
2013-02-01
Ventriculostomy is a neurosurgical procedure for providing therapeutic cerebrospinal fluid drainage. Complications may arise during repeated attempts at placing the catheter in the ventricle. We studied the impact of simulation-based practice with a library of virtual brains on neurosurgery residents' performance in simulated and live surgical ventriculostomies. Using computed tomographic scans of actual patients, we developed a library of 15 virtual brains for the ImmersiveTouch system, a head- and hand-tracked augmented reality and haptic simulator. The virtual brains represent a range of anatomies including normal, shifted, and compressed ventricles. Neurosurgery residents participated in individual simulator practice on the library of brains including visualizing the 3-dimensional location of the catheter within the brain immediately after each insertion. Performance of participants on novel brains in the simulator and during actual surgery before and after intervention was analyzed using generalized linear mixed models. Simulator cannulation success rates increased after intervention, and live procedure outcomes showed improvement in the rate of successful cannulation on the first pass. However, the incidence of deeper, contralateral (simulator) and third-ventricle (live) placements increased after intervention. Residents reported that simulations were realistic and helpful in improving procedural skills such as aiming the probe, sensing the pressure change when entering the ventricle, and estimating how far the catheter should be advanced within the ventricle. Simulator practice with a library of virtual brains representing a range of anatomies and difficulty levels may improve performance, potentially decreasing complications due to inexpert technique.
NASA Astrophysics Data System (ADS)
Engelhardt, Sandy; Kolb, Silvio; De Simone, Raffaele; Karck, Matthias; Meinzer, Hans-Peter; Wolf, Ivo
2016-03-01
Mitral valve annuloplasty describes a surgical procedure in which an artificial prosthesis is sutured onto the anatomical structure of the mitral annulus to re-establish the valve's functionality. Choosing an appropriate commercially available ring size and shape is a difficult decision the surgeon has to make intraoperatively according to his or her experience. In our augmented-reality framework, digitized ring models are superimposed onto endoscopic image streams without using any additional hardware. To place the ring model at the proper position within the endoscopic image plane, a pose estimation is performed that depends on the localization of sutures placed by the surgeon around the leaflet origins and punctured through the stiffer structure of the annulus. In this work, the tissue penetration points are tracked by the real-time-capable Lucas-Kanade optical flow algorithm. The accuracy and robustness of this tracking algorithm are investigated with respect to the question of whether outliers influence the subsequent pose estimation. Our results suggest that optical flow is very stable for a variety of different endoscopic scenes and that tracking errors do not affect the position of the superimposed virtual objects in the scene, making this approach a viable candidate for annuloplasty augmented reality-enhanced decision support.
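Pyramidal Lucas-Kanade tracking of a sparse set of annotated points is available directly in OpenCV, and a minimal version of the tracking step described above might look like the sketch below (window size, pyramid depth, and the synthetic test frames are assumptions for illustration; the status flags allow failed points to be dropped before pose estimation).

```python
import cv2
import numpy as np

lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

def track_suture_points(prev_gray, next_gray, points):
    """Track annotated tissue-penetration points from one endoscopic frame
    to the next with pyramidal Lucas-Kanade optical flow; points that fail
    the status check can be excluded from the subsequent pose estimation."""
    p0 = points.astype(np.float32).reshape(-1, 1, 2)
    p1, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None,
                                               **lk_params)
    return p1.reshape(-1, 2), status.ravel().astype(bool)

# usage with two synthetic frames in which a bright blob shifts by 3 px
prev_f = np.zeros((200, 200), dtype=np.uint8)
next_f = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(prev_f, (100, 100), 5, 255, -1)
cv2.circle(next_f, (103, 100), 5, 255, -1)
pts, ok = track_suture_points(prev_f, next_f, np.array([[100.0, 100.0]]))
print(pts[ok])
```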
archAR: an archaeological augmented reality experience
NASA Astrophysics Data System (ADS)
Wiley, Bridgette; Schulze, Jürgen P.
2015-03-01
We present an application for Android phones or tablets called "archAR" that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD's Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm's Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to "zoom" into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.
Measuring the Usability of Augmented Reality e-Learning Systems: A User-Centered Evaluation Approach
NASA Astrophysics Data System (ADS)
Pribeanu, Costin; Balog, Alexandru; Iordache, Dragoş Daniel
The development of Augmented Reality (AR) systems is creating new challenges and opportunities for the designers of e-learning systems. The mix of real and virtual requires appropriate interaction techniques that have to be evaluated with users in order to avoid usability problems. Formative usability evaluation aims at finding usability problems as early as possible in the development life cycle and is well suited to supporting the development of such novel interactive systems. This work presents an approach to the user-centered usability evaluation of an e-learning scenario for Biology developed on an Augmented Reality educational platform. The evaluation was carried out during and after a summer school held within the ARiSE research project. The basic idea was to perform usability evaluation twice. In this respect, we conducted user testing with a small number of students during the summer school in order to get fast feedback from users having good knowledge of Biology. Then, we repeated the user testing under different conditions and with a relatively larger number of representative users. In this paper we describe both experiments and compare the usability evaluation results.
Alaraj, Ali; Charbel, Fady T; Birk, Daniel; Tobin, Matthew; Tobin, Mathew; Luciano, Cristian; Banerjee, Pat P; Rizzi, Silvio; Sorenson, Jeff; Foley, Kevin; Slavin, Konstantin; Roitberg, Ben
2013-01-01
Recent studies have shown that mental script-based rehearsal and simulation-based training improve the transfer of surgical skills in various medical disciplines. Despite significant advances in technology and intraoperative techniques over the last several decades, surgical skills training on neurosurgical operations still carries a significant risk of serious morbidity or mortality. Potentially avoidable technical errors are well recognized as contributing to poor surgical outcomes. Surgical education is undergoing overwhelming change as a result of the reduction of work hours and current trends focusing on patient safety and linking reimbursement with clinical outcomes. Thus, there is a need for adjunctive means of neurosurgical training; one such recent advancement is simulation technology. ImmersiveTouch is an augmented reality system that integrates a haptic device and a high-resolution stereoscopic display. This simulation platform uses multiple sensory modalities, re-creating many of the environmental cues experienced during an actual procedure. Available modules include ventriculostomy, bone drilling, percutaneous trigeminal rhizotomy, and simulated spinal modules such as pedicle screw placement, vertebroplasty, and lumbar puncture. We present our experience with the development of such augmented reality neurosurgical modules and the feedback from neurosurgical residents.
[Interactive augmented reality systems : Aid for personalized patient education and rehabilitation].
Bork, F
2018-04-01
During patient education, information exchange plays a critical role both for patient compliance during medical or rehabilitative treatment and for obtaining informed consent for an operative procedure. This article highlights the augmented reality system "Magic Mirror" as an additional tool for patient education, rehabilitation, and anatomical education. The Magic Mirror system allows the user to inspect both a detailed model of the 3-dimensional anatomy of the human body and volumetric slice images in a virtual mirror environment. First preliminary results from the areas of rehabilitation and anatomy learning indicate the broad potential of the Magic Mirror. The system also offers interesting advantages for patient education compared with traditional methods of information exchange. Novel technologies such as augmented reality are a door opener for many innovations in medicine. In the future, patient-specific systems such as the Magic Mirror will be used increasingly in areas such as patient education and rehabilitation. To maximize the benefits of such systems, further evaluation studies are necessary to identify the best use cases and to start an iterative optimization process for these systems.
A virtual tour of geological heritage: Valourising geodiversity using Google Earth and QR code
NASA Astrophysics Data System (ADS)
Martínez-Graña, A. M.; Goy, J. L.; Cimarra, C. A.
2013-12-01
When making land-use plans, it is necessary to inventory and catalogue the geological heritage and geodiversity of a site in order to establish an apolitical conservation protection plan that meets the educational and social needs of society. New technologies make it possible to create virtual databases using virtual globes - e.g., Google Earth - and other personal geomatics applications (smartphones, tablets, PDAs) for accessing geological heritage information in "real time" for scientific, educational, and cultural purposes via a virtual geological itinerary. Seventeen mapped and georeferenced geosites have been encoded in Keyhole Markup Language as map layers used at the stops of the geological itinerary in different applications. A virtual tour has been developed for Las Quilamas Natural Park, located in the Spanish Central System, using geological layers and topographic and digital terrain models that can be overlaid in a 3D model. The Google Earth application was used to import the geosite placemarks. For each geosite, a tab has been developed that shows a description of the geology with photographs and diagrams and that evaluates its scientific, educational, and tourism quality. Augmented reality allows the user to access these georeferenced thematic layers and overlay data, images, and graphics in real time on their mobile devices. These virtual tours can be incorporated into subject guides designed for the public. Seven educational and interpretive panels describing some of the geosites were designed and tagged with a QR code that can be printed at each stop or in the printed itinerary. These QR codes can be scanned with the camera found on most mobile devices, and video virtual tours can be viewed on these devices. The virtual tour of the geological heritage can be used to show tourists the geological history of Las Quilamas Natural Park using new geomatics technologies (virtual globes, augmented reality, and QR codes).
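Each georeferenced geosite ends up as a KML placemark that Google Earth or a mobile virtual globe can load as an itinerary layer. The sketch below writes a minimal KML file for two invented geosites (coordinates, names, and descriptions are placeholders, not the park's actual inventory).

```python
# Minimal sketch of writing geosite placemarks to a KML file that Google
# Earth (or a mobile virtual-globe app) can load as an itinerary layer.
# The geosites below use invented coordinates, not the paper's inventory.
geosites = [
    {"name": "Geosite 1 - Fold structure", "lon": -6.05, "lat": 40.52,
     "description": "Scientific/educational/tourism scores would go here."},
    {"name": "Geosite 2 - Quartzite ridge", "lon": -6.02, "lat": 40.55,
     "description": "Photographs and diagrams can be linked from the tab."},
]

placemarks = "\n".join(
    "  <Placemark>\n"
    f"    <name>{g['name']}</name>\n"
    f"    <description>{g['description']}</description>\n"
    f"    <Point><coordinates>{g['lon']},{g['lat']},0</coordinates></Point>\n"
    "  </Placemark>"
    for g in geosites
)

kml = ("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
       "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n"
       f"<Document>\n{placemarks}\n</Document>\n</kml>\n")

with open("geosites.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```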
Augmented reality-guided artery-first pancreatico-duodenectomy.
Marzano, Ettore; Piardi, Tullio; Soler, Luc; Diana, Michele; Mutter, Didier; Marescaux, Jacques; Pessaux, Patrick
2013-11-01
Augmented Reality (AR) in surgery consists of the fusion of synthetic computer-generated images (a 3D virtual model) obtained from the preoperative medical imaging work-up with real-time patient images, with the aim of visualizing unapparent anatomical details. The potential of AR navigation as a tool to improve the safety of surgical dissection is presented in a case of pancreatico-duodenectomy (PD). A 77-year-old male patient underwent an AR-assisted PD. The 3D virtual anatomical model was obtained from a thoraco-abdominal CT scan using customary software (VR-RENDER®, IRCAD). The virtual model was superimposed on the operative field using an exoscope (VITOM®, Karl Storz, Tüttlingen, Germany) as well as several visible landmarks (inferior vena cava, left renal vein, aorta, superior mesenteric vein, inferior margin of the pancreas). A computer scientist manually registered virtual and real images in real time using a video mixer (MX 70; Panasonic, Secaucus, NJ). Dissection of the superior mesenteric artery and the hanging maneuver were performed under AR guidance along the hanging plane. AR allowed precise and safe recognition of all the important vascular structures. Operative time was 360 min. AR display and fine registration were performed within 6 min. The postoperative course was uneventful. The pathology was positive for ampullary adenocarcinoma; the final stage was pT1N0 (0/43 retrieved lymph nodes) with clear surgical margins. AR is a valuable navigation tool that can enhance the ability to achieve a safe surgical resection during PD.
Park, Yu-Hyung; Lee, Chi-Ho; Lee, Byoung-Hee
2013-01-01
This study is a single-blind randomized controlled trial to determine the effect of virtual reality-based postural control training on gait ability in patients with chronic stroke. Sixteen subjects were randomly assigned to either the experimental group (VR, n = 8) or the control group (CPT, n = 8). Subjects in both groups received conventional physical therapy for 60 min per day, five days per week, during a period of four weeks. Subjects in the VR group received additional augmented reality-based training for 30 min per day, three days per week, during a period of four weeks. The subjects were evaluated one week before and after the four-week training and at a one-month post-training follow-up. Data derived from the gait analyses included spatiotemporal gait parameters and the 10-meter walking test (10MWT). In the gait parameters, subjects in the VR group showed significant improvement, except for cadence, at post-training and follow-up within the experimental group. However, no significant improvement was observed within the control group. In between-group comparisons, the experimental (VR) group showed significantly greater improvement only in stride length compared with the control group (P < 0.05); no significant difference was observed in the other gait parameters. In conclusion, we demonstrate significant improvement in gait ability in chronic stroke patients who received virtual reality-based postural control training. These findings suggest that virtual reality (VR) postural control training using real-time information may be a useful approach for enhancing gait ability in patients with chronic stroke.
ERIC Educational Resources Information Center
Gilbert, Kristen A.
2017-01-01
Aim/Purpose: Improving public schools is a focus of federal legislation in the United States with much of the burden placed on principals. However, preparing principals for this task has proven elusive despite many changes in programming by institutions of higher learning. Emerging technologies that rely on augmented and virtual realities are…
The Game "Pokemon Go" as a Crosscultural Phenomenon
ERIC Educational Resources Information Center
Koroleva, D. O.; Kochervey, A. I.; Nasonova, K. M.; Shibeko, Yu. V.
2016-01-01
The gaming culture of modern childhood takes place not only in real space, but in a virtual one too. These two components (two planes of development) are often viewed in isolation from one another. This study investigates the completely new phenomenon of augmented reality gaming through the lens of "Pokemon Go," and it describes how the…
Ingress in Geography: Portals to Academic Success?
ERIC Educational Resources Information Center
Davis, Michael
2017-01-01
Niantic Labs has developed an augmented virtual reality mobile app game called Ingress in which agents must seek out and control locations for their designated factions. The app uses the Google Maps interface along with GPS to enhance a geocaching-like experience with elements of other classical games such as capture-the-flag. This study aims to…
Learning Protein Structure with Peers in an AR-Enhanced Learning Environment
ERIC Educational Resources Information Center
Chen, Yu-Chien
2013-01-01
Augmented reality (AR) is an interactive system that allows users to interact with virtual objects and the real world at the same time. The purpose of this dissertation was to explore how AR, as a new visualization tool, that can demonstrate spatial relationships by representing three dimensional objects and animations, facilitates students to…
Analysis of Mental Workload in Online Shopping: Are Augmented and Virtual Reality Consistent?
Zhao, Xiaojun; Shi, Changxiu; You, Xuqun; Zong, Chenming
2017-01-01
A market research company (Nielsen) reported that consumers in the Asia-Pacific region have become the most active group in online shopping. Focusing on augmented reality (AR), one of three major techniques expected to change the way people shop in the future, this study used a mixed design to examine the influences of the online shopping method, user gender, cognitive style, product value, and sensory channel on mental workload in virtual reality (VR) and AR situations. The results showed that males' mental workloads were significantly higher than females'. For males, the mental workload for high-value products was significantly higher than that for low-value products. In the VR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference was reduced under audio-visual conditions. In the AR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, and the difference increased under audio-visual conditions. This study provides a psychological investigation of online shopping with AR and VR technology, with applications for the future. From the perspective of embodied cognition, AR online shopping may be a potential focus of research and market application. For the future design of online shopping platforms and the updating of the user experience, this study provides a reference. PMID:28184207
Fundamental arthroscopic skill differentiation with virtual reality simulation.
Rose, Kelsey; Pedowitz, Robert
2015-02-01
The purpose of this study was to investigate the use and validity of virtual reality modules as part of the educational approach to mastering arthroscopy in a safe environment by assessing their ability to distinguish between experience levels. Additionally, the study aimed to evaluate whether experts have greater ambidexterity than novices. Three virtual reality modules (Swemac/Augmented Reality Systems, Linkoping, Sweden) were created to test fundamental arthroscopic skills. Thirty participants (10 experts consisting of faculty, 10 intermediate participants consisting of orthopaedic residents, and 10 novices consisting of medical students) performed each exercise. Steady and Telescope was designed to train centering and image stability. Steady and Probe was designed to train basic triangulation. Track a Moving Target was designed to train coordinated motions of the arthroscope and probe. Metrics reflecting speed, accuracy, and efficiency of motion were used to measure construct validity. Steady and Probe and Track a Moving Target both exhibited construct validity, with better performance by experts and intermediate participants than by novices (P < .05), whereas Steady and Telescope did not show validity. There was an overall trend toward better ambidexterity as a function of greater surgical experience, with experts consistently more proficient than novices throughout all 3 modules. This study represents a new way to assess basic arthroscopy skills using virtual reality modules developed through task deconstruction. Participants with the most arthroscopic experience performed better and were more consistent than novices on all 3 virtual reality modules. Greater arthroscopic experience correlates with more symmetric ambidextrous performance. However, further adjustment of the modules may better simulate fundamental arthroscopic skills and discriminate between experience levels. Arthroscopy training is a critical element of orthopaedic surgery resident training. Developing techniques to safely and effectively train these skills is critical for patient safety and resident education. Copyright © 2015 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality.
Grubert, Jens; Langlotz, Tobias; Zollmann, Stefanie; Regenbrecht, Holger
2017-06-01
Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. While it has been researched for decades, Augmented Reality has more recently moved out of the research labs and into the field. While most applications are used sporadically and for one particular task only, current and future scenarios will provide a continuous and multi-purpose user experience. Therefore, in this paper, we present the concept of Pervasive Augmented Reality, aiming to provide such an experience by sensing the user's current context and adapting the AR system based on the changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality, which classifies context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges for future research directions in Pervasive Augmented Reality.
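To ground the idea of context sources driving context targets, here is a toy rule-based adapter: readings about the environment, the user, and the system (sources) are mapped to rendering and resource settings (targets). The context attributes, thresholds, and settings are invented for illustration and are not drawn from the paper's taxonomy.

```python
from dataclasses import dataclass

@dataclass
class Context:
    ambient_lux: float     # environment (context source)
    walking_speed: float   # human factor, m/s (context source)
    battery_pct: float     # system resource (context source)

@dataclass
class ARSettings:
    label_density: str     # what is rendered (context target)
    contrast_boost: float  # how it is rendered (context target)
    tracking_rate_hz: int  # how resources are spent (context target)

def adapt(ctx: Context) -> ARSettings:
    """Toy context-to-adaptation mapping in the spirit of Pervasive AR:
    context sources drive context targets via simple rules."""
    density = "low" if ctx.walking_speed > 1.2 else "high"   # declutter while walking
    contrast = 1.6 if ctx.ambient_lux > 10000 else 1.0       # boost in bright sunlight
    rate = 30 if ctx.battery_pct < 20 else 60                # throttle to save power
    return ARSettings(density, contrast, rate)

print(adapt(Context(ambient_lux=25000, walking_speed=1.5, battery_pct=15)))
```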
Marker Registration Technique for Handwritten Text Marker in Augmented Reality Applications
NASA Astrophysics Data System (ADS)
Thanaborvornwiwat, N.; Patanukhom, K.
2018-04-01
Marker registration is a fundamental process for estimating camera poses in marker-based Augmented Reality (AR) systems. We developed an AR system that places corresponding virtual objects on handwritten text markers. This paper presents a new registration method that is robust to low-content text markers, variation in camera poses, and variation in handwriting styles. The proposed method uses Maximally Stable Extremal Regions (MSER) and polygon simplification for feature point extraction. Experiments show that extracting only five feature points per image provides the best registration results. An exhaustive search is used to find the best matching pattern of feature points between two images. We also compared the performance of the proposed method with that of existing registration methods and found that it provides better accuracy and time efficiency.
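To make the feature-extraction stage above concrete, here is a minimal OpenCV sketch of the MSER-plus-polygon-simplification idea; it is an illustration under my own assumptions (file name, parameter values, point-selection heuristic), not the authors' implementation.

# Hedged sketch: MSER regions + polygon simplification as a way to obtain a
# handful of stable feature points from a low-content handwritten marker.
import cv2
import numpy as np

def extract_feature_points(gray, n_points=5, eps_ratio=0.02):
    """Detect stable regions with MSER, simplify their outlines, and
    return up to n_points corner-like feature points."""
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)
    candidates = []
    for pts in regions:
        hull = cv2.convexHull(pts.reshape(-1, 1, 2))
        eps = eps_ratio * cv2.arcLength(hull, True)
        poly = cv2.approxPolyDP(hull, eps, True)   # polygon simplification
        candidates.extend(poly.reshape(-1, 2).tolist())
    candidates = np.array(candidates, dtype=np.float32)
    if len(candidates) > n_points:
        # keep a small, spread-out subset (the paper reports five points suffice)
        idx = np.linspace(0, len(candidates) - 1, n_points).astype(int)
        candidates = candidates[idx]
    return candidates

if __name__ == "__main__":
    marker = cv2.imread("handwritten_marker.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
    print(extract_feature_points(marker))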
Augmented Reality in Neurosurgery: A Review of Current Concepts and Emerging Applications.
Guha, Daipayan; Alotaibi, Naif M; Nguyen, Nhu; Gupta, Shaurya; McFaul, Christopher; Yang, Victor X D
2017-05-01
Augmented reality (AR) superimposes computer-generated virtual objects onto the user's view of the real world. Among medical disciplines, neurosurgery has long been at the forefront of image-guided surgery, and it continues to push the frontiers of AR technology in the operating room. In this systematic review, we explore the history of AR in neurosurgery and examine the literature on current neurosurgical applications of AR. Significant challenges to surgical AR exist, including compounded sources of registration error, impaired depth perception, visual and tactile temporal asynchrony, and operator inattentional blindness. Nevertheless, the ability to accurately display multiple three-dimensional datasets congruently over the area where they are most useful, coupled with future advances in imaging, registration, display technology, and robotic actuation, portends a promising role for AR in the neurosurgical operating room.
Future Reality: How Emerging Technologies Will Change Language Itself.
Perlin, Ken
2016-01-01
Just as notebook computers once freed us to take our computers with us, smartphones freed us to walk around with computers in our pockets, and wearables will soon free us from needing to hold a screen at all. Today, as high-quality virtual and augmented reality begins to become available at consumer prices, the "screen" will soon be all around us. But the largest long-term impact here may not merely be one of form factor, but rather one of language itself. Once wearables become small enough, cheap enough, and therefore ubiquitous enough to be accepted as part of our everyday reality, our use of language will evolve in important ways.
Tang, Rui; Ma, Long-Fei; Rong, Zhi-Xia; Li, Mo-Dan; Zeng, Jian-Ping; Wang, Xue-Dong; Liao, Hong-En; Dong, Jia-Hong
2018-04-01
Augmented reality (AR) technology is used to reconstruct three-dimensional (3D) images of hepatic and biliary structures from computed tomography and magnetic resonance imaging data, and to superimpose the virtual images onto a view of the surgical field. In liver surgery, these superimposed virtual images help the surgeon to visualize intrahepatic structures and therefore to operate precisely and to improve clinical outcomes. The keywords "augmented reality", "liver", "laparoscopic" and "hepatectomy" were used for searching publications in the PubMed database. The primary sources were peer-reviewed journal articles published up to December 2016. Additional articles were identified by a manual search of the references found in the key articles. In general, AR technology mainly comprises 3D reconstruction, display, registration and tracking techniques, and has recently been adopted gradually for liver surgery, including laparoscopy and laparotomy, with video-based AR-assisted laparoscopic resection as the main technical application. By applying AR technology, blood vessels and tumor structures in the liver can be displayed during surgery, which permits precise navigation during complex surgical procedures. Liver deformation and registration errors during surgery were the main factors limiting the application of AR technology. With recent advances, AR technologies have the potential to improve hepatobiliary surgical procedures. However, additional clinical studies will be required to evaluate AR as a tool for reducing postoperative morbidity and mortality and for the improvement of long-term clinical outcomes. Future research is needed in the fusion of multiple imaging modalities, improving biomechanical liver modeling, and enhancing image data processing and tracking technologies to increase the accuracy of current AR methods. Copyright © 2018 First Affiliated Hospital, Zhejiang University School of Medicine in China. Published by Elsevier B.V. All rights reserved.
Virtual reality and the unfolding of higher dimensions
NASA Astrophysics Data System (ADS)
Aguilera, Julieta C.
2006-02-01
As virtual/augmented reality evolves, the need for spaces that are responsive to structures independent of three-dimensional spatial constraints becomes apparent. The visual medium of computer graphics may also challenge these self-imposed constraints. If one can get used to how projections affect 3D objects in two dimensions, it may also be possible to compose a situation in which to get used to the variations that occur while moving through higher dimensions. The presented application is an enveloping landscape of concave and convex forms, which are determined by the orientation and displacement of the user in relation to a grid made of tesseracts (cubes in four dimensions). The interface accepts input from three-dimensional and four-dimensional transformations, and smoothly displays such interactions in real time. The motion of the user becomes the graphic element, whereas the higher-dimensional grid references his/her position relative to it. The user learns how motion inputs affect the grid, recognizing a correlation between the input and the transformations. Mapping information to complex grids in virtual reality is valuable for engineers, artists and users in general because navigation can be internalized like a dance pattern, further engaging us to maneuver through space in order to know and experience it.
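As a rough illustration of the kind of mapping such an interface must perform, the sketch below rotates the 16 vertices of a tesseract in one 4-D plane and perspective-projects them into 3-D for display; this is a toy simplification of my own (vertex layout, camera distance, and rotation plane are assumed), not the exhibited application.

# Toy sketch: rotate a tesseract (4-D hypercube) in a chosen plane and
# project its vertices to 3-D, standing in for the grid-of-tesseracts interface.
import numpy as np
from itertools import product

# the 16 vertices of the unit tesseract, coordinates in {-1, +1}^4
VERTICES = np.array(list(product([-1.0, 1.0], repeat=4)))

def rotation_4d(axis_a, axis_b, angle):
    """Rotation in the plane spanned by two of the four axes (0..3)."""
    r = np.eye(4)
    c, s = np.cos(angle), np.sin(angle)
    r[axis_a, axis_a] = c; r[axis_a, axis_b] = -s
    r[axis_b, axis_a] = s; r[axis_b, axis_b] = c
    return r

def project_to_3d(points_4d, camera_w=3.0):
    """Perspective projection along the w axis: nearer in w appears larger."""
    scale = camera_w / (camera_w - points_4d[:, 3])
    return points_4d[:, :3] * scale[:, None]

if __name__ == "__main__":
    # e.g. user motion mapped to a rotation in the x-w plane
    rotated = VERTICES @ rotation_4d(0, 3, np.pi / 6).T
    print(project_to_3d(rotated)[:4])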
Harrison, C S; Grant, P M; Conway, B A
2010-01-01
The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions developed in the U.S. In order to raise awareness of these guidelines and in particular to give architects, building designers, and users a physical sensation of how a planned development could be experienced, a wheelchair virtual reality system was developed. This compares with conventional methods of measuring against drawings and comparing dimensions against building regulations, established in the U.K. under British standards. Features of this approach include the marriage of an electromechanical force-feedback system with high-quality immersive graphics as well as the potential ability to generate a physiological rating of buildings that do not yet exist. The provision of this sense of "feel" augments immersion within the virtual reality environment and also provides the basis from which both qualitative and quantitative measures of a building's access performance can be gained.
Virtual reality training in neurosurgery: Review of current status and future applications
Alaraj, Ali; Lemole, Michael G.; Finkle, Joshua H.; Yudkowsky, Rachel; Wallace, Adam; Luciano, Cristian; Banerjee, P. Pat; Rizzi, Silvio H.; Charbel, Fady T.
2011-01-01
Background: Over the years, surgical training has been changing, and years of tradition are being challenged by legal and ethical concerns for patient safety, work hour restrictions, and the cost of operating room time. Surgical simulation and skill training offer an opportunity to teach and practice advanced techniques before attempting them on patients. Simulation training can be as straightforward as using real instruments and video equipment to manipulate simulated “tissue” in a box trainer. More advanced virtual reality (VR) simulators are now available and ready for widespread use. Early systems have demonstrated their effectiveness and discriminative ability. Newer systems enable the development of comprehensive curricula and full procedural simulations. Methods: A PubMed review of the literature was performed for the MeSH terms “Virtual Reality”, “Augmented Reality”, “Simulation”, “Training”, and “Neurosurgery”. Relevant articles were retrieved and reviewed for the history and current status of VR simulation in neurosurgery. Results: Surgical organizations are calling for methods to ensure the maintenance of skills, advance surgical training, and credential surgeons as technically competent. The published literature discussing the application of VR simulation in neurosurgery training has evolved over the last decade from data visualization, including stereoscopic evaluation, to more complex augmented reality models. With the revolution in computational analysis capabilities, fully immersive VR models are currently available in neurosurgery training. Ventriculostomy catheter insertion and endoscopic and endovascular simulations are used in neurosurgical residency training centers across the world. Recent studies have shown the correlation of proficiency on these simulators with levels of experience in the real world. Conclusion: Fully immersive technology is starting to be applied to the practice of neurosurgery. In the near future, detailed VR neurosurgical modules will evolve to become an essential part of the curriculum for the training of neurosurgeons. PMID:21697968
Camera pose estimation for augmented reality in a small indoor dynamic scene
NASA Astrophysics Data System (ADS)
Frikha, Rawia; Ejbali, Ridha; Zaied, Mourad
2017-09-01
Camera pose estimation remains a challenging task for augmented reality (AR) applications. Simultaneous localization and mapping (SLAM)-based methods are able to estimate the six-degrees-of-freedom camera motion while constructing a map of an unknown environment. However, these methods do not provide any reference for where to insert virtual objects, since they have no information about the scene structure, and they may fail in cases of occlusion of three-dimensional (3-D) map points or dynamic objects. This paper presents a real-time monocular piecewise planar SLAM method using the planar scene assumption. Using planar structures in the mapping process allows, on the one hand, rendering virtual objects in a meaningful way and, on the other hand, improving the precision of the camera pose and the quality of the 3-D reconstruction of the environment by adding constraints on 3-D points and poses in the optimization process. We propose to exploit the rigid motion of the 3-D planes in the tracking process to enhance the system's robustness in the case of dynamic scenes. Experimental results show that using a constrained planar scene improves our system's accuracy and robustness compared with classical SLAM systems.
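To make the planar constraint concrete, the following OpenCV sketch recovers a camera pose from correspondences lying on a single known plane (z = 0 in the plane's frame); the intrinsics, point values, and choice of the IPPE solver are my assumptions, and this is not the authors' SLAM implementation, only the kind of per-plane pose estimate such a system can add as a constraint.

# Hedged sketch: camera pose from matches on a known planar patch.
import cv2
import numpy as np

def pose_from_plane(plane_pts_3d, image_pts_2d, K):
    """plane_pts_3d: Nx3 coplanar points (z = 0) in the plane's frame.
    image_pts_2d: Nx2 corresponding pixel coordinates. K: 3x3 intrinsics."""
    ok, rvec, tvec = cv2.solvePnP(
        plane_pts_3d.astype(np.float32), image_pts_2d.astype(np.float32),
        K, None, flags=cv2.SOLVEPNP_IPPE)   # IPPE is tailored to planar targets
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec

if __name__ == "__main__":
    K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])   # assumed intrinsics
    square = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]], dtype=float)
    pixels = np.array([[300, 220], [420, 230], [410, 350], [295, 340]], dtype=float)
    R, t = pose_from_plane(square, pixels, K)
    print(R, t.ravel())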
Augmented reality: past, present, future
NASA Astrophysics Data System (ADS)
Inzerillo, Laura
2013-03-01
A great opportunity has made it possible to carry out cultural, historical, architectural and social research of great impact for international cultural interest: the realization of a museum whose main theme is the visit and discovery of a monument of great prestige, the monumental building of the "Steri" in Palermo. The museum is divided into sub-themes, including one above all that has aroused such international interest that an application has been submitted to include the museum among UNESCO's cultural heritage: a museum path through the cells of the Inquisition, which are located inside some buildings of the monumental complex. The project as a whole is approached with an overall view that brings together the various competences involved: history, chemistry, architecture, topography, drawing, representation, virtual communication and informatics. The museum will be born as the sum of the results of all these disciplines. This research deals with methodology, implementation, fruition, the virtual museum, goals, 2D graphic restitution, effects on cultural heritage and the landscape environment, augmented reality, 2D and 3D surveying, hi-touch screens, photogrammetric survey, photographic survey, representation, 3D drawing and more.
Design and evaluation of an augmented reality simulator using leap motion.
Wright, Trinette; de Ribaupierre, Sandrine; Eagleson, Roy
2017-10-01
Advances in virtual and augmented reality (AR) are having an impact on the medical field in areas such as surgical simulation. Improvements to surgical simulation will provide students and residents with additional training and evaluation methods. This is particularly important for procedures such as the endoscopic third ventriculostomy (ETV), which residents perform regularly. Simulators such as NeuroTouch have been designed to aid in training associated with this procedure. The authors have designed an affordable and easily accessible ETV simulator and compare it with the existing NeuroTouch for its usability and training effectiveness. This simulator was developed using Unity, Vuforia and the Leap Motion (LM) for an AR environment. The participants, 16 novices and two expert neurosurgeons, were asked to complete 40 targeting tasks. Participants used the NeuroTouch tool or a virtual hand controlled by the LM to select the position and orientation for these tasks. The length of time to complete each task was recorded, and the trajectory log files were used to calculate performance. The resulting data on the novices' and experts' speed and accuracy are compared, and the objective training performance of each system is discussed in terms of targeting speed and accuracy.
Developing an Augmented Reality Environment for Earth Science Education
NASA Astrophysics Data System (ADS)
Pratt, M. J.; Skemer, P. A.; Arvidson, R. E.
2017-12-01
The emerging field of augmented reality (AR) provides new and exciting ways to explore geologic phenomena for research and education. The primary advantage of AR is that it allows users to physically explore complex three-dimensional structures that were previously inaccessible, for example a remote geologic outcrop or a mineral structure at the atomic scale. AR is used, for example, with the OnSight software during tactical operations to plan the Mars Curiosity rover's traverses, providing virtual views in which users can walk through the terrain and around the rover at true scale. This mode of physical exploration allows users more freedom to investigate and understand the 3D structure than is possible on a flat computer screen, or within a static PowerPoint presentation during a classroom lecture. The Microsoft HoloLens headset provides the most advanced mobile AR platform currently available to developers. The Fossett Laboratory for Virtual Planetary Exploration at Washington University in St. Louis has applied this technology, coupled with photogrammetric software and the Unity 3D gaming engine, to develop photorealistic environments of 3D geologic outcrops from around the world. The untethered HoloLens provides an ideal platform for a classroom setting as it allows for shared experiences of the holograms of interest, projecting them in the same location for all users to explore. Furthermore, the HoloLens allows for face-to-face communication during use, a feature that is important in teaching and that virtual reality does not allow. Our development of an AR application includes the design of an online database of photogrammetric outcrop models curated for the current limitations of AR technology. This database will be accessible both to those wishing to submit models and, free of charge, to those wishing to use the application for teaching, outreach or research purposes.
Vroom: designing an augmented environment for remote collaboration in digital cinema production
NASA Astrophysics Data System (ADS)
Margolis, Todd; Cornish, Tracy
2013-03-01
As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large format interactive display surfaces, video teleconferencing and spatialized audio built on a high-speed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production. This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.
White, Ian; Buchberg, Brian; Tsikitis, V Liana; Herzig, Daniel O; Vetto, John T; Lu, Kim C
2014-06-01
Colorectal cancer is the second most common cause of cancer death in the USA. The need for screening colonoscopies, and thus adequately trained endoscopists, particularly in rural areas, is on the rise. Recent increases in the endoscopic case numbers required for surgical resident graduation by the Surgery Residency Review Committee (RRC) further emphasize the need for more effective endoscopic training during residency. The aim of this study was to determine whether a virtual reality colonoscopy simulator enhances surgical resident endoscopic education by detecting improvement in colonoscopy skills before and after 6 weeks of formal clinical endoscopic training. We conducted a retrospective review of prospectively collected surgery resident data on an endoscopy simulator. Residents performed four different clinical scenarios on the endoscopic simulator before and after a 6-week endoscopic training course. Data were collected over a 5-year period from 94 different residents performing a total of 795 colonoscopic simulation scenarios. Main outcome measures included time to cecal intubation, "red out" time, and severity of simulated patient discomfort (mild, moderate, severe, extreme) during colonoscopy scenarios. Average time to intubation of the cecum was 6.8 min for residents who had not undergone endoscopic training versus 4.4 min for those who had (p < 0.001). For residents who could be compared against themselves (pre- vs. post-training), cecal intubation times decreased from 7.1 to 4.3 min (p < 0.001). Post-endoscopy rotation residents caused less severe discomfort during simulated colonoscopy than pre-endoscopy rotation residents (4 vs. 10%; p = 0.004). Virtual reality endoscopic simulation is an effective tool for both augmenting surgical resident endoscopy cancer education and measuring improvement in resident performance after formal clinical endoscopic training.
NASA Astrophysics Data System (ADS)
McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.
2017-12-01
Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world, with which researchers can interact. There are several limitations to purely VR or AR applications when taken within the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate the 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e. depth information. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video presented in real time into the virtual environment. Note the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.
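One plausible way to recover the depth that plain video overlays lack (my assumption of how such a blend could be driven, not necessarily the authors' pipeline) is to compute a disparity map from the robot's stereo pair and convert it to metric depth that the VR compositor can feed into its z-buffer:

# Illustrative sketch: metric depth from the stereo pair for occlusion-aware
# blending of live video into the virtual environment.
import cv2
import numpy as np

def depth_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                    blockSize=5, P1=8 * 5 * 5, P2=32 * 5 * 5)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]   # Z = f * B / d
    return depth

if __name__ == "__main__":
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)      # hypothetical frames
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    depth_m = depth_from_stereo(left, right, focal_px=700.0, baseline_m=0.12)
    # depth_m can be uploaded with the color frame so the engine's z-buffer
    # resolves occlusion between real and virtual content.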
Virtual reality and brain computer interface in neurorehabilitation
Dahdah, Marie; Driver, Simon; Parsons, Thomas D.; Richter, Kathleen M.
2016-01-01
The potential benefit of technology to enhance recovery after central nervous system injuries is an area of increasing interest and exploration. The primary emphasis to date has been motor recovery/augmentation and communication. This paper introduces two original studies to demonstrate how advanced technology may be integrated into subacute rehabilitation. The first study addresses the feasibility of brain computer interface with patients on an inpatient spinal cord injury unit. The second study explores the validity of two virtual environments with acquired brain injury as part of an intensive outpatient neurorehabilitation program. These preliminary studies support the feasibility of advanced technologies in the subacute stage of neurorehabilitation. These modalities were well tolerated by participants and could be incorporated into patients' inpatient and outpatient rehabilitation regimens without schedule disruptions. This paper expands the limited literature base regarding the use of advanced technologies in the early stages of recovery for neurorehabilitation populations and speaks favorably to the potential integration of brain computer interface and virtual reality technologies as part of a multidisciplinary treatment program. PMID:27034541
Towards cybernetic surgery: robotic and augmented reality-assisted liver segmentectomy.
Pessaux, Patrick; Diana, Michele; Soler, Luc; Piardi, Tullio; Mutter, Didier; Marescaux, Jacques
2015-04-01
Augmented reality (AR) in surgery consists of the fusion of synthetic computer-generated images (a 3D virtual model) obtained from the preoperative medical imaging workup with real-time patient images, in order to visualize unapparent anatomical details. The 3D model can also be used for preoperative planning of the procedure. The potential of AR navigation as a tool to improve the safety of surgical dissection is outlined for robotic hepatectomy. Three patients underwent a fully robotic and AR-assisted hepatic segmentectomy. The 3D virtual anatomical model was obtained from a thoracoabdominal CT scan using custom software (VR-RENDER®, IRCAD). The model was then processed using a VR-RENDER® plug-in application, the Virtual Surgical Planning (VSP®, IRCAD), to delineate surgical resection planes including the elective ligature of vascular structures. Deformations associated with pneumoperitoneum were also simulated. The virtual model was superimposed onto the operative field. A computer scientist manually registered virtual and real images in real time using a video mixer (MX 70; Panasonic, Secaucus, NJ). Two totally robotic AR-assisted segmentectomies of segment V and one segmentectomy of segment VI were performed. AR allowed precise and safe recognition of all major vascular structures during the procedure. The total time required to obtain AR was 8 min (range 6-10 min). Each registration (alignment of the vascular anatomy) required a few seconds. Hepatic pedicle clamping was never performed. At the end of the procedure, the remnant liver was correctly vascularized. Resection margins were negative in all cases. The postoperative period was uneventful, without perioperative transfusion. AR is a valuable navigation tool which may enhance the ability to achieve safe surgical resection during robotic hepatectomy.
Virtual Reality Cue Refusal Video Game for Alcohol and Cigarette Recovery Support: Summative Study.
Metcalf, Mary; Rossie, Karen; Stokes, Katie; Tallman, Christina; Tanner, Bradley
2018-04-16
New technologies such as virtual reality, augmented reality, and video games hold promise to support and enhance individuals in addiction treatment and recovery. Quitting or decreasing cigarette or alcohol use can lead to significant health improvements for individuals, decreasing heart disease risk and cancer risks (for both nicotine and alcohol use), among others. However, remaining in recovery from use is a significant challenge for most individuals. We developed and assessed the Take Control game, a partially immersive Kinect for Windows platform game that allows users to counter substance cues through active movements (hitting, kicking, etc). Formative analysis during phase I and phase II guided development. We conducted a small wait-list control trial using a quasi-random sampling technique (systematic) with 61 participants in recovery from addiction to alcohol or tobacco. Participants used the game 3 times and reported on substance use, cravings, satisfaction with the game experience, self-efficacy related to recovery, and side effects from exposure to a virtual reality intervention and substance cues. Participants found the game engaging and fun and felt playing the game would support recovery efforts. On average, reported substance use decreased for participants during the intervention period. Participants in recovery for alcohol use saw more benefit than those in recovery for tobacco use, with a statistically significant increase in self-efficacy, attitude, and behavior during the intervention. Side effects from the use of a virtual reality intervention were minor and decreased over time; cravings and side effects also decreased during the study. The preliminary results suggest the intervention holds promise as an adjunct to standard treatment for those in recovery, particularly from alcohol use. ©Mary Metcalf, Karen Rossie, Katie Stokes, Christina Tallman, Bradley Tanner. Originally published in JMIR Serious Games (http://games.jmir.org), 16.04.2018.
Ntourakis, Dimitrios; Memeo, Ricardo; Soler, Luc; Marescaux, Jacques; Mutter, Didier; Pessaux, Patrick
2016-02-01
Modern chemotherapy achieves such shrinking of colorectal cancer liver metastases (CRLM) that they may disappear from radiological imaging. Disappearing CRLM rarely represent a complete pathological remission and carry an important risk of recurrence. Augmented reality (AR) consists of the fusion of real-time patient images with a computer-generated 3D virtual patient model created from pre-operative medical imaging. The aim of this prospective pilot study was to investigate the potential of AR navigation as a tool to help locate and surgically resect missing CRLM. A 3D virtual anatomical model was created from thoracoabdominal CT scans using custom software (VR RENDER®, IRCAD). The virtual model was superimposed onto the operative field using an exoscope (VITOM®, Karl Storz, Tuttlingen, Germany). Virtual and real images were manually registered in real time using a video mixer, based on external anatomical landmarks, with an estimated accuracy of 5 mm. This modality was tested in three patients, with four missing CRLM 12 to 24 mm in size, undergoing laparotomy after pre-operative oxaliplatin-based chemotherapy. AR display and fine registration were performed within 6 min. AR helped detect all four missing CRLM and guided their resection. In all cases the planned security margin of 1 cm was clear and resections were confirmed to be R0 by pathology. There was no postoperative major morbidity or mortality. No local recurrence occurred during the follow-up period of 6-22 months. This initial experience suggests that AR may be a helpful navigation tool for the resection of missing CRLM.
Tools virtualization for command and control systems
NASA Astrophysics Data System (ADS)
Piszczek, Marek; Maciejewski, Marcin; Pomianek, Mateusz; Szustakowski, Mieczysław
2017-10-01
Information management is an inseparable part of the command process. As a result, the person making decisions at the command post interacts with data-providing devices in various ways. A tools virtualization process can introduce a number of significant modifications into the design of solutions for management and command. The general idea involves replacing the user interfaces of physical devices with their digital representations (so-called virtual instruments). A more advanced level of system "digitalization" is to use mixed reality environments. In solutions using augmented reality (AR), a customized HMI is displayed to the operator when he approaches each device. Devices are identified by image recognition of photo codes. Visualization is achieved by an (optical) see-through head-mounted display (HMD). Control can be performed, for example, by means of a handheld touch panel. Using an immersive virtual environment, the command center can be digitally reconstructed; a workstation then requires only a VR system (HMD) and access to the information network. The operator can interact with devices much as he would in the real world (for example, with virtual hands). Thanks to their processing techniques (analysis of central vision, eye tracking), MR systems offer another useful feature of reducing requirements for system data throughput, because at any given moment the system can focus on a single device. Experiments carried out using the Moverio BT-200 and SteamVR systems, and the results of experimental application testing, clearly indicate the ability to create a fully functional information system with the use of mixed reality technology.
Jedi training: playful evaluation of head-mounted augmented reality display systems
NASA Astrophysics Data System (ADS)
Ozbek, Christopher S.; Giesler, Bjorn; Dillmann, Ruediger
2004-05-01
A fundamental decision in building augmented reality (AR) systems is how to accomplish the combining of the real and virtual worlds. Nowadays this key question boils down to two alternatives: video see-through (VST) vs. optical see-through (OST). Both systems have advantages and disadvantages in areas such as production simplicity, resolution, flexibility in composition strategies, field of view, etc. To provide additional decision criteria for high-dexterity, high-accuracy tasks and subjective user acceptance, a gaming environment was programmed that allowed good evaluation of hand-eye coordination and that was inspired by the Star Wars movies. In an evaluation session with more than thirty participants, a preference was found for optical see-through glasses in conjunction with infrared tracking. In particular, the high computational demand of video capture and processing, and the resulting drop in frame rate, emerged as a key weakness of the VST system.
NASA Astrophysics Data System (ADS)
Yamauchi, Makoto; Iwamoto, Kazuyo
2010-05-01
Line heating is a skilled task in shipbuilding to shape the outer plates of ship hulls. Real-time information on the deformation of the plates during the task would be helpful to workers performing this process. Therefore, we herein propose an interactive scheme for supporting workers performing line heating; the system provides such information through an optical shape measurement instrument combined with an augmented reality (AR) system. The instrument was designed and fabricated so that the measured data were represented using coordinates based on fiducial markers. Since the markers were simultaneously used in the AR system for the purpose of positioning, the data could then be displayed to the workers through a head-mounted display as a virtual image overlaid on the plates. Feedback of the shape measurement results was thus performed in real time using the proposed system.
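As an illustration of how fiducial markers can serve double duty for measurement coordinates and AR positioning, here is a hedged OpenCV sketch using ArUco tags as a stand-in (the paper does not specify the marker type; the marker size, dictionary, and helper names are my assumptions):

# Hedged sketch: detect a fiducial marker, recover its pose, and re-express
# camera-frame measurements in the marker's coordinate frame.
import cv2
import numpy as np

def marker_pose(gray, K, dist, marker_len_m=0.05):
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return None
    half = marker_len_m / 2.0
    # marker corners in its own frame (z = 0), ordered to match the detector
    obj = np.array([[-half, half, 0], [half, half, 0],
                    [half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, corners[0].reshape(4, 2), K, dist,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec   # camera-from-marker transform

def to_marker_frame(points_cam, R, t):
    """Express camera-frame measurement points in marker coordinates."""
    return (R.T @ (points_cam.T - t.reshape(3, 1))).T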
Multithreaded hybrid feature tracking for markerless augmented reality.
Lee, Taehee; Höllerer, Tobias
2009-01-01
We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.
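A minimal sketch of the hybrid, multithreaded idea, assuming ORB as a stand-in for the distinctive features and a single background detection thread (the paper's exact detector and thread layout may differ):

# Hedged sketch: frame-to-frame optical-flow tracking on the main loop with a
# worker thread that periodically detects fresh distinctive features.
import cv2
import threading
import queue

detect_in, detect_out = queue.Queue(maxsize=1), queue.Queue(maxsize=1)

def detector_worker():
    orb = cv2.ORB_create(nfeatures=500)            # stand-in invariant detector
    while True:
        gray = detect_in.get()
        kps = orb.detect(gray, None)
        pts = cv2.KeyPoint_convert(kps).reshape(-1, 1, 2).astype("float32")
        detect_out.put((gray, pts))

def run(capture):
    threading.Thread(target=detector_worker, daemon=True).start()
    ok, frame = capture.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detect_in.put(prev_gray)
    prev_gray, prev_pts = detect_out.get()         # initial feature set
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # frame-to-frame tracking by pyramidal Lucas-Kanade optical flow
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
        prev_pts = nxt[status.ravel() == 1].reshape(-1, 1, 2)
        prev_gray = gray
        if len(prev_pts) < 100 and detect_in.empty():
            detect_in.put(gray)                    # request fresh detections
        if not detect_out.empty():
            prev_gray, prev_pts = detect_out.get() # swap in the refreshed set

if __name__ == "__main__":
    run(cv2.VideoCapture(0))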
Inexpensive Monocular Pico-Projector-based Augmented Reality Display for Surgical Microscope
Shi, Chen; Becker, Brian C.; Riviere, Cameron N.
2013-01-01
This paper describes an inexpensive pico-projector-based augmented reality (AR) display for a surgical microscope. The system is designed for use with Micron, an active handheld surgical tool that cancels hand tremor of surgeons to improve microsurgical accuracy. Using the AR display, virtual cues can be injected into the microscope view to track the movement of the tip of Micron, show the desired position, and indicate the position error. Cues can be used to maintain high performance by helping the surgeon to avoid drifting out of the workspace of the instrument. Also, boundary information such as the view range of the cameras that record surgical procedures can be displayed to tell surgeons the operation area. Furthermore, numerical, textual, or graphical information can be displayed, showing such things as tool tip depth in the work space and on/off status of the canceling function of Micron. PMID:25264542
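For a sense of what such cue injection involves, a toy sketch is shown below; the drawing layout and names are illustrative assumptions, not Micron's actual overlay software.

# Toy sketch: paint the tracked tip, the desired position, the position error,
# and a workspace boundary onto the frame fed to the projector overlay.
import cv2
import numpy as np

def draw_cues(frame, tip_px, target_px, workspace_center, workspace_radius_px):
    cv2.circle(frame, tuple(tip_px), 4, (0, 255, 0), -1)        # tracked tool tip
    cv2.circle(frame, tuple(target_px), 6, (0, 0, 255), 2)      # desired position
    cv2.line(frame, tuple(tip_px), tuple(target_px), (255, 0, 0), 1)
    err = float(np.linalg.norm(np.subtract(tip_px, target_px)))
    cv2.putText(frame, "error: %.1f px" % err, (10, 25),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 1)
    # boundary cue to help the surgeon stay inside the instrument's workspace
    cv2.circle(frame, tuple(workspace_center), workspace_radius_px, (0, 255, 255), 1)
    return frame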
Rao, Jinmeng; Qiao, Yanjun; Ren, Fu; Wang, Junxing; Du, Qingyun
2017-01-01
The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device’s built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction. PMID:28837096
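To illustrate the geospatial half of such a pipeline, the sketch below converts a point of interest's latitude/longitude into a local east-north offset from the device's GPS fix and then into the bearing and range used to place its virtual annotation; it is a flat-earth approximation of my own, with illustrative coordinates, not the authors' code.

# Hedged sketch: place a geo-referenced virtual object relative to the device.
import math

EARTH_RADIUS_M = 6378137.0

def enu_offset(device_lat, device_lon, poi_lat, poi_lon):
    """Approximate east/north offset (metres) of the point of interest."""
    d_lat = math.radians(poi_lat - device_lat)
    d_lon = math.radians(poi_lon - device_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(device_lat))
    return east, north

def bearing_and_range(east, north):
    """Compass bearing (degrees clockwise from north) and ground distance."""
    return math.degrees(math.atan2(east, north)) % 360.0, math.hypot(east, north)

if __name__ == "__main__":
    e, n = enu_offset(30.5400, 114.3600, 30.5415, 114.3622)   # illustrative fix
    print(bearing_and_range(e, n))
    # Subtracting the magnetometer/IMU heading from this bearing gives the
    # direction in the camera view where the virtual object should be drawn.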
NASA Astrophysics Data System (ADS)
Aubert, A. H.; Schnepel, O.; Kraft, P.; Houska, T.; Plesca, I.; Orlowski, N.; Breuer, L.
2015-11-01
This paper addresses education and communication in hydrology and the geosciences. Many approaches can be used, such as the well-known seminars, modelling exercises and practical field work, but outdoor learning in our discipline is a must, and this paper focuses on the recent development of a new outdoor learning tool at the landscape scale. To facilitate improved teaching and hands-on experience, we designed the Studienlandschaft Schwingbachtal. Already equipped with field instrumentation, education trails, and geocaches, the site has now been extended with an augmented reality App that adds virtual teaching objects to the real landscape. The App development is detailed so that it can serve as a methodology for those wishing to implement such a tool. The resulting application, namely the Schwingbachtal App, is described as an example. We conclude that such an App is useful for communication and education purposes, making learning pleasant and offering personalized options.
ERIC Educational Resources Information Center
Squires, David R.
2017-01-01
The structure of the literature review features the current trajectory of Augmented Reality in the field including the current literature detailing how Augmented Reality has been applied in educational environments; how Augmented Reality has been applied in training environments; how Augmented Reality has been used to measure cognition and the…
Human computer confluence applied in healthcare and rehabilitation.
Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen
2012-01-01
Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual and augmented reality, and it offers great potential for applications in medicine and rehabilitation.
ERIC Educational Resources Information Center
Borrero, A. Mejias; Marquez, J. M. Andujar
2012-01-01
Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real…
ERIC Educational Resources Information Center
Chatham-Carpenter, April
2017-01-01
As students begin using more virtual and augmented reality tools for education and entertainment, they may come to expect instructors to use similar types of technology in their teaching (Kelly, 2016). Instructors may choose to employ this type of technology in the future, to create more of a real-life feel in the online classroom. The effects of…
NASA Astrophysics Data System (ADS)
Mejías Borrero, A.; Andújar Márquez, J. M.
2012-10-01
Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real world, which can generate specific problems in laboratory classes. Current proposals for virtual labs (VL) and remote labs (RL) neither cover the new needs properly nor contribute remarkable improvements over traditional labs, except that they favor distance training. Therefore, online teaching and learning in lab practices demand a further step beyond current VL and RL. This paper proposes a new reality and new teaching/learning concepts in the field of lab practices in engineering. The developed augmented reality-based lab system (augmented remote lab, ARL) enables teachers and students to work remotely (Internet/intranet) in current CL, including virtual elements which interact with real ones. An educational experience was conducted to assess the developed ARL with the participation of a group of 10 teachers and another group of 20 students. Both groups completed lab practices covering content from the subjects Digital Systems and Robotics and Industrial Automation, which belong to the second year of the new degree in Electronic Engineering (adapted to the European Higher Education Area). The labs were carried out in three different ways: CL, VL and ARL. After completion, both groups were asked to fill in questionnaires aimed at measuring the improvement contributed by ARL relative to CL and VL. Except on some specific questions, the opinions of teachers and students were rather similar and positive regarding the use and possibilities of ARL. Although the results are still preliminary and need further study, they suggest that ARL remarkably improves on the possibilities of current VL and RL. Furthermore, ARL appears to offer more possibilities when used online than traditional laboratory lessons completed in CL.
Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei
2016-01-01
Objective To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. Materials and Methods In this study we performed a head phantom and cadaver experiment to determine the display effect and accuracy of our navigational system. In cadaver head-based simulated operations, we compared the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system with those of conventional navigation systems. Results The navigation system developed in this study has a novel display mode capable of fusing endoscopic images with three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which meets the accepted standards for a navigation system used in nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. Conclusion The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigational system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigational system has the potential to increase the quality of surgeries. In addition, the augmented reality navigational system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon’s skills and knowledge, not as a substitute. PMID:26757365
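As a small illustration of how a target registration error such as the 1.28 ± 0.45 mm figure is typically computed (a generic sketch of mine, not the authors' protocol), one applies the estimated registration to target points and measures the residual distance to their ground-truth positions:

# Hedged sketch: per-target registration error after applying a rigid transform.
import numpy as np

def target_registration_error(R, t, targets_image, targets_patient):
    """R (3x3), t (3,): estimated registration from image to patient space.
    targets_*: Nx3 corresponding target points. Returns per-target error."""
    mapped = targets_image @ R.T + t
    return np.linalg.norm(mapped - targets_patient, axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    R, t = np.eye(3), np.zeros(3)                       # assumed perfect transform
    truth = rng.uniform(0.0, 100.0, size=(6, 3))
    measured = truth + rng.normal(scale=0.8, size=truth.shape)  # localization noise
    tre = target_registration_error(R, t, measured, truth)
    print("TRE %.2f +/- %.2f mm" % (tre.mean(), tre.std()))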
Anesthesiology training using 3D imaging and virtual reality
NASA Astrophysics Data System (ADS)
Blezek, Daniel J.; Robb, Richard A.; Camp, Jon J.; Nauss, Lee A.
1996-04-01
Current training for regional nerve block procedures by anesthesiology residents requires expert supervision and the use of cadavers, both of which are relatively expensive commodities in today's cost-conscious medical environment. We are developing methods to augment and eventually replace these training procedures with real-time and realistic computer visualizations and manipulations of the anatomical structures involved in anesthesiology procedures, such as nerve plexus injections (e.g., celiac blocks). The initial work is focused on visualizations: both static images and rotational renderings. From the initial results, a coherent paradigm for virtual patient and scene representation will be developed.
Feasibility of Virtual Reality Augmented Cycling for Health Promotion of People Post-Stroke
Deutsch, Judith E; Myslinski, Mary Jane; Kafri, Michal; Ranky, Richard; Sivak, Mark; Mavroidis, Constantinos; Lewis, Jeffrey A
2013-01-01
Background and Purpose A virtual reality (VR) augmented cycling kit (VRACK) was developed to address motor control and fitness deficits of individuals with chronic stroke. In this paper we report on the safety, feasibility and efficacy of using the VRACK to train cardio-respiratory (CR) fitness of individuals in the chronic phase poststroke. Methods Four individuals with chronic stroke (47–65 years old and three or more years post-stroke), with residual lower extremity impairments (Fugl Meyer 24–26/34) who were limited community ambulators (gait speed range 0.56 to 1.1 m/s) participated in this study. Safety was defined as the absence of adverse events. Feasibility was measured using attendance, total exercise time, and “involvement” measured with the Presence Questionnaire (PQ). Efficacy of CR fitness was evaluated using a sub-maximal bicycle ergometer test before and after an 8-week training program. Results The intervention was safe and feasible with participants having 1 adverse event, 100% adherence, achieving between 90 and 125 minutes of cycling each week and a mean PQ score of 39 (SD 3.3). There was a statistically significant 13% (p = 0.035) improvement in peak VO2 with a range of 6–24.5 %. Discussion and Conclusion For these individuals post-stroke, VR augmented cycling, using their heart rate to set their avatar’s speed, fostered training of sufficient duration and intensity to promote CR fitness. In addition, there was a transfer of training from the bicycle to walking endurance. VR augmented cycling may be an addition to the therapist’s tools for concurrent training of mobility and health promotion of individuals post-stroke. Video Abstract available (see Video, Supplemental Digital Content 1) for more insights from the authors. PMID:23863828
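The discussion notes that the rider's heart rate set the avatar's speed; a toy mapping of that idea (an assumed linear control law, not the VRACK's actual one) might look like:

# Toy sketch: drive avatar speed from heart-rate reserve so training stays in
# a prescribed cardio-respiratory zone (parameters are illustrative).
def avatar_speed(heart_rate, hr_rest=70.0, hr_max=160.0,
                 min_speed=1.0, max_speed=8.0):
    """Map heart-rate reserve utilisation linearly onto avatar speed (m/s)."""
    reserve = (heart_rate - hr_rest) / (hr_max - hr_rest)
    reserve = max(0.0, min(1.0, reserve))
    return min_speed + reserve * (max_speed - min_speed)

print(avatar_speed(120))   # a mid-zone heart rate maps to a mid-range speed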
Mixed reality ultrasound guidance system: a case study in system development and a cautionary tale.
Ameri, Golafsoun; Baxter, John S H; Bainbridge, Daniel; Peters, Terry M; Chen, Elvis C S
2018-04-01
Real-time ultrasound has become a crucial aspect of several image-guided interventions. One of the main constraints of such an approach is the difficulty in interpretability of the limited field of view of the image, a problem that has recently been addressed using mixed reality, such as augmented reality and augmented virtuality. The growing popularity and maturity of mixed reality has led to a series of informal guidelines to direct development of new systems and to facilitate regulatory approval. However, the goals of mixed reality image guidance systems and the guidelines for their development have not been thoroughly discussed. The purpose of this paper is to identify and critically examine development guidelines in the context of a mixed reality ultrasound guidance system through a case study. A mixed reality ultrasound guidance system tailored to central line insertions was developed in close collaboration with an expert user. This system outperformed ultrasound-only guidance in a novice user study and has obtained clearance for clinical use in humans. A phantom study with 25 experienced physicians was carried out to compare the performance of the mixed reality ultrasound system against conventional ultrasound-only guidance. Despite the previous promising results, there was no statistically significant difference between the two systems. Guidelines for developing mixed reality image guidance systems cannot be applied indiscriminately. Each design decision, no matter how well justified, should be the subject of scientific and technical investigation. Iterative and small-scale evaluation can readily unearth issues and previously unknown or implicit system requirements. We recommend a wary eye in development of mixed reality ultrasound image guidance systems emphasizing small-scale iterative evaluation alongside system development. Ultimately, we recommend that the image-guided intervention community furthers and deepens this discussion into best practices in developing image-guided interventions.
Registration of an on-axis see-through head-mounted display and camera system
NASA Astrophysics Data System (ADS)
Luo, Gang; Rensing, Noa M.; Weststrate, Evan; Peli, Eli
2005-02-01
An optical see-through head-mounted display (HMD) system integrating a miniature camera aligned with the user's pupil is developed and tested. Such an HMD system has potential value in many augmented reality applications, in which registration of the virtual display to the real scene is one of the critical aspects. The camera's alignment with the user's pupil results in a simple yet accurate calibration and low registration error across a wide range of depths. In practice, a small camera-eye misalignment may still occur in such a system due to the inevitable variations of HMD wearing position with respect to the eye. The effects of such errors are measured. Calculation further shows that the registration error as a function of viewing distance behaves nearly the same for different virtual image distances, except for a shift. The impact of the prismatic effect of the display lens on registration is also discussed.
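The observation that the error-versus-distance curves differ only by a shift follows from simple parallax geometry. As a hedged illustration (this model is ours, not stated in the abstract), let d be the residual lateral camera-eye offset, D the distance of the real object, and D_v the distance at which the virtual image is rendered; under a small-angle approximation the angular registration error is

```latex
\varepsilon(D) \;\approx\; \frac{d}{D} - \frac{d}{D_v}
             \;=\; d\left(\frac{1}{D} - \frac{1}{D_v}\right).
```

For a fixed offset d, changing D_v only adds the constant term -d/D_v, so the curves for different virtual image distances share the same shape and differ by a vertical shift, consistent with the behaviour reported above.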
The use of virtual reality-based therapy to augment poststroke upper limb recovery.
Samuel, Geoffrey S; Choo, Min; Chan, Wai Yin; Kok, Stanley; Ng, Yee Sien
2015-07-01
Stroke remains one of the major causes of disability worldwide. This case report illustrates the complementary use of biomechanical and kinematic in-game markers, in addition to standard clinical outcomes, to comprehensively assess and track a patient's disabilities. A 65-year-old patient was admitted for right-sided weakness and clinically diagnosed with acute ischaemic stroke. She participated in a short trial of standard stroke occupational therapy and physiotherapy with additional daily virtual reality (VR)-based therapy. Outcomes were tracked using kinematic data and conventional clinical assessments. Her Functional Independence Measure score improved from 87 to 113 and Fugl-Meyer motor score improved from 56 to 62, denoting clinically significant improvement. Corresponding kinematic analysis revealed improved hand path ratios and a decrease in velocity peaks. Further research is being undertaken to elucidate the optimal type, timing, setting and duration of VR-based therapy, as well as the use of neuropharmacological adjuncts.
NASA Astrophysics Data System (ADS)
Tadokoro, Satoshi; Kitano, Hiroaki; Takahashi, Tomoichi; Noda, Itsuki; Matsubara, Hitoshi; Shinjoh, Atsushi; Koto, Tetsuo; Takeuchi, Ikuo; Takahashi, Hironao; Matsuno, Fumitoshi; Hatayama, Mitsunori; Nobe, Jun; Shimada, Susumu
2000-07-01
This paper introduces the RoboCup-Rescue Simulation Project, a contribution to the disaster mitigation, search, and rescue problem. A comprehensive urban disaster simulator is constructed on distributed computers. Heterogeneous intelligent agents such as fire fighters, victims and volunteers conduct search and rescue activities in this virtual disaster world. A real-world interface integrates the various sensor systems and infrastructure controllers of real cities with the simulation. Real-time simulation is synchronized with actual disasters, computing the complex relationships between various damage factors and agent behaviors. A mission-critical man-machine interface provides portability and robustness for disaster mitigation centers, and augmented-reality interfaces for rescue in real disasters. It also provides a virtual-reality training function for the public. This diverse spectrum of RoboCup-Rescue contributes to the creation of a safer social system.
Kuriakose, Selvia; Lahiri, Uttama
2015-07-01
Individuals with autism are characterized by deficits in socialization and communication. In recent years several assistive technologies, e.g., Virtual Reality (VR), have been investigated to address the socialization deficits of these individuals. Presently available VR-based systems address various aspects of social communication in an isolated manner and without monitoring one's affective state, such as anxiety. In conventional observation-based therapy, however, a therapist adjusts the intervention paradigm by monitoring the individual's anxiety level. But these individuals often have an inherent inability to explicitly express their anxiety, which limits conventional techniques. Physiological signals, being continuously available and not directly impacted by these communication difficulties, can alternatively be used as markers of one's anxiety level. In our research we aim at designing a Virtual-reality bAsed Social-communication Task (VAST) system that can address the various aspects of social communication, e.g., social context, subtle social cues, emotional expression, etc., in a cumulative and structured way. In addition, we augment this with a capability to use one's physiological signals as markers of one's anxiety level. In a preliminary feasibility study we investigate the potential of VAST to cause variations in one's performance and anxiety level that can be mapped from one's physiological indices.
Emerging Utility of Virtual Reality as a Multidisciplinary Tool in Clinical Medicine.
Pourmand, Ali; Davis, Steven; Lee, Danny; Barber, Scott; Sikka, Neal
2017-10-01
Among the more recent products borne of the evolution of digital technology, virtual reality (VR) is gaining a foothold in clinical medicine as an adjunct to traditional therapies. Early studies suggest a growing role for VR applications in pain management, clinical skills training, cognitive assessment and cognitive therapy, and physical rehabilitation. To complete a review of the literature, we searched PubMed and MEDLINE databases with the following search terms: "virtual reality," "procedural medicine," "oncology," "physical therapy," and "burn." We further limited our search to publications in the English language. Boolean operators were used to combine search terms. The included search terms yielded 97 potential articles, of which 45 were identified as meeting study criteria, and are included in this review. These articles provide data that strongly support the hypothesis that VR simulations can enhance pain management (by reducing patient perception of pain and anxiety), can augment clinical training curricula and physical rehabilitation protocols (through immersive audiovisual environments), and can improve clinical assessment of cognitive function (through improved ecological validity). Through computer-generated, life-like digital landscapes, VR stands to change the current approach to pain management, medical training, neurocognitive diagnosis, and physical rehabilitation. Additional studies are needed to help define best practices in VR utilization, and to explore new therapeutic uses for VR in clinical practice.
A 3-D mixed-reality system for stereoscopic visualization of medical dataset.
Ferrari, Vincenzo; Megali, Giuseppe; Troia, Elena; Pietrabissa, Andrea; Mosca, Franco
2009-11-01
We developed a simple, light, and cheap 3-D visualization device based on mixed reality that can be used by physicians to see preoperative radiological exams in a natural way. The system allows the user to see stereoscopic "augmented images," which are created by mixing 3-D virtual models of anatomy, obtained by processing preoperative volumetric radiological images (computed tomography or MRI), with live images of the real patient grabbed by means of cameras. The interface of the system consists of a head-mounted display equipped with two high-definition cameras. The cameras are mounted in correspondence with the user's eyes and grab live images of the patient from the same point of view as the user. The system does not use any external tracker to detect movements of the user or the patient. The movements of the user's head and the alignment of the virtual patient with the real one are determined using machine vision methods applied to pairs of live images. Experimental results, concerning frame rate and alignment precision between virtual and real patient, demonstrate that the machine vision methods used for localization are appropriate for the specific application and that systems based on stereoscopic mixed reality are feasible and can be proficiently adopted in clinical practice.
Mixed reality ventriculostomy simulation: experience in neurosurgical residency.
Hooten, Kristopher G; Lister, J Richard; Lombard, Gwen; Lizdas, David E; Lampotang, Samsun; Rajon, Didier A; Bova, Frank; Murad, Gregory J A
2014-12-01
Medicine and surgery are turning toward simulation to improve on limited patient interaction during residency training. Many simulators today use virtual reality with augmented haptic feedback and little to no physical elements. In a collaborative effort, the University of Florida Department of Neurosurgery and the Center for Safety, Simulation & Advanced Learning Technologies created a novel "mixed" physical and virtual simulator to mimic the ventriculostomy procedure. The simulator contains all the physical components encountered in the procedure, with superimposed 3-D virtual elements for the neuroanatomical structures. To introduce the ventriculostomy simulator and its validation as a necessary training tool in neurosurgical residency. We tested the simulator with more than 260 residents. An algorithm combining time and accuracy was used to grade performance. Voluntary postperformance surveys were used to evaluate the experience. Results demonstrate that more experienced residents had statistically significantly better scores and completed the procedure in less time than inexperienced residents. Survey results revealed that most residents agreed that practice on the simulator would help with future ventriculostomies. This mixed reality simulator provides a real-life experience and will be an instrumental tool in training the next generation of neurosurgeons. We have now implemented a standard where incoming residents must prove efficiency and skill on the simulator before their first interaction with a patient.
Raque, Jessica; Goble, Adam; Jones, Veronica M; Waldman, Lindsey E; Sutton, Erica
2015-07-01
With the introduction of Fundamentals of Endoscopic Surgery, training methods in flexible endoscopy are being augmented with simulation-based curricula. The investment required for virtual reality simulators warrants further research into their training advantage. Trainees were randomized into bedside or simulator training groups (BED vs SIM). SIM participated in a proficiency-based virtual reality curriculum. Trainees' endoscopic skills were rated using the Global Assessment of Gastrointestinal Endoscopic Skills (GAGES) in the patient care setting. The number of cases to reach 90 per cent of the maximum GAGES score and the calculated costs of training were compared. Nineteen residents participated in the study. There was no difference in the average number of cases required to achieve 90 per cent of the maximum GAGES score for esophagogastroduodenoscopy, 13 (SIM) versus 11 (BED) (P = 0.63), or colonoscopy, 21 (SIM) versus 4 (BED) (P = 0.34). The average per-case cost of training for esophagogastroduodenoscopy was $35.98 (SIM) versus $39.71 (BED) (P = 0.50), not including the depreciation costs associated with the simulator ($715.00 per resident over six years). Use of a simulator appeared to increase the cost of training without accelerating the learning curve or decreasing faculty time spent in instruction. The importance of simulation in endoscopy training will be predicated on more cost-effective simulators.
Linte, Cristian A; White, James; Eagleson, Roy; Guiraudon, Gérard M; Peters, Terry M
2010-01-01
Virtual and augmented reality environments have been adopted in medicine as a means to enhance the clinician's view of the anatomy and facilitate the performance of minimally invasive procedures. Their value is truly appreciated during interventions where the surgeon cannot directly visualize the targets to be treated, such as during cardiac procedures performed on the beating heart. These environments must accurately represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical tracking, and visualization technology in a common framework centered around the patient. This review begins with an overview of minimally invasive cardiac interventions, describes the architecture of a typical surgical guidance platform including imaging, tracking, registration and visualization, highlights both clinical and engineering accuracy limitations in cardiac image guidance, and discusses the translation of the work from the laboratory into the operating room together with typically encountered challenges.
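One building block named above, registration between pre-operative images and intra-operative tracking, is commonly done with paired-point rigid registration. The sketch below is a generic illustration of that step, not code from any of the reviewed platforms; it assumes corresponding fiducial points have already been localized in both coordinate systems and uses the standard SVD-based (Kabsch/Horn-style) solution.

```python
import numpy as np

def rigid_registration(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||.

    src, dst: (N, 3) arrays of corresponding fiducial points
    (e.g., image-space and tracker-space coordinates).
    """
    src_c = src - src.mean(axis=0)          # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical usage: four fiducials digitized in image space and tracker space.
image_pts = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], dtype=float)
tracker_pts = (image_pts @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T) + [10.0, 20.0, 5.0]
R, t = rigid_registration(image_pts, tracker_pts)
fre = np.linalg.norm((image_pts @ R.T + t) - tracker_pts, axis=1).mean()
print("Mean fiducial registration error (mm):", fre)
```

The mean residual printed here is one common form of the fiducial registration error that such platforms report when quoting engineering accuracy.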
The Perception and Estimation of Egocentric Distance in Real and Augmented Reality Environments
2008-05-01
...with augmented reality technology that are essential for determining the usefulness of current augmented reality (AR) for training and performance...determine perceived distance. Subject terms: Augmented Environments, Augmented Reality, Dismounted Infantry, Training, Presence, Distance Perception
Embedded Augmented Reality Training System for Dynamic Human-Robot Cooperation
2009-10-01
through (OST) head-mounted displays (HMDs) still lack in usability and ergonomics because of their size, weight, resolution, and the hard-to-realize...with addressable focal planes [10], for example. Accurate and easy-to-use calibration routines for OST HMDs remain a challenging task; established...methods are based on matching of virtual over real objects [11]; newer approaches use cameras looking directly through the HMD optics to exploit both
Towards surgeon-authored VR training: the scene-development cycle.
Dindar, Saleh; Nguyen, Thien; Peters, Jörg
2016-01-01
Enabling surgeon-educators to themselves create virtual reality (VR) training units promises greater variety, specialization, and relevance of the units. This paper describes a software bridge that semi-automates the scene-generation cycle, a key bottleneck in authoring, modeling, and developing VR units. Augmenting an open source modeling environment with physical behavior attachment and collision specifications yields single-click testing of the full force-feedback enabled anatomical scene.
Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye
2016-01-01
This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or be combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection. The conducted case study revealed that, given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviations of 3.83 mm and 5.8 mm respectively. PMID:27271840
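The CNC module above relies on axis-aligned bounding box (AABB) collision detection. As a minimal illustration of that test (a sketch, not the authors' code), two AABBs overlap exactly when their intervals overlap on every axis:

```python
from dataclasses import dataclass

@dataclass
class AABB:
    min_pt: tuple  # (x, y, z) lower corner
    max_pt: tuple  # (x, y, z) upper corner

def aabb_overlap(a: AABB, b: AABB) -> bool:
    """Return True if the two axis-aligned boxes intersect."""
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

# Hypothetical tool and workpiece bounds (units: mm).
tool = AABB((0, 0, 0), (10, 10, 50))
workpiece = AABB((5, 5, 40), (100, 100, 60))
print(aabb_overlap(tool, workpiece))  # True: the boxes share a corner region
```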
Helmet-mounted displays: why haven't they taken off?
NASA Astrophysics Data System (ADS)
Havig, P.; Goff, C.; McIntire, J.; Franck, D.
2009-05-01
Helmet-Mounted Display (HMD) technologies have been developing for over three decades and have been studied for multiple applications ranging from military aircraft to virtual reality, augmented reality, entertainment, and a host of other ideas. It would not be unreasonable to assume that after this much time they would be employed in our daily lives as ubiquitously as the common desktop monitor. However, this is simply not the case. How can this be when they can be used in so many ways for so many tasks? Throughout this work we will look at some of the reasons why, as well as some of the ways they can be used.
Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds.
Wright, W Geoffrey
2014-01-01
Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed.
NASA Astrophysics Data System (ADS)
Tan, Kian Lam; Lim, Chen Kim
2017-10-01
In the last decade, cultural heritage, including historical sites, has been reconstructed into digital heritage. UNESCO defines digital heritage as "cultural, educational, scientific and administrative resources, as well as technical, legal, medical and other kinds of information created digitally, or converted into digital form from existing analogue resources". In addition, digital heritage is doubling in size every two years and is expected to grow tenfold between 2013 and 2020. In order to attract and stir the interest of younger generations in digital heritage, gamification has been widely promoted. In this research, a virtual walkthrough combined with gamification is proposed for learning about and exploring historical places in Malaysia using a mobile device. In conjunction with the Visit Perak 2017 Campaign, this virtual walkthrough is proposed for Kellie's Castle in Perak. The objectives of this research are twofold: 1) modelling and design of an innovative mobile game for a virtual walkthrough application, and 2) attracting tourists to explore and learn about historical places using sophisticated graphics from Augmented Reality. The efficiency and effectiveness of the mobile virtual walkthrough will be assessed by international and local tourists. In conclusion, this research is expected to improve the cultural and historical knowledge of learners.
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.; Brumana, R.; Oreni, D.; Previtali, M.; Roncoroni, F.
2015-08-01
This paper describes a procedure for the generation of a detailed HBIM which is then turned into a model for mobile apps based on augmented and virtual reality. Starting from laser point clouds, photogrammetric data and additional information, a geometric reconstruction with a high level of detail can be carried out by considering the basic requirements of BIM projects (parametric modelling, object relations, attributes). The work aims at demonstrating that a complex HBIM can be managed in portable devices to extract useful information not only for expert operators, but also towards a wider user community interested in cultural tourism.
Moving forward in treatment of posttraumatic stress disorder: innovations to exposure-based therapy.
Nijdam, Mirjam J; Vermetten, Eric
2018-01-01
The field of treatment of posttraumatic stress disorder (PTSD) has been a pacesetter for the changing face of psychotherapy, as is illustrated by the introduction of Virtual Reality Exposure Therapy. This paper outlines a novel approach that builds on a cognitive-motor interaction in a virtual interactive environment. It is based on the theory of memory reconsolidation and the embodiment of cognition. The framework we envision allows the patient to 'step into the past' by using forward motion as an essential ingredient to augment the impact of exposure to traumatic events. The behavioural response of approaching, which is the exact opposite of the avoidance usually applied by patients, and the enhancement of divergent thinking are the most prominent hypothesized mechanisms of action. This can contribute to strengthening of personal efficacy and self-reflection that is generated by high emotional engagement, as well as a sense of accomplishment and enhanced recovery, as illustrated by a clinical case example. We argue that innovations with personalized virtual reality and motion need to be further investigated and implemented in current therapy settings.
Internet virtual studio: low-cost augmented reality system for WebTV
NASA Astrophysics Data System (ADS)
Sitnik, Robert; Pasko, Slawomir; Karaszewski, Maciej; Witkowski, Marcin
2008-02-01
In this paper a concept of an Internet Virtual Studio, a modern system for the production of news, entertainment, educational and training material, is proposed. The system is based on virtual studio technology integrated with a multimedia database, and was developed for web television content production. In successive subsections the general system architecture is discussed, followed by the architecture of each module in turn. The authors describe each module by presenting brief information about its working principles, technical limitations, and capabilities; results produced by each module are shown as exemplary images. Finally, an exemplary short production is presented and discussed.
Recent Development of Augmented Reality in Surgery: A Review.
Vávra, P; Roman, J; Zonča, P; Ihnát, P; Němec, M; Kumar, J; Habib, N; El-Gendi, A
2017-01-01
The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency, safety, and cost, and to enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. We performed a review of available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms "augmented reality" and "surgery." The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. After reading the abstracts, 91 relevant studies were chosen for inclusion; 11 references were gathered by cross-referencing. A total of 102 studies were included in this review. The present literature suggests an increasing interest of surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality is implemented into routine practice.
Implementation of Augmented Reality Technology in Sangiran Museum with Vuforia
NASA Astrophysics Data System (ADS)
Purnomo, F. A.; Santosa, P. I.; Hartanto, R.; Pratisto, E. H.; Purbayu, A.
2018-03-01
Archaeological objects are evidence of ancient life, with ages of millions of years. Ancient objects discovered at Sangiran are preserved by the Sangiran Museum and protected from potential damage. This research develops an Augmented Reality application for the museum that displays virtual information about the ancient objects on display. The content includes information as text, audio, and animated 3D models representing the ancient objects. This study emphasizes the 3D markerless recognition process using the Vuforia Augmented Reality (AR) system, so that visitors can view the exhibition objects from different viewpoints. Based on the test results, by registering image targets at 25° angle intervals, the 3D markerless keypoint features can be detected from different viewpoints. The device must meet minimum specifications of a Dual Core 1.2 GHz processor, Power VR SG5X GPU, 8 MP auto-focus camera, and 1 GB of memory to run the application. The average success rate of the AR application in detecting 3D markerless objects in the museum exhibition was 40% with a single view, and 86% (for angles 0°–180°) and 100% (for angles 0°–360°) with markerless multiview. The application's detection distance ranges from 23 cm up to 540 cm, with an average 3D markerless detection time of 12 seconds.
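As a generic illustration of viewpoint-dependent markerless keypoint detection (a sketch of the general idea only, not the Vuforia pipeline used in the study), the snippet below matches ORB keypoints between a registered target image and a synthetically rotated view; the file name and threshold are placeholders, and the in-plane rotation is only a crude stand-in for a real viewpoint change.

```python
import cv2
import numpy as np

target = cv2.imread("exhibit_target.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder path
if target is None:
    raise SystemExit("Provide a target image of the exhibit object")

h, w = target.shape
M = cv2.getRotationMatrix2D((w / 2, h / 2), 25, 1.0)   # simulate a 25-degree change
view = cv2.warpAffine(target, M, (w, h))

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(target, None)
kp2, des2 = orb.detectAndCompute(view, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
good = [m for m in matches if m.distance < 40]          # crude quality threshold
print(f"{len(good)} good matches out of {len(matches)}")
```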
Projector-Based Augmented Reality for Quality Inspection of Scanned Objects
NASA Astrophysics Data System (ADS)
Kern, J.; Weinmann, M.; Wursthorn, S.
2017-09-01
After scanning or reconstructing the geometry of objects, we need to inspect the result of our work. Are there any parts missing? Is every detail covered in the desired quality? We typically do this by looking at the resulting point clouds or meshes of our objects on-screen. What if we could see the information directly visualized on the object itself? Augmented reality is the generic term for bringing virtual information into our real environment. In our paper, we show how we can project any 3D information, such as thematic visualizations or specific monitoring information, with reference to our object onto the object's surface itself, thus augmenting it with additional information. For small objects that could, for instance, be scanned in a laboratory, we propose a low-cost method involving a projector-camera system to solve this task. The user only needs a calibration board with coded fiducial markers to calibrate the system and to estimate the projector's pose later on for projecting textures with information onto the object's surface. Changes within the projected 3D information or of the projector's pose are applied in real time. Our results clearly reveal that such a simple setup delivers a good quality of augmented information.
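The calibration-board workflow described above can be sketched with standard tools. The snippet below is an illustration under assumed placeholders (intrinsics, camera-to-projector extrinsics, file name), not the authors' implementation, and uses the legacy ArUco API from opencv-contrib-python: detect coded fiducial markers on the board, estimate the board pose with solvePnP, then reproject object points into the projector image via the assumed camera-to-projector transform.

```python
import cv2
import numpy as np

# Assumed intrinsics (placeholders, normally obtained via calibration).
K_cam = np.array([[900.0, 0, 640], [0, 900.0, 360], [0, 0, 1]])
K_proj = np.array([[1400.0, 0, 960], [0, 1400.0, 540], [0, 0, 1]])
dist = np.zeros(5)
# Assumed camera-to-projector extrinsics from a prior calibration.
R_cp, t_cp = np.eye(3), np.array([[0.10], [0.0], [0.0]])   # projector 10 cm to the right

frame = cv2.imread("board_view.png")                        # placeholder camera frame
if frame is None:
    raise SystemExit("Provide a camera image of the calibration board")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
if ids is None:
    raise SystemExit("No fiducial markers detected")

# 3D corners of the first detected marker (40 mm side) in the board frame, metres.
obj_pts = np.array([[0, 0, 0], [0.04, 0, 0], [0.04, 0.04, 0], [0, 0.04, 0]], dtype=np.float32)
img_pts = corners[0].reshape(-1, 2).astype(np.float32)
ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K_cam, dist)

# Board pose in the projector frame: X_proj = R_cp (R_cb X_board + t_cb) + t_cp.
R_cb, _ = cv2.Rodrigues(rvec)
R_pb = R_cp @ R_cb
t_pb = R_cp @ tvec + t_cp
scan_pts = np.array([[0.02, 0.02, 0.05]], dtype=np.float32)  # a point on the scanned object
proj_px, _ = cv2.projectPoints(scan_pts, cv2.Rodrigues(R_pb)[0], t_pb, K_proj, dist)
print("Projector pixel for the object point:", proj_px.ravel())
```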
ERIC Educational Resources Information Center
Auld, Lawrence W. S.; Pantelidis, Veronica S.
1994-01-01
Describes the Virtual Reality and Education Lab (VREL) established at East Carolina University to study the implications of virtual reality for elementary and secondary education. Highlights include virtual reality software evaluation; hardware evaluation; computer-based curriculum objectives which could use virtual reality; and keeping current…
Seamless 3D interaction for virtual tables, projection planes, and CAVEs
NASA Astrophysics Data System (ADS)
Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III
2000-08-01
The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered, unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations to the application of virtual reality (VR) technology to time-critical applications, as well as to employment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally such employment scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high degree of flexibility with respect to the system requirements (display and I/O devices), as well as the ability to seamlessly and intuitively switch between different interaction modalities, is sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments, focusing on the Virtual Table visual input device. It introduces new paradigms for 3D interaction in virtual environments (VE) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing on possible use cases for defense applications.
A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note.
Abe, Yuichiro; Sato, Shigenobu; Kato, Koji; Hyakumachi, Takahiko; Yanagibashi, Yasushi; Ito, Manabu; Abumi, Kuniyoshi
2013-10-01
Augmented reality (AR) is an imaging technology by which virtual objects are overlaid onto images of real objects captured in real time by a tracking camera. This study aimed to introduce a novel AR guidance system called virtual protractor with augmented reality (VIPAR) to visualize a needle trajectory in 3D space during percutaneous vertebroplasty (PVP). The AR system used for this study comprised a head-mount display (HMD) with a tracking camera and a marker sheet. An augmented scene was created by overlaying the preoperatively generated needle trajectory path onto a marker detected on the patient using AR software, thereby providing the surgeon with augmented views in real time through the HMD. The accuracy of the system was evaluated by using a computer-generated simulation model in a spine phantom and also evaluated clinically in 5 patients. In the 40 spine phantom trials, the error of the insertion angle (EIA), defined as the difference between the attempted angle and the insertion angle, was evaluated using 3D CT scanning. Computed tomography analysis of the 40 spine phantom trials showed that the EIA in the axial plane significantly improved when VIPAR was used compared with when it was not used (0.96° ± 0.61° vs 4.34° ± 2.36°, respectively). The same held true for EIA in the sagittal plane (0.61° ± 0.70° vs 2.55° ± 1.93°, respectively). In the clinical evaluation of the AR system, 5 patients with osteoporotic vertebral fractures underwent VIPAR-guided PVP from October 2011 to May 2012. The postoperative EIA was evaluated using CT. The clinical results of the 5 patients showed that the EIA in all 10 needle insertions was 2.09° ± 1.3° in the axial plane and 1.98° ± 1.8° in the sagittal plane. There was no pedicle breach or leakage of polymethylmethacrylate. VIPAR was successfully used to assist in needle insertion during PVP by providing the surgeon with an ideal insertion point and needle trajectory through the HMD. The findings indicate that AR guidance technology can become a useful assistive device during spine surgeries requiring percutaneous procedures.
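The error of the insertion angle (EIA) reported above is the difference between the attempted and achieved needle angles, evaluated in the axial and sagittal planes. A small worked sketch of that computation follows; the plane conventions and example vectors are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def plane_angle(v, drop_axis):
    """Angle (degrees) of a 3D direction projected onto the plane
    obtained by dropping the given axis (0=x, 1=y, 2=z)."""
    keep = [i for i in range(3) if i != drop_axis]
    p = np.asarray(v, dtype=float)[keep]
    return np.degrees(np.arctan2(p[1], p[0]))

def insertion_angle_error(planned, actual):
    # Assumed convention: axial plane = x-y (drop z), sagittal plane = y-z (drop x).
    axial = abs(plane_angle(planned, 2) - plane_angle(actual, 2))
    sagittal = abs(plane_angle(planned, 0) - plane_angle(actual, 0))
    return axial, sagittal

planned = [0.00, 1.0, -0.30]     # hypothetical planned trajectory
actual  = [0.03, 1.0, -0.26]     # hypothetical achieved trajectory
ax, sag = insertion_angle_error(planned, actual)
print(f"EIA axial: {ax:.2f} deg, sagittal: {sag:.2f} deg")
```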
Gupta, Anita; Scott, Kevin; Dukewich, Matthew
2018-01-01
Virtual reality (VR) is an exciting new technology with almost endless possible uses in medicine. One area in which it has shown promise is pain management. This selective review focused on studies that provided evidence for the distraction or nondistraction mechanisms by which VR contributes to the treatment of pain. The review looked at articles from 2000 to July 29, 2016, focusing on studies concerning mechanisms by which virtual reality can augment pain relief. The data were collected through a search of MEDLINE and Web of Science using the key words "virtual reality" and "pain" or "distraction." Six studies were identified: four small randomized controlled studies and two prospective/pilot studies. The search results provided evidence that distraction is a technique by which VR can have benefits in the treatment of pain. Both adult and pediatric populations were included in these studies. In addition to acute pain, several studies looked at chronic pain states such as headaches or fibromyalgia. These studies also combined VR with other treatment modalities such as biofeedback mechanisms and cognitive behavioral therapy. The results demonstrate that in addition to distraction, there are novel mechanisms for VR treatment in pain, such as producing neurophysiologic changes related to conditioning and exposure therapies. If these new mechanisms can lead to new treatment options for patients with chronic pain, VR may have the ability to help reduce opioid use and misuse among chronic pain patients. More studies are needed to reproduce results from prospective/pilot studies in large randomized controlled studies. © 2017 American Academy of Pain Medicine. All rights reserved.
NASA Astrophysics Data System (ADS)
Clay, Alexis; Delord, Elric; Couture, Nadine; Domenger, Gaël
We describe the joint research that we conduct in gesture-based emotion recognition and virtual augmentation of a stage, bridging together the fields of computer science and dance. After establishing a common ground for dialogue, we could conduct a research process that equally benefits both fields. As computer scientists, we find dance a perfect application case: dancers' artistic creativity orients our research choices. As dancers, computer science provides new tools for creativity and, more importantly, a new point of view that forces us to reconsider dance from its fundamentals. In this paper we hence describe our scientific work and its implications for dance. We provide an overview of our system to augment a ballet stage, taking a dancer's emotion into account. To illustrate our work in both fields, we describe three events that mixed dance, emotion recognition and augmented reality.
Augmented Reality Implementation in Watch Catalog as e-Marketing Based on Mobile Aplication
NASA Astrophysics Data System (ADS)
Adrianto, D.; Luwinda, F. A.; Yesmaya, V.
2017-01-01
Augmented Reality is one of the important methods to provide users with a better interactive user interface. In this research, Augmented Reality in a mobile application is applied to provide users with useful information related to a watch catalogue. This research is focused on the design and implementation of an application using Augmented Reality. The process model used in this research is Extreme Programming, which has several steps: planning, design, coding, and testing. The result of this research is an Augmented Reality application based on Android. This research concludes that the implementation of Android-based Augmented Reality in the watch catalogue helps customers collect useful information related to a specific watch.
Augmented Reality Technology Using Microsoft HoloLens in Anatomic Pathology.
Hanna, Matthew G; Ahmed, Ishtiaque; Nine, Jeffrey; Prajapati, Shyam; Pantanowitz, Liron
2018-05-01
Context Augmented reality (AR) devices such as the Microsoft HoloLens have not been well used in the medical field. Objective To test the HoloLens for clinical and nonclinical applications in pathology. Design A Microsoft HoloLens was tested for virtual annotation during autopsy, viewing 3D gross and microscopic pathology specimens, navigating whole slide images, telepathology, as well as real-time pathology-radiology correlation. Results Pathology residents performing an autopsy wearing the HoloLens were remotely instructed with real-time diagrams, annotations, and voice instruction. 3D-scanned gross pathology specimens could be viewed as holograms and easily manipulated. Telepathology was supported during gross examination and at the time of intraoperative consultation, allowing users to remotely access a pathologist for guidance and to virtually annotate areas of interest on specimens in real-time. The HoloLens permitted radiographs to be coregistered on gross specimens and thereby enhanced locating important pathologic findings. The HoloLens also allowed easy viewing and navigation of whole slide images, using an AR workstation, including multiple coregistered tissue sections facilitating volumetric pathology evaluation. Conclusions The HoloLens is a novel AR tool with multiple clinical and nonclinical applications in pathology. The device was comfortable to wear, easy to use, provided sufficient computing power, and supported high-resolution imaging. It was useful for autopsy, gross and microscopic examination, and ideally suited for digital pathology. Unique applications include remote supervision and annotation, 3D image viewing and manipulation, telepathology in a mixed-reality environment, and real-time pathology-radiology correlation.
2006-06-01
allowing substantial see-around capability. Regions of visual suppression due to binocular rivalry (luning) are shown along the shaded flanks of...that the visual suppression of binocular rivalry, luning, (Velger, 1998, p. 56-58) associated with the partial overlap conditions did not materially...tags were displayed. Thus, the frequency of conflicting binocular contours was reduced. In any case, luning does not seem to introduce major
Human Factors Issues in the Use of Virtual and Augmented Reality for Military Purposes - USA
2005-12-01
and provide a means of output, MOVES has built a prototype system and continues research into the artificial intelligence and other factors required...role in any attempt to create automaton warriors. Indeed game-theoretic notions have been utilized in applications of artificial intelligence to...Review Board at the Defense Intelligence Agency (DIA). AFRL was notified that DIA will sponsor DTNG for Certification and Accreditation. Det 4 is expected
The Effect of an Augmented Reality Enhanced Mathematics Lesson on Student Achievement and Motivation
ERIC Educational Resources Information Center
Estapa, Anne; Nadolny, Larysa
2015-01-01
The purpose of the study was to assess student achievement and motivation during a high school augmented reality mathematics activity focused on dimensional analysis. Included in this article is a review of the literature on the use of augmented reality in mathematics and the combination of print with augmented reality, also known as interactive…
Evaluating Augmented Reality to Complete a Chain Task for Elementary Students with Autism
ERIC Educational Resources Information Center
Cihak, David F.; Moore, Eric J.; Wright, Rachel E.; McMahon, Don D.; Gibbons, Melinda M.; Smith, Cate
2016-01-01
The purpose of this study was to examine the effects of augmented reality to teach a chain task to three elementary-age students with autism spectrum disorders (ASDs). Augmented reality blends digital information within the real world. This study used a marker-based augmented reality picture prompt to trigger a video model clip of a student…
Adaptive multimodal interaction in mobile augmented reality: A conceptual framework
NASA Astrophysics Data System (ADS)
Abidin, Rimaniza Zainal; Arshad, Haslina; Shukri, Saidatul A'isyah Ahmad
2017-10-01
Augmented Reality (AR) has recently become an emerging technology in many mobile applications. Mobile AR has been defined as a medium for displaying information merged with the real-world environment in a single view. There are four main types of mobile augmented reality interfaces, and one of them is multimodal interfaces. A multimodal interface processes two or more combined user input modes (such as speech, pen, touch, manual gesture, gaze, and head and body movements) in a coordinated manner with multimedia system output. Many frameworks have been proposed to guide designers in developing multimodal applications, including in augmented reality environments, but there has been little work reviewing frameworks for adaptive multimodal interfaces in mobile augmented reality. The main goal of this study is to propose a conceptual framework to illustrate the adaptive multimodal interface in mobile augmented reality. We reviewed several frameworks that have been proposed in the fields of multimodal interfaces, adaptive interfaces and augmented reality. We analyzed the components in the previous frameworks and assessed which can be applied on mobile devices. Our framework can be used as a guide for designers and developers to develop a mobile AR application with an adaptive multimodal interface.
Ortiz-Catalan, Max; Sander, Nichlas; Kristoffersen, Morten B.; Håkansson, Bo; Brånemark, Rickard
2014-01-01
A variety of treatments have been historically used to alleviate phantom limb pain (PLP) with varying efficacy. Recently, virtual reality (VR) has been employed as a more sophisticated mirror therapy. Despite the advantages of VR over a conventional mirror, this approach has retained the use of the contralateral limb and is therefore restricted to unilateral amputees. Moreover, this strategy disregards the actual effort made by the patient to produce phantom motions. In this work, we investigate a treatment in which the virtual limb responds directly to myoelectric activity at the stump, while the illusion of a restored limb is enhanced through augmented reality (AR). Further, phantom motions are facilitated and encouraged through gaming. The proposed set of technologies was administered to a chronic PLP patient who has shown resistance to a variety of treatments (including mirror therapy) for 48 years. Individual and simultaneous phantom movements were predicted using myoelectric pattern recognition and were then used as input for VR and AR environments, as well as for a racing game. The sustained level of pain reported by the patient was gradually reduced to complete pain-free periods. The phantom posture initially reported as a strongly closed fist was gradually relaxed, interestingly resembling the neutral posture displayed by the virtual limb. The patient acquired the ability to freely move his phantom limb, and a telescopic effect was observed where the position of the phantom hand was restored to the anatomically correct distance. More importantly, the effect of the interventions was positively and noticeably perceived by the patient and his relatives. Despite the limitation of a single case study, the successful results of the proposed system in a patient for whom other medical and non-medical treatments have been ineffective justifies and motivates further investigation in a wider study. PMID:24616655
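The phantom movements above were decoded with myoelectric pattern recognition. The sketch below is a generic illustration of that idea, time-domain features plus a linear classifier on windowed EMG, and is not the authors' system; the synthetic signals, channel counts, and movement labels are placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def features(window):
    """Classic time-domain EMG features per channel: mean absolute value,
    waveform length, and zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])

def synthetic_window(gain, n=200, channels=4):
    """Placeholder EMG: noise whose per-channel amplitude depends on the movement."""
    return rng.normal(0, 1, (n, channels)) * gain

movements = {0: [1.0, 0.2, 0.2, 0.2],   # e.g. "hand open"
             1: [0.2, 1.0, 0.2, 0.2],   # e.g. "hand close"
             2: [0.2, 0.2, 1.0, 0.2]}   # e.g. "wrist rotation"

X = np.array([features(synthetic_window(movements[m])) for m in movements for _ in range(40)])
y = np.array([m for m in movements for _ in range(40)])

clf = LinearDiscriminantAnalysis().fit(X, y)
test = features(synthetic_window(movements[1]))
print("Predicted movement class:", clf.predict([test])[0])
```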
Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos
2013-12-01
Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.
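The shaft-tracking step described above, a combined Hough–Kalman approach, can be illustrated with standard OpenCV primitives. This is a hedged sketch of the general idea rather than the authors' algorithm: detect the dominant straight edge of the shaft with a probabilistic Hough transform, take its endpoint as a 2D measurement, and smooth it with a constant-velocity Kalman filter. The video path is a placeholder.

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter over (x, y, vx, vy), measuring (x, y).
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1.0

def shaft_endpoint(frame_gray):
    """Return the endpoint of the longest Hough line segment (a crude shaft proxy)."""
    edges = cv2.Canny(frame_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)
    if lines is None:
        return None
    x1, y1, x2, y2 = max(lines[:, 0, :], key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return np.array([[x2], [y2]], dtype=np.float32)

cap = cv2.VideoCapture("training_task.mp4")   # placeholder endoscopic video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    prediction = kf.predict()
    z = shaft_endpoint(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    estimate = kf.correct(z) if z is not None else prediction
    print("Smoothed tip estimate (px):", estimate[:2].ravel())
cap.release()
```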
Current status of robotic simulators in acquisition of robotic surgical skills.
Kumar, Anup; Smith, Roger; Patel, Vipul R
2015-03-01
This article provides an overview of the current status of simulator systems in the robotic surgery training curriculum, covering available simulators for training and their comparison, new technologies introduced in simulation, concepts of training, and existing challenges and future perspectives of simulator training in robotic surgery. The different virtual reality simulators available in the market, such as dVSS, dVT, RoSS, ProMIS and SEP, have shown face, content and construct validity in robotic skills training for novices outside the operating room. Recently, augmented reality simulators such as HoST, Maestro AR and RobotiX Mentor have been introduced in robotic training, providing a more realistic operating environment and placing greater emphasis on procedure-specific robotic training. Further, the Xperience Team Trainer, which provides training to the console surgeon and bed-side assistant simultaneously, has recently been introduced to emphasize the importance of teamwork and proper coordination. Simulator training holds an important place in the current robotic training curriculum of future robotic surgeons. There is a need for more procedure-specific augmented reality simulator training, utilizing advancements in computing and graphical capabilities for new innovations in simulator technology. Further studies are required to establish its cost-benefit ratio along with concurrent and predictive validity.
A brief review of augmented reality science learning
NASA Astrophysics Data System (ADS)
Gopalan, Valarmathie; Bakar, Juliana Aida Abu; Zulkifli, Abdul Nasir
2017-10-01
This paper reviews several studies concerning theories and models that could be applied to motivate science learning for upper secondary school learners (16-17 years old), in order to make the learning experience more engaging and useful. Embedding AR in science learning could bring an inspiring transformation in learners' viewpoints towards the respective subject matter. Augmented Reality is able to present real and virtual learning experiences with the addition of multiple media, without replacing the real environment. Due to this unique feature, AR has attracted wide attention from researchers seeking to implement it in science learning. The technology offers learners rich visualization and provides a transparent learning experience by bringing to light otherwise unseen perspectives of the learning content. This paper aims to propose related theoretical guidance that could be applied to science motivation to transform learning in an effective way, and should be of interest to researchers and academicians in the related disciplines.
Online tracking of outdoor lighting variations for augmented reality with moving cameras.
Liu, Yanli; Granier, Xavier
2012-04-01
In augmented reality, one of the key tasks in achieving a convincing visual consistency between virtual objects and video scenes is to maintain coherent illumination along the whole sequence. As outdoor illumination is largely dependent on the weather, the lighting conditions may change from frame to frame. In this paper, we propose a fully image-based approach for online tracking of outdoor illumination variations from videos captured with moving cameras. Our key idea is to estimate the relative intensities of sunlight and skylight via a sparse set of planar feature points extracted from each frame. To address the inevitable feature misalignments, a set of constraints is introduced to select the most reliable ones. Exploiting the spatial and temporal coherence of illumination, the relative intensities of sunlight and skylight are finally estimated using an optimization process. We validate our technique on a set of real-life videos and show that the results with our estimations are visually coherent along the video sequences.
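The core estimation step described above can be illustrated with a simplified Lambertian model (an assumption on our part; the paper's actual model and optimization are more elaborate): each planar feature point's observed intensity is treated as albedo times sunlight intensity scaled by the cosine of the sun angle, plus skylight, and the two relative intensities are recovered by non-negative least squares.

```python
import numpy as np
from scipy.optimize import nnls

def estimate_sun_sky(intensities, normals, sun_dir, albedo=1.0):
    """Solve I_k ~= albedo * (I_sun * max(n_k . l, 0) + I_sky) for (I_sun, I_sky)."""
    sun_dir = np.asarray(sun_dir, dtype=float)
    sun_dir /= np.linalg.norm(sun_dir)
    cos_term = np.clip(normals @ sun_dir, 0.0, None)          # shadowed points contribute 0
    A = albedo * np.column_stack([cos_term, np.ones_like(cos_term)])
    (i_sun, i_sky), _ = nnls(A, np.asarray(intensities, dtype=float))
    return i_sun, i_sky

# Synthetic check: a ground plane and two wall orientations, sun 45 degrees up.
normals = np.array([[0, 0, 1], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
sun = np.array([0.5, 0.0, 0.5])
true_sun, true_sky = 3.0, 0.8
obs = true_sun * np.clip(normals @ (sun / np.linalg.norm(sun)), 0, None) + true_sky
print(estimate_sun_sky(obs, normals, sun))   # ~ (3.0, 0.8)
```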
Functional Reflective Polarizer for Augmented Reality and Color Vision Deficiency
2016-03-03
Zhu, Ruidong; Tan, Guanjun; Yuan, Jiamin; Wu, Shin-Tson
...polarizer that can be incorporated into a compact augmented reality system. The design principle of the functional reflective polarizer is explained and...augmented reality system is relatively high as compared to a polarizing beam splitter or a conventional reflective polarizer. Such a functional reflective
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
2009-01-01
Many studies have been conducted on the use of virtual reality in education and training. This article lists examples of such research. Reasons to use virtual reality are discussed. Advantages and disadvantages of using virtual reality are presented, as well as suggestions on when to use and when not to use virtual reality. A model that can be…
Aggarwal, Rajesh; Balasundaram, Indran; Darzi, Ara
2008-03-01
Within the past decade, there has been increasing interest in simulation-based devices for training and assessment of technical skills, especially for minimally invasive techniques such as laparoscopy. The aim of this study was to investigate the perceptions of senior and junior surgeons of virtual reality simulation within the context of current training opportunities for basic laparoscopic procedures. A postal questionnaire was sent to 245 consultants and their corresponding specialist registrars (SpRs), detailing laparoscopic surgical practice and their knowledge and use of virtual reality (VR) surgical simulators. One hundred ninety-one (78%) consultants and 103 (42%) SpRs returned questionnaires; 16% (10/61) of junior SpRs (years 1-4) had performed more than 50 laparoscopic cholecystectomies to date compared with 76% (32/42) of senior SpRs (years 5-6) (P < 0.001); 90% (55/61) of junior SpRs and 67% (28/42) of senior SpRs were keen to augment their training with VR (P = 0.007); 81% (238/294) of all surgeons agreed that VR has a useful role in the laparoscopic surgical training curriculum. There is a lack of experience in index laparoscopic cases among junior SpRs, and laparoscopic VR simulation is recognized as a useful mode of practice for acquiring technical skills. This should encourage surgical program directors to drive the integration of simulation-based training into the surgical curriculum.
Grooms, Dustin R; Kiefer, Adam W; Riley, Michael A; Ellis, Jonathan D; Thomas, Staci; Kitchen, Katie; DiCesare, Christopher; Bonnette, Scott; Gadd, Brooke; Barber Foss, Kim D; Yuan, Weihong; Silva, Paula; Galloway, Ryan; Diekfuss, Jed; Leach, James; Berz, Kate; Myer, Gregory D
2018-03-27
A limiting factor for reducing anterior cruciate ligament (ACL) injury risk is ensuring that the movement adaptations made during the prevention program transfer to sport-specific activity. Virtual reality provides a mechanism to assess transferability, and neuroimaging provides a means to assay the neural processes allowing for such skill transfer. The aim was to determine the neural mechanisms by which injury-risk-reducing biomechanics transfer to sport after ACL injury prevention training. Design: Cohort study. Setting: Research laboratory. Participants: Four healthy high school soccer athletes. Participants completed augmented neuromuscular training utilizing real-time visual feedback. An unloaded knee extension task and a loaded leg-press task were completed with neuroimaging before and after training. A virtual reality soccer-specific landing task was also completed following training to assess transfer of movement mechanics. Outcome measures were landing mechanics during the virtual reality soccer task and blood oxygen level-dependent signal change during neuroimaging. Increased motor planning, sensory and visual region activity during unloaded knee extension and decreased motor cortex activity during loaded leg-press were highly correlated with improvements in landing mechanics (decreased hip adduction and knee rotation). Changes in brain activity may underlie adaptation and transfer of injury-risk-reducing movement mechanics to sport activity. Clinicians may be able to target these specific brain processes with adjunctive therapy to facilitate intervention improvements transferring to sport.
Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.
Aromaa, Susanna; Väänänen, Kaisa
2016-09-01
In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics (HFE) evaluation during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of the prototypes for HFE evaluation. Results indicate that the system model characteristics and user interface affect the experienced suitability. The VE system was valued as more suitable for supporting the assessment of visibility, reach, and the use of tools than the AR system. The findings of this study can be used as guidance for implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.
Technological advances in robotic-assisted laparoscopic surgery.
Tan, Gerald Y; Goel, Raj K; Kaouk, Jihad H; Tewari, Ashutosh K
2009-05-01
In this article, the authors describe the evolution of urologic robotic systems and the current state-of-the-art features and existing limitations of the da Vinci S HD System (Intuitive Surgical, Inc.). They then review promising innovations in scaling down the footprint of robotic platforms, the early experience with mobile miniaturized in vivo robots, advances in endoscopic navigation systems using augmented reality technologies and tracking devices, the emergence of technologies for robotic natural orifice transluminal endoscopic surgery and single-port surgery, advances in flexible robotics and haptics, the development of new virtual reality simulator training platforms compatible with the existing da Vinci system, and recent experiences with remote robotic surgery and telestration.
Augmented Reality Imaging System: 3D Viewing of a Breast Cancer.
Douglas, David B; Boone, John M; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene
2016-01-01
To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. A case of breast cancer imaged using contrast-enhanced breast CT (computed tomography) was viewed with the augmented reality imaging system, which uses a head display unit (HDU) and a joystick control interface. The augmented reality system demonstrated 3D viewing of the breast mass with head position tracking, stereoscopic depth perception, focal point convergence, a 3D cursor, and joystick-enabled fly-through with visualization of the spiculations extending from the breast cancer. The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be further researched to determine its utility in clinical practice.
Modulation of thermal pain-related brain activity with virtual reality: evidence from fMRI.
Hoffman, Hunter G; Richards, Todd L; Coda, Barbara; Bills, Aric R; Blough, David; Richards, Anne L; Sharar, Sam R
2004-06-07
This study investigated the neural correlates of virtual reality analgesia. Virtual reality significantly reduced subjective pain ratings (i.e. analgesia). Using fMRI, pain-related brain activity was measured for each participant during conditions of no virtual reality and during virtual reality (order randomized). As predicted, virtual reality significantly reduced pain-related brain activity in all five regions of interest; the anterior cingulate cortex, primary and secondary somatosensory cortex, insula, and thalamus (p<0.002, corrected). Results showed direct modulation of human brain pain responses by virtual reality distraction. Copyright 2004 Lippincott Williams and Wilkins
Confronting an Augmented Reality
ERIC Educational Resources Information Center
Munnerley, Danny; Bacon, Matt; Wilson, Anna; Steele, James; Hedberg, John; Fitzgerald, Robert
2012-01-01
How can educators make use of augmented reality technologies and practices to enhance learning and why would we want to embrace such technologies anyway? How can an augmented reality help a learner confront, interpret and ultimately comprehend reality itself? In this article, we seek to initiate a discussion that focuses on these questions, and…
De Luca, Rosaria; Torrisi, Michele; Piccolo, Adriana; Bonfiglio, Giovanni; Tomasello, Provvidenza; Naro, Antonino; Calabrò, Rocco Salvatore
2017-10-11
Cognitive impairment, as well as mood and anxiety disorders, occurs frequently in patients following stroke. The aim of this study was to evaluate the effects of a combined rehabilitative treatment using conventional relaxation and respiratory techniques in a specific rehabilitative virtual environment (using Bts-Nirvana). A 58-year-old woman, affected by hemorrhagic stroke, underwent two different rehabilitation trainings, including either standard relaxation techniques alone in a common clinical setting or the same psychological approach in a semi-immersive virtual environment with augmented sensorial (audio-video) and motor feedback (sensory-motor interaction). We evaluated the patient's cognitive and psychological profile before and after the two different trainings, using a specific psychometric battery aimed at assessing cognitive status and attention processes and estimating the presence of mood alterations, anxiety and coping strategies. Only at the end of the combined approach did we observe a significant improvement in attention and memory functions, with nearly complete relief of anxiety symptoms and an improvement in coping strategies. Relaxation and respiratory techniques in a semi-immersive virtual reality environment, using Bts-Nirvana, may be a promising tool for improving attention processes, coping strategies, and anxiety in individuals with neurological disorders, including stroke.
Virtual Reality As a Training Tool to Treat Physical Inactivity in Children.
Kiefer, Adam W; Pincus, David; Richardson, Michael J; Myer, Gregory D
2017-01-01
Lack of adequate physical activity in children is an epidemic that can result in obesity and other poor health outcomes across the lifespan. Physical activity interventions focused on motor skill competence continue to be developed, but some interventions, such as neuromuscular training (NMT), may be limited in how early they can be implemented due to dependence on the child's level of cognitive and perceptual-motor development. Early implementation of motor-rich activities that support motor skill development in children is critical for the development of healthy levels of physical activity that carry through into adulthood. Virtual reality (VR) training may be beneficial in this regard. VR training, when grounded in an information-based theory of perceptual-motor behavior that modifies the visual information in the virtual world, can promote early development of motor skills in youth akin to more natural, real-world development as opposed to strictly formalized training. This approach can be tailored to the individual child and training scenarios can increase in complexity as the child develops. Ultimately, training in VR may help serve as a precursor to "real-world" NMT, and once the child reaches the appropriate training age can also augment more complex NMT regimens performed outside of the virtual environment.
The status of augmented reality in laparoscopic surgery as of 2016.
Bernhardt, Sylvain; Nicolau, Stéphane A; Soler, Luc; Doignon, Christophe
2017-04-01
This article provides a comprehensive review of the methods proposed in the literature for augmented reality in intra-abdominal minimally invasive surgery (also known as laparoscopic surgery). A solid background of surgical augmented reality is first provided in order to support the survey. Then, the various methods of laparoscopic augmented reality as well as their key tasks are categorized in order to better grasp the current landscape of the field. Finally, the various issues gathered from these reviewed approaches are organized in order to outline the remaining challenges of augmented reality in laparoscopic surgery. Copyright © 2017 Elsevier B.V. All rights reserved.
Recent Development of Augmented Reality in Surgery: A Review
Vávra, P.; Zonča, P.; Ihnát, P.; El-Gendi, A.
2017-01-01
Introduction: The development of augmented reality devices allows physicians to incorporate data visualization into diagnostic and treatment procedures to improve work efficiency, safety, and cost-effectiveness and to enhance surgical training. However, awareness of the possibilities of augmented reality is generally low. This review evaluates whether augmented reality can presently improve the results of surgical procedures. Methods: We performed a review of the available literature dating from 2010 to November 2016 by searching PubMed and Scopus using the terms “augmented reality” and “surgery.” Results: The initial search yielded 808 studies. After removing duplicates and including only journal articles, a total of 417 studies were identified. By reading the abstracts, 91 relevant studies were chosen for inclusion; a further 11 references were gathered by cross-referencing, giving a total of 102 studies included in this review. Conclusions: The present literature suggests increasing interest among surgeons in employing augmented reality in surgery, leading to improved safety and efficacy of surgical procedures. Many studies showed that the performance of newly devised augmented reality systems is comparable to traditional techniques. However, several problems need to be addressed before augmented reality can be implemented in routine practice. PMID:29065604
A Critical Review of the Use of Virtual Reality in Construction Engineering Education and Training.
Wang, Peng; Wu, Peng; Wang, Jun; Chi, Hung-Lin; Wang, Xiangyu
2018-06-08
Virtual Reality (VR) has been rapidly recognized and implemented in construction engineering education and training (CEET) in recent years due to its benefits of providing an engaging and immersive environment. The objective of this review is to critically collect and analyze the VR applications in CEET, covering all VR-related journal papers published from 1997 to 2017. The review follows a three-stage systematic analysis of VR technologies, applications and future directions. It is found that the VR technologies adopted for CEET have evolved over time, from desktop-based VR, immersive VR and 3D game-based VR to Building Information Modelling (BIM)-enabled VR. A sibling technology, Augmented Reality (AR), has also emerged for CEET adoption in recent years. These technologies have been applied in architecture and design visualization, construction health and safety training, equipment and operational task training, as well as structural analysis. Future research directions, including the integration of VR with emerging education paradigms and visualization technologies, are also provided. The findings can help researchers and educators integrate VR into their education and training programs to improve training performance.
Designing for Virtual Windows in a Deep Space Habitat
NASA Technical Reports Server (NTRS)
Howe, A. Scott; Howard, Robert L.; Moore, Nathan; Amoroso, Michael
2013-01-01
This paper discusses configurations and test analogs toward the design of a virtual window capability in a Deep Space Habitat. Long-duration space missions will require crews to remain in the confines of a spacecraft for extended periods of time, with possible harmful effects if a crewmember cannot cope with the small habitable volume. Virtual windows expand perceived volume using a minimal amount of image projection equipment and computing resources, and allow a limited immersion in remote environments. Uses for the virtual window include: live or augmented reality views of the external environment; flight deck, piloting, observation, or other participation in remote missions through live transmission of cameras mounted to remote vehicles; pre-recorded background views of nature areas, seasonal occurrences, or cultural events; and pre-recorded events such as birthdays, anniversaries, and other meaningful events prepared by ground support and families of the crewmembers.
Freeman, Daniel; Bradley, Jonathan; Antley, Angus; Bourke, Emilie; DeWeever, Natalie; Evans, Nicole; Černis, Emma; Sheaves, Bryony; Waite, Felicity; Dunn, Graham; Slater, Mel; Clark, David M
2016-07-01
Persecutory delusions may be unfounded threat beliefs maintained by safety-seeking behaviours that prevent disconfirmatory evidence being successfully processed. Use of virtual reality could facilitate new learning. To test the hypothesis that enabling patients to test the threat predictions of persecutory delusions in virtual reality social environments with the dropping of safety-seeking behaviours (virtual reality cognitive therapy) would lead to greater delusion reduction than exposure alone (virtual reality exposure). Conviction in delusions and distress in a real-world situation were assessed in 30 patients with persecutory delusions. Patients were then randomised to virtual reality cognitive therapy or virtual reality exposure, both with 30 min in graded virtual reality social environments. Delusion conviction and real-world distress were then reassessed. In comparison with exposure, virtual reality cognitive therapy led to large reductions in delusional conviction (reduction 22.0%, P = 0.024, Cohen's d = 1.3) and real-world distress (reduction 19.6%, P = 0.020, Cohen's d = 0.8). Cognitive therapy using virtual reality could prove highly effective in treating delusions. © The Royal College of Psychiatrists 2016.
Virtual reality and paranoid ideations in people with an 'at-risk mental state' for psychosis.
Valmaggia, Lucia R; Freeman, Daniel; Green, Catherine; Garety, Philippa; Swapp, David; Antley, Angus; Prescott, Corinne; Fowler, David; Kuipers, Elizabeth; Bebbington, Paul; Slater, Mel; Broome, Matthew; McGuire, Philip K
2007-12-01
Virtual reality provides a means of studying paranoid thinking in controlled laboratory conditions. However, this method has not been used with a clinical group. To establish the feasibility and safety of using virtual reality methodology in people with an at-risk mental state and to investigate the applicability of a cognitive model of paranoia to this group. Twenty-one participants with an at-risk mental state were assessed before and after entering a virtual reality environment depicting the inside of an underground train. Virtual reality did not raise levels of distress at the time of testing or cause adverse experiences over the subsequent week. Individuals attributed mental states to virtual reality characters including hostile intent. Persecutory ideation in virtual reality was predicted by higher levels of trait paranoia, anxiety, stress, immersion in virtual reality, perseveration and interpersonal sensitivity. Virtual reality is an acceptable experimental technique for use with individuals with at-risk mental states. Paranoia in virtual reality was understandable in terms of the cognitive model of persecutory delusions.
Augmented reality 3D display based on integral imaging
NASA Astrophysics Data System (ADS)
Deng, Huan; Zhang, Han-Le; He, Min-Yang; Wang, Qiong-Hua
2017-02-01
Integral imaging (II) is a good candidate for augmented reality (AR) display, since it provides various physiological depth cues so that viewers can freely change the accommodation and convergence between the virtual three-dimensional (3D) images and the real-world scene without feeling any visual discomfort. We propose two AR 3D display systems based on the theory of II. In the first AR system, a micro II display unit reconstructs a micro 3D image, and the micro 3D image is magnified by a convex lens. The lateral and depth distortions of the magnified 3D image are analyzed and resolved by pitch scaling and depth scaling. The magnified 3D image and the real 3D scene are overlapped by using a half-mirror to realize AR 3D display. The second AR system uses a micro-lens array holographic optical element (HOE) as an image combiner. The HOE is a volume holographic grating which functions as a micro-lens array for Bragg-matched light and as transparent glass for Bragg-mismatched light. A reference beam can reproduce a virtual 3D image from one side, and a reference beam with conjugated phase can reproduce a second 3D image from the other side of the micro-lens array HOE, giving the system a double-sided 3D display capability.
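For context on why lateral (pitch) and depth scaling must be handled separately when a convex lens magnifies the reconstructed micro 3D image, the standard thin-lens relations below (a textbook sketch with generic symbols, not the authors' derivation; magnitudes only) show that depth is magnified by roughly the square of the lateral factor:

```latex
% Standard thin-lens relations (magnitudes only; not the paper's derivation)
\begin{align}
  \frac{1}{d_o} + \frac{1}{d_i} &= \frac{1}{f}
    && \text{thin-lens equation, object/image distances } d_o, d_i \\
  M_{\mathrm{lat}} &= \frac{d_i}{d_o}
    && \text{lateral magnification (governs the pitch scaling)} \\
  M_{\mathrm{depth}} &\approx M_{\mathrm{lat}}^{2}
    && \text{longitudinal magnification (governs the depth scaling)}
\end{align}
```

This asymmetry is what motivates applying distinct pitch-scaling and depth-scaling corrections to the magnified image.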
Zhu, Ming; Chai, Gang; Lin, Li; Xin, Yu; Tan, Andy; Bogari, Melia; Zhang, Yan; Li, Qingfeng
2016-12-01
Augmented reality (AR) technology can superimpose a virtual image generated by a computer onto the real operating field to present an integrated image and enhance surgical safety. The purpose of our study was to develop a novel AR-based navigation system for craniofacial surgery. We focused on orbital hypertelorism correction, because the surgery requires high precision and is considered challenging even for senior craniofacial surgeons. Twelve patients with orbital hypertelorism were selected. The preoperative computed tomography data were imported into a 3-dimensional platform for preoperative design. The position and orientation of the virtual information and the real world were adjusted by an image registration process. AR toolkits were used to realize the integrated image. Afterward, computed tomography was also performed after the operation to compare the preoperative plan with the actual operative outcome. Our AR-based navigation system was successfully used in these patients, directly displaying 3-dimensional navigational information onto the surgical field. All patients achieved a better appearance under the guidance of the navigation image. The differences in interdacryon distance and in the dacryon point on each side were not significant (P > 0.05) between the preoperative plan and the actual surgical outcome. This study reports an effective visualized approach for guiding orbital hypertelorism correction. Our AR-based navigation system may lay a foundation for craniofacial surgery navigation. AR technology could be considered a helpful tool for precise osteotomy in craniofacial surgery.
van Oosterom, Matthias N; van der Poel, Henk G; Navab, Nassir; van de Velde, Cornelis J H; van Leeuwen, Fijs W B
2018-03-01
To provide an overview of the developments made in virtual- and augmented-reality navigation procedures for urological interventions/surgery. Navigation efforts have demonstrated potential in the field of urology by supporting guidance for various disorders. The navigation approaches differ between individual indications, but seem interchangeable to a certain extent. An increasing number of pre- and intra-operative imaging modalities has been used to create detailed surgical roadmaps, namely (cone-beam) computed tomography, MRI, ultrasound, and single-photon emission computed tomography. Registration of these surgical roadmaps with the real-life surgical view has occurred in different forms (e.g. electromagnetic, mechanical, vision, or near-infrared optical-based), whereby the combination of approaches was suggested to provide superior outcomes. Soft-tissue deformations demand the use of confirmatory interventional (imaging) modalities. This has resulted in the introduction of new intraoperative modalities such as drop-in ultrasound, transurethral ultrasound, (drop-in) gamma probes and fluorescence cameras. These noninvasive modalities provide an alternative to invasive technologies that expose patients to X-ray doses. Whereas some reports have indicated that navigation setups provide equal or better results than conventional approaches, most trials have been performed in relatively small patient groups and clear follow-up data are missing. The reported computer-assisted surgery research concepts provide a glimpse into the future application of navigation technologies in the field of urology.
The potential of virtual reality and gaming to assist successful aging with disability.
Lange, B S; Requejo, P; Flynn, S M; Rizzo, A A; Valero-Cuevas, F J; Baker, L; Winstein, C
2010-05-01
Using the advances in computing power, software and hardware technologies, virtual reality (VR), and gaming applications have the potential to address clinical challenges for a range of disabilities. VR-based games can potentially provide the ability to assess and augment cognitive and motor rehabilitation under a range of stimulus conditions that are not easily controllable and quantifiable in the real world. This article discusses an approach for maximizing function and participation for those aging with and into a disability by combining task-specific training with advances in VR and gaming technologies to enable positive behavioral modifications for independence in the home and community. There is potential for the use of VR and game applications for rehabilitating, maintaining, and enhancing those processes that are affected by aging with and into disability, particularly the need to attain a balance in the interplay between sensorimotor function and cognitive demands and to reap the benefits of task-specific training and regular physical activity and exercise.
Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan
2015-01-01
Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of a real exercise within a virtual environment alters sense of presence perception, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence. PMID:26366305
Potential perils of peri-Pokémon perambulation: the dark reality of augmented reality?
Joseph, Bellal; Armstrong, David G.
2016-01-01
Recently, the layering of augmented reality information on top of smartphone applications has created unprecedented user engagement and popularity. One augmented reality-based entertainment application, Pokémon Go (Pokémon Company, Tokyo, Japan), has become the most rapidly downloaded application in history. This technology holds tremendous promise to promote ambulatory activity. However, there exists an obvious potential for distraction-related morbidity. We report two cases, presenting simultaneously to our trauma center, with injuries sustained secondary to gameplay with this augmented reality-based application. PMID:27713831
Augmented Reality as a Countermeasure for Sleep Deprivation.
Baumeister, James; Dorrlan, Jillian; Banks, Siobhan; Chatburn, Alex; Smith, Ross T; Carskadon, Mary A; Lushington, Kurt; Thomas, Bruce H
2016-04-01
Sleep deprivation is known to have serious deleterious effects on executive functioning and job performance. Augmented reality has the ability to place pertinent information at the fore, guiding visual focus and reducing instructional complexity. This paper presents a study exploring how spatial augmented reality instructions impact procedural task performance in sleep-deprived users. The user study examined performance on a procedural task at six time points over the course of a night of total sleep deprivation. Tasks were provided either by spatial augmented reality-based projections or on an adjacent monitor. The results indicate that participant errors increased significantly in the monitor condition when sleep deprived, whereas in the augmented reality condition neither errors nor completion time increased significantly. The results of our study show that spatial augmented reality is an effective sleep deprivation countermeasure under laboratory conditions.
D3D augmented reality imaging system: proof of concept in mammography.
Douglas, David B; Petricoin, Emanuel F; Liotta, Lance; Wilson, Eugene
2016-01-01
The purpose of this article is to present images of simulated breast microcalcifications and assess the pattern of the microcalcifications with a technical development called "depth 3-dimensional (D3D) augmented reality". A computer, head display unit, joystick, D3D augmented reality software, and an in-house script of simulated data of breast microcalcifications in a ductal distribution were used. No patient data were used and no statistical analysis was performed. The D3D augmented reality system demonstrated stereoscopic depth perception by presenting a unique image to each eye, focal point convergence, head position tracking, a 3D cursor, and joystick fly-through. The D3D augmented reality imaging system offers image viewing with depth perception and focal point convergence. The D3D augmented reality system should be tested to determine its utility in clinical practice.
ERIC Educational Resources Information Center
Martín Gutiérrez, Jorge; Meneses Fernández, María Dolores
2014-01-01
This paper explores educational and professional uses of augmented learning environment concerned with issues of training and entertainment. We analyze the state-of-art research of some scenarios based on augmented reality. Some examples for the purpose of education and simulation are described. These applications show that augmented reality can…
Visualizing UAS-collected imagery using augmented reality
NASA Astrophysics Data System (ADS)
Conover, Damon M.; Beidleman, Brittany; McAlinden, Ryan; Borel-Donohue, Christoph C.
2017-05-01
One of the areas where augmented reality will have an impact is in the visualization of 3-D data. 3-D data has traditionally been viewed on a 2-D screen, which has limited its utility. Augmented reality head-mounted displays, such as the Microsoft HoloLens, make it possible to view 3-D data overlaid on the real world. This allows a user to view and interact with the data in ways similar to how they would interact with a physical 3-D object, such as moving, rotating, or walking around it. A type of 3-D data that is particularly useful for military applications is geo-specific 3-D terrain data, and the visualization of this data is critical for training, mission planning, intelligence, and improved situational awareness. Advances in Unmanned Aerial Systems (UAS), photogrammetry software, and rendering hardware have drastically reduced the technological and financial obstacles to collecting aerial imagery and generating 3-D terrain maps from that imagery. Because of this, there is an increased need to develop new tools for the exploitation of 3-D data. We will demonstrate how the HoloLens can be used as a tool for visualizing 3-D terrain data. We will describe: 1) how UAS-collected imagery is used to create 3-D terrain maps, 2) how those maps are deployed to the HoloLens, 3) how a user can view and manipulate the maps, and 4) how multiple users can view the same virtual 3-D object at the same time.
Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds
Wright, W. Geoffrey
2014-01-01
Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually-depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not, remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed. PMID:24782724
Virtual reality simulators and training in laparoscopic surgery.
Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos
2015-01-01
Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, relative evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to be different from the effect of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance in the context of advanced surgical procedures, on standardization of training, on the possible synergistic effect of virtual reality simulation training combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Strzys, M. P.; Kapp, S.; Thees, M.; Klein, P.; Lukowicz, P.; Knierim, P.; Schmidt, A.; Kuhn, J.
2018-05-01
Fundamental concepts of thermodynamics rely on abstract physical quantities such as energy, heat and entropy, which play an important role in the process of interpreting thermal phenomena and statistical mechanics. However, these quantities are not covered by human visual perception, and since heat sensation is purely qualitative and easy to deceive, an intuitive understanding is often lacking. Today, immersive technologies such as head-mounted displays of the newest generation, especially HoloLens, allow for high-quality augmented reality learning experiences that can overcome this gap in human perception by presenting different representations of otherwise invisible quantities directly in the user's field of view on the experimental apparatus, which simultaneously avoids a split-attention effect. In a mixed reality (MR) scenario as presented in this paper, which we call a holo.lab, human perception can be extended to the thermal regime by presenting false-color representations of the temperature of objects as a virtual augmentation directly on the real object itself in real time. Direct feedback on the experimental actions of the users in the form of different representations allows for immediate comparison to theoretical principles and predictions, and is therefore expected to intensify theory–experiment interactions and to increase students' conceptual understanding. We tested this technology for an experiment on thermal conduction in metals in the framework of undergraduate laboratories. A pilot study with treatment and control groups (N = 59) showed a small positive effect of MR on students' performance measured with a standardized concept test for thermodynamics, pointing to an improvement in the understanding of the underlying physical concepts. These findings indicate that complex experiments could benefit even more from augmentation. This motivates us to enrich further experiments with MR.
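A toy false-color mapping of the kind described above might look like the following sketch; the temperature range and blue-to-red colormap are arbitrary assumptions, and the actual holo.lab overlay is rendered on the HoloLens rather than in Python.

```python
import numpy as np

def temperature_to_false_color(temps_c, t_min=20.0, t_max=80.0):
    """
    Map a temperature field (deg C) to a simple blue->red false-color image.
    A minimal illustration of the overlay idea, not the holo.lab pipeline.
    """
    t = np.clip((np.asarray(temps_c) - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.empty(t.shape + (3,))
    rgb[..., 0] = t          # red grows with temperature
    rgb[..., 1] = 0.2        # small constant green component
    rgb[..., 2] = 1.0 - t    # blue fades with temperature
    return rgb

# Toy 1D temperature profile along a metal rod during thermal conduction
rod_temps = np.linspace(75.0, 25.0, 8)      # hot end -> cold end
print(np.round(temperature_to_false_color(rod_temps), 2))
```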
Learning Rationales and Virtual Reality Technology in Education.
ERIC Educational Resources Information Center
Chiou, Guey-Fa
1995-01-01
Defines and describes virtual reality technology and differentiates between virtual learning environment, learning material, and learning tools. Links learning rationales to virtual reality technology to pave conceptual foundations for application of virtual reality technology education. Constructivism, case-based learning, problem-based learning,…
Virtual reality for emergency training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altinkemer, K.
1995-12-31
Virtual reality is a sequence of scenes generated by a computer in response to the five different senses: sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance, pheromonal, and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and the management of natural disasters, including earthquakes, floods, tornados and other situations that are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. In order for virtual reality to be a successful training tool, the design needs to address certain aspects, such as how real virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer and the appropriate training session length(s).
Kong, Seong-Ho; Haouchine, Nazim; Soares, Renato; Klymchenko, Andrey; Andreiuk, Bohdan; Marques, Bruno; Shabat, Galyna; Piechaud, Thierry; Diana, Michele; Cotin, Stéphane; Marescaux, Jacques
2017-07-01
Augmented reality (AR) is the fusion of computer-generated and real-time images. AR can be used in surgery as a navigation tool, by creating a patient-specific virtual model through 3D software manipulation of DICOM imaging (e.g., CT scan). The virtual model can be superimposed onto real-time images, enabling transparency visualization of internal anatomy and accurate localization of tumors. However, the 3D model is rigid and does not take into account inner structures' deformations. We present a concept of automated AR registration, while the organs undergo deformation during surgical manipulation, based on finite element modeling (FEM) coupled with optical imaging of fluorescent surface fiducials. Two 10 × 1 mm wires (pseudo-tumors) and six 10 × 0.9 mm fluorescent fiducials were placed in ex vivo porcine kidneys (n = 10). Biomechanical FEM-based models were generated from CT scan. Kidneys were deformed and the shape changes were identified by tracking the fiducials, using a near-infrared optical system. The changes were registered automatically with the virtual model, which was deformed accordingly. Accuracy of prediction of pseudo-tumors' location was evaluated with a CT scan in the deformed status (ground truth). In vivo: fluorescent fiducials were inserted under ultrasound guidance in the kidney of one pig, followed by a CT scan. The FEM-based virtual model was superimposed on laparoscopic images by automatic registration of the fiducials. Biomechanical models were successfully generated and accurately superimposed on optical images. The mean measured distance between the estimated tumor by biomechanical propagation and the scanned tumor (ground truth) was 0.84 ± 0.42 mm. All fiducials were successfully placed in the in vivo kidney and well visualized in near-infrared mode, enabling accurate automatic registration of the virtual model on the laparoscopic images. Our preliminary experiments showed the potential of a biomechanical model with fluorescent fiducials to propagate the deformation of solid organs' surface to their inner structures, including tumors, with good accuracy and automated, robust tracking.
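The deformation-propagation idea can be sketched with a generic linear finite-element update: displacements measured at fiducial-attached surface nodes are imposed as constraints, and the remaining node displacements (including those of an embedded pseudo-tumor) follow from the stiffness matrix. This is a simplified, assumption-laden illustration (toy stiffness matrix, linear elasticity, no external loads), not the authors' pipeline.

```python
import numpy as np

def propagate_deformation(K, prescribed_idx, prescribed_u, n_dof):
    """
    Propagate tracked surface-fiducial displacements through a linear FEM mesh.
    K is the (n_dof x n_dof) global stiffness matrix (assumed precomputed from
    the CT-derived mesh); prescribed_idx/prescribed_u are the DOF indices and
    displacement values at nodes attached to fiducials.
    Solves K_ff u_f = -K_fc u_c for the free DOFs (no external loads assumed).
    """
    free_idx = np.setdiff1d(np.arange(n_dof), prescribed_idx)
    K_ff = K[np.ix_(free_idx, free_idx)]
    K_fc = K[np.ix_(free_idx, prescribed_idx)]

    u = np.zeros(n_dof)
    u[prescribed_idx] = prescribed_u
    u[free_idx] = np.linalg.solve(K_ff, -K_fc @ prescribed_u)
    return u  # full displacement field, including interior (tumor) nodes

# Toy example: 6 DOFs, synthetic symmetric positive-definite "stiffness" matrix
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
K = A @ A.T + 6 * np.eye(6)
u = propagate_deformation(K, prescribed_idx=np.array([0, 1]),
                          prescribed_u=np.array([0.5, -0.2]), n_dof=6)
print(np.round(u, 3))
```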
NASA Astrophysics Data System (ADS)
Kim, Seoksoo; Jung, Sungmo; Song, Jae-Gu; Kang, Byong-Ho
As augmented reality and gravity sensors attract growing interest, significant development is being made in the related technologies, allowing their application in a variety of areas with greater expectations. Applying context-awareness to augmented reality makes it possible to build useful programs. The training system suggested in this study helps a user understand an efficient training method using augmented reality and confirm whether an exercise is being performed properly, based on the data collected by a gravity sensor. Therefore, this research aims to suggest an efficient training environment that can enhance previous training methods by applying augmented reality and a gravity sensor.
Augmented reality in neurosurgery.
Tagaytayan, Raniel; Kelemen, Arpad; Sik-Lanyi, Cecilia
2018-04-01
Neurosurgery is a medical specialty that relies heavily on imaging. The use of computed tomography and magnetic resonance images during preoperative planning and intraoperative surgical navigation is vital to the success of the surgery and positive patient outcome. Augmented reality application in neurosurgery has the potential to revolutionize and change the way neurosurgeons plan and perform surgical procedures in the future. Augmented reality technology is currently commercially available for neurosurgery for simulation and training. However, the use of augmented reality in the clinical setting is still in its infancy. Researchers are now testing augmented reality system prototypes to determine and address the barriers and limitations of the technology before it can be widely accepted and used in the clinical setting.
Generating classes of 3D virtual mandibles for AR-based medical simulation.
Hippalgaonkar, Neha R; Sider, Alexa D; Hamza-Lup, Felix G; Santhanam, Anand P; Jaganathan, Bala; Imielinska, Celina; Rolland, Jannick P
2008-01-01
Simulation and modeling represent promising tools for several application domains, from engineering to forensic science and medicine. Advances in 3D imaging technology have brought paradigms such as augmented reality (AR) and mixed reality into promising simulation tools for the training industry. Motivated by the requirement to superimpose anatomically correct 3D models on a human patient simulator (HPS) and visualize them in an AR environment, the purpose of this research effort was to develop and validate a method for scaling a source human mandible to a target human mandible within a 2 mm root mean square (RMS) error. Results show that, given the distance between the same two landmarks on two different mandibles, a relative scaling factor may be computed. Using this scaling factor, results show that a 3D virtual mandible model can be made morphometrically equivalent to a real target-specific mandible within a 1.30 mm RMS error. The virtual mandible may further be used as a reference target for registering other anatomic models, such as the lungs, on the HPS. Such registration will be made possible by the physical constraints between the mandible and the spinal column in the horizontal normal rest position.
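A minimal numerical sketch of the landmark-based scaling and RMS validation described above (the landmark choices and coordinate values are hypothetical, and the published method may differ in detail):

```python
import numpy as np

def scale_source_to_target(source_pts, source_landmarks, target_landmarks):
    """
    Scale a source mandible mesh so it is morphometrically comparable to a
    target mandible, using the distance between the same two anatomical
    landmarks on each mesh (a simplified reading of the approach above).
    """
    d_src = np.linalg.norm(source_landmarks[1] - source_landmarks[0])
    d_tgt = np.linalg.norm(target_landmarks[1] - target_landmarks[0])
    s = d_tgt / d_src                                   # relative scaling factor
    # Scale about the first landmark so the two meshes coincide at that point
    scaled = target_landmarks[0] + s * (source_pts - source_landmarks[0])
    return scaled, s

def rms_error(pts_a, pts_b):
    """RMS of point-to-point distances between two corresponded point sets (mm)."""
    return np.sqrt(np.mean(np.sum((pts_a - pts_b) ** 2, axis=1)))

# Hypothetical corresponded vertices (mm) and two shared landmarks per mandible
source = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0], [20.0, 30.0, 5.0]])
target = 1.1 * source                                   # target is ~10% larger
scaled, s = scale_source_to_target(source, source[[0, 1]], target[[0, 1]])
print(f"scaling factor {s:.2f}, RMS error {rms_error(scaled, target):.3f} mm")
```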
NASA Astrophysics Data System (ADS)
Lee, Wendy
The advent of multisensory display systems, such as virtual and augmented reality, has fostered a new relationship between humans and space. Not only can these systems mimic real-world environments, they have the ability to create a new space typology made solely of data. In these spaces, two-dimensional information is displayed in three dimensions, requiring human senses to be used to understand virtual, attention-based elements. Studies in the field of big data have predominantly focused on visual representations and extraction of information, with little focus on sound. The goal of this research is to evaluate the most efficient methods of perceptually extracting visual data using auditory stimuli in immersive environments. Using Rensselaer's CRAIVE-Lab, a virtual reality space with 360-degree panoramic visuals and an array of 128 loudspeakers, participants were asked questions based on complex visual displays using a variety of auditory cues ranging from sine tones to camera shutter sounds. Analysis of the speed and accuracy of participant responses revealed that auditory cues that were more favorable for localization and were positively perceived were best for data extraction and could help create more user-friendly systems in the future.
Application of Virtual and Augmented reality to geoscientific teaching and research.
NASA Astrophysics Data System (ADS)
Hodgetts, David
2017-04-01
The geological sciences are an ideal candidate for the application of Virtual Reality (VR) and Augmented Reality (AR). Digital data collection techniques such as laser scanning, digital photogrammetry and the increasing use of Unmanned Aerial Vehicle (UAV) or Small Unmanned Aircraft (SUA) technology allow us to collect large datasets efficiently and ever more affordably. This, linked with the recent resurgence in VR and AR technologies, makes these 3D digital datasets even more valuable. These advances in VR and AR have been further supported by rapid improvements in graphics card technologies and by the development of high-performance software applications to support them. Visualising data in VR is more complex than normal 3D rendering: consideration needs to be given to latency, frame rate and the comfort of the viewer to enable reasonably long immersion times. Each frame has to be rendered from two viewpoints (one for each eye), requiring twice the rendering of normal monoscopic views. Any unnatural effects (e.g. incorrect lighting) can lead to an uncomfortable VR experience, so these have to be minimised. With large digital outcrop datasets comprising tens to hundreds of millions of triangles this is challenging but achievable. Apart from the obvious "wow factor" of VR, there are some serious applications. It is often the case that users of digital outcrop data do not appreciate the size of the features they are dealing with. This is not the case when using correctly scaled VR, and a true sense of scale can be achieved. In addition, VR provides an excellent way of performing quality control on 3D models and interpretations, as errors are much more easily visible. VR models can then be used to create content for AR applications, closing the loop and taking interpretations back into the field.
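To make the "two viewpoints per frame" cost concrete, the sketch below builds one view matrix per eye from a head position and an interpupillary distance. The pose, IPD value and simple toe-in setup are illustrative assumptions and are not tied to any particular VR engine (engines typically use parallel frusta with asymmetric projections).

```python
import numpy as np

def look_at(eye, target, up):
    """Right-handed look-at view matrix (world -> camera)."""
    f = target - eye; f /= np.linalg.norm(f)          # forward
    r = np.cross(f, up); r /= np.linalg.norm(r)       # right
    u = np.cross(r, f)                                # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def stereo_view_matrices(head_pos, target, up=np.array([0.0, 1.0, 0.0]), ipd=0.064):
    """
    Build one view matrix per eye by offsetting the head position by half an
    interpupillary distance (ipd, metres) along the camera's right axis --
    the per-frame doubling of rendering work mentioned above (toe-in stereo).
    """
    f = target - head_pos; f /= np.linalg.norm(f)
    r = np.cross(f, up); r /= np.linalg.norm(r)
    left = look_at(head_pos - 0.5 * ipd * r, target, up)
    right = look_at(head_pos + 0.5 * ipd * r, target, up)
    return left, right

left, right = stereo_view_matrices(np.array([0.0, 1.7, 2.0]), np.array([0.0, 1.0, 0.0]))
print(np.round(left, 2)); print(np.round(right, 2))
```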
Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?
Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.
2007-01-01
Background: Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects while using video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods: The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators scored on a 5-point Likert scale. The participants were allocated to 3 groups depending on their experience: experts, intermediates and novices. Significant differences were calculated with the paired t-test. Results: There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 vs. 2.18, P < 0.000) and the suturing task (mean 4.15 vs. 1.85, P < 0.000). The ProMIS is regarded as having better haptic feedback (mean 3.92 vs. 1.92, P < 0.000) and as being more useful for training surgical residents (mean 4.51 vs. 2.94, P < 0.000). Conclusions: In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, using remote ship scenarios and automation of ship operations.
Virtual Reality and the Virtual Library.
ERIC Educational Resources Information Center
Oppenheim, Charles
1993-01-01
Explains virtual reality, including proper and improper uses of the term, and suggests ways that libraries might be affected by it. Highlights include elements of virtual reality systems; possible virtual reality applications, including architecture, the chemical industry, transport planning, armed forces, and entertainment; and the virtual…
Registration using natural features for augmented reality systems.
Yuan, M L; Ong, S K; Nee, A Y C
2006-01-01
Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix is used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration in either indoor or outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method to estimate the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process. Some indoor and outdoor experiments have been conducted to validate the performance of the proposed method.
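As a rough illustration of the tracking-plus-estimation loop this abstract describes, the sketch below uses OpenCV's KLT tracker and a robust homography fit (which OpenCV refines internally with Levenberg-Marquardt) as a simplified planar stand-in for the paper's projective-reconstruction step; the frame sizes, thresholds and synthetic test frames are assumptions, not values from the paper.

```python
import cv2
import numpy as np

def track_and_register(prev_gray, cur_gray, prev_pts):
    """Track natural features with KLT and estimate a planar registration.

    A simplified, planar (homography) analogue of the projective
    registration described above; thresholds are illustrative.
    """
    # KLT (Kanade-Lucas-Tomasi) sparse optical flow
    cur_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, cur_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    good_prev = prev_pts[status.ravel() == 1]
    good_cur = cur_pts[status.ravel() == 1]
    if len(good_cur) < 6:                      # need enough natural features
        return None, good_cur
    # Robust estimate; inliers are refined with Levenberg-Marquardt by OpenCV
    H, _ = cv2.findHomography(good_prev, good_cur, cv2.RANSAC, 3.0)
    return H, good_cur

if __name__ == "__main__":
    # Synthetic pair of frames: a textured image shifted by a few pixels
    rng = np.random.default_rng(0)
    prev = (rng.random((240, 320)) * 255).astype(np.uint8)
    cur = np.roll(prev, shift=(2, 3), axis=(0, 1))
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01,
                                  minDistance=7)
    H, tracked = track_and_register(prev, cur, pts)
    print("estimated registration (expect ~[1 0 3; 0 1 2; 0 0 1]):\n", H)
```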
Virtual Reality
Dailey, James F.
1993-04-01
This paper reviews the field of virtual reality. The author describes the basic concepts of virtual reality and finds that its numerous potential benefits to society could revolutionize everyday life. The various components that make up a virtual reality system are described in detail.
Rothbaum, Barbara Olasov; Price, Matthew; Jovanovic, Tanja; Norrholm, Seth D.; Gerardi, Maryrose; Dunlop, Boadie; Davis, Michael; Bradley, Bekh; Duncan, Erica; Rizzo, Albert “Skip”; Ressler, Kerry J.
2014-01-01
Objective To determine the effectiveness of Virtual Reality Exposure (VRE) augmented with D-cycloserine (50 mg) or alprazolam (0.25 mg), compared to placebo, in reducing PTSD due to military trauma in Iraq and Afghanistan. Method A double-blind, placebo-controlled randomized clinical trial comparing augmentation methods for VRE in subjects (n = 156) with PTSD was conducted. Results PTSD symptoms significantly improved from pre- to post-treatment over the 6-session VRE treatment (p < .001) across all conditions and were maintained at 3, 6, and 12 months follow-up. There were no overall differences between the D-cycloserine and comparison groups on symptoms at any time point. The alprazolam and placebo conditions significantly differed on the post-treatment Clinician Administered PTSD Scale (p = .006) and the 3-month post-treatment PTSD diagnosis, such that the alprazolam group showed greater rates of PTSD (79.2% alprazolam vs. 47.8% placebo). Between-session extinction learning was a treatment-specific enhancer of outcome for the D-cycloserine group only (p < .005). At post-treatment, the D-cycloserine group was the lowest on cortisol reactivity (p < .05) and startle response during VR scenes (p < .05). Conclusions A small number of VRE sessions was associated with reduced PTSD diagnosis and symptoms in Iraq/Afghanistan veterans, although there was no control condition for the VRE. Overall, there was no advantage of D-cycloserine on PTSD symptoms in primary analyses. In secondary analyses, benzodiazepine use during treatment may impair recovery, and D-cycloserine may enhance VRE in patients who demonstrate within-session learning. D-cycloserine augmentation in PTSD patients may reduce cortisol and startle reactivity compared to alprazolam and placebo treatment, consistent with the animal literature. PMID:24743802
SmartG: Spontaneous Malaysian Augmented Reality Tourist Guide
NASA Astrophysics Data System (ADS)
Kasinathan, Vinothini; Mustapha, Aida; Subramaniam, Tanabalan
2016-11-01
In an effort to attract higher tourist expenditure along with higher tourist arrivals, this paper proposes a travel application called SmartG, an acronym for Spontaneous Malaysian Augmented Reality Tourist Guide, which operates by making recommendations to the user based on the travel objective and individual budget constraints. The application relies on augmented reality technology, whereby a three-dimensional model is presented to the user based on input from the real-world environment. User testing returned favorable feedback on the concept of using augmented reality to promote Malaysian tourism.
Badiali, Giovanni; Ferrari, Vincenzo; Cutolo, Fabrizio; Freschi, Cinzia; Caramella, Davide; Bianchi, Alberto; Marchetti, Claudio
2014-12-01
We present a newly designed, localiser-free, head-mounted system featuring augmented reality as an aid to maxillofacial bone surgery, and assess the potential utility of the device by conducting a feasibility study and validation. Our head-mounted wearable system facilitating augmented surgery was developed as a stand-alone, video-based, see-through device in which the visual features were adapted to facilitate maxillofacial bone surgery. We implemented a strategy designed to present augmented reality information to the operating surgeon. LeFort1 osteotomy was chosen as the test procedure. The system is designed to exhibit virtual planning overlaying the details of a real patient. We implemented a method allowing performance of waferless, augmented-reality assisted bone repositioning. In vitro testing was conducted on a physical replica of a human skull, and the augmented reality system was used to perform LeFort1 maxillary repositioning. Surgical accuracy was measured with the aid of an optical navigation system that recorded the coordinates of three reference points (located in anterior, posterior right, and posterior left positions) on the repositioned maxilla. The outcomes were compared with those expected to be achievable in a three-dimensional environment. Data were derived using three levels of surgical planning, of increasing complexity, and for nine different operators with varying levels of surgical skill. The mean error was 1.70 ± 0.51 mm. The axial errors were 0.89 ± 0.54 mm on the sagittal axis, 0.60 ± 0.20 mm on the frontal axis, and 1.06 ± 0.40 mm on the craniocaudal axis. The simplest plan was associated with a slightly lower mean error (1.58 ± 0.37 mm) compared with the more complex plans (medium: 1.82 ± 0.71 mm; difficult: 1.70 ± 0.45 mm). The mean error for the anterior reference point was lower (1.33 ± 0.58 mm) than those for both the posterior right (1.72 ± 0.24 mm) and posterior left points (2.05 ± 0.47 mm). No significant difference in terms of error was noticed among operators, despite variations in surgical experience. Feedback from surgeons was acceptable; all tests were completed within 15 min and the tool was considered to be both comfortable and usable in practice. We used a new localiser-free, head-mounted, wearable, stereoscopic, video see-through display to develop a useful strategy affording surgeons access to augmented reality information. Our device appears to be accurate when used to assist in waferless maxillary repositioning. Our results suggest that the method can potentially be extended for use with many surgical procedures on the facial skeleton. Further, our positive results suggest that it would be appropriate to proceed to in vivo testing to assess surgical accuracy under real clinical conditions. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Digital Documentation and Archiving Low Cost: la Habana Vieja in Cuba
NASA Astrophysics Data System (ADS)
Morganti, C.; Bartolomei, C.
2017-11-01
This article deepens the subject of photo-modelling applied to architecture at medium and large scales, and shows the possibilities of applying the latest augmented reality and virtual reality technologies to the historical and architectural context of Havana City in Cuba. The context was quite unsuitable for our project for several complex reasons: the need to minimize the size, weight, and cost of the tools; the need to minimize the time spent on survey and photographic shooting on site; the difficulties posed by the continuous presence of a chaotic influx of people disturbing the work; and, not least, the limited number of daily hours available to carry out photographic shots that require special lighting conditions. This article describes the steps necessary to obtain a textured 3D model of reality through a photographic set.
[VR and AR Applications in Medical Practice and Education].
Hsieh, Min-Chai; Lin, Yu-Hsuan
2017-12-01
As technology advances, mobile devices have gradually turned into wearable devices. Furthermore, virtual reality (VR), augmented reality (AR), and mixed reality (MR) are being increasingly applied in medical fields such as medical education and training, surgical simulation, neurological rehabilitation, psychotherapy, and telemedicine. Research results demonstrate the ability of VR, AR, and MR to ameliorate the inconveniences that are often associated with traditional medical care, reduce incidents of medical malpractice caused by unskilled operations, and reduce the cost of medical education and training. What is more, the application of these technologies has enhanced the effectiveness of medical education and training, raised the level of diagnosis and treatment, improved the doctor-patient relationship, and boosted the efficiency of medical execution. The present study introduces VR, AR, and MR applications in medical practice and education with the aim of helping health professionals better understand the applications and use these technologies to improve the quality of medical care.
Applied Augmented Reality for High Precision Maintenance
NASA Astrophysics Data System (ADS)
Dever, Clark
Augmented Reality had a major consumer breakthrough this year with Pokemon Go. The underlying technologies that made that app a success with gamers can be applied to improve the efficiency and efficacy of workers. This session will explore some of the use cases for augmented reality in an industrial environment. In doing so, the environmental impacts and human factors that must be considered will be explored. Additionally, the sensors, algorithms, and visualization techniques used to realize augmented reality will be discussed. The benefits of augmented reality solutions in industrial environments include automated data recording, improved quality assurance, reduction in training costs and improved mean-time-to-resolution. As technology continues to follow Moore's law, more applications will become feasible as performance-per-dollar increases across all system components.
Rodríguez-Tizcareño, Mario H; Barajas, Lizbeth; Pérez-Gásque, Marisol; Gómez, Salvador
2012-06-01
This report presents a protocol used to transfer virtual treatment plan data to the surgical and prosthetic reality, and describes its clinical application: bone site augmentation with computer-custom-milled bovine bone graft blocks shaped to their ideal architectural form, implant insertion based on image-guided stent fabrication, and the restorative manufacturing process, using computed tomography-based software programs, navigation systems, and computer-aided design and manufacturing techniques for the treatment of the edentulous maxilla.
Virtual Reality and Its Potential Application in Education and Training.
ERIC Educational Resources Information Center
Milheim, William D.
1995-01-01
An overview is provided of current trends in virtual reality research and development, including discussion of hardware, types of virtual reality, and potential problems with virtual reality. Implications for education and training are explored. (Author/JKP)
Davis, Matthew Christopher; Can, Dang D; Pindrik, Jonathan; Rocque, Brandon G; Johnston, James M
2016-02-01
Technology allowing a remote, experienced surgeon to provide real-time guidance to local surgeons has great potential for training and capacity building in medical centers worldwide. Virtual interactive presence and augmented reality (VIPAR), an iPad-based tool, allows surgeons to provide long-distance, virtual assistance wherever a wireless internet connection is available. Local and remote surgeons view a composite image of video feeds at each station, allowing for intraoperative telecollaboration in real time. Local and remote stations were established in Ho Chi Minh City, Vietnam, and Birmingham, Alabama, as part of ongoing neurosurgical collaboration. Endoscopic third ventriculostomy with choroid plexus coagulation with VIPAR was used for subjective and objective evaluation of system performance. VIPAR allowed both surgeons to engage in complex visual and verbal communication during the procedure. Analysis of 5 video clips revealed video delay of 237 milliseconds (range, 93-391 milliseconds) relative to the audio signal. Excellent image resolution allowed the remote neurosurgeon to visualize all critical anatomy. The remote neurosurgeon could gesture to structures with no detectable difference in accuracy between stations, allowing for submillimeter precision. Fifteen endoscopic third ventriculostomy with choroid plexus coagulation procedures have been performed with the use of VIPAR between Vietnam and the United States, with no significant complications. 80% of these patients remain shunt-free. Evolving technologies that allow long-distance, intraoperative guidance, and knowledge transfer hold great potential for highly efficient international neurosurgical education. VIPAR is one example of an inexpensive, scalable platform for increasing global neurosurgical capacity. Efforts to create a network of Vietnamese neurosurgeons who use VIPAR for collaboration are underway. Copyright © 2016 Elsevier Inc. All rights reserved.
[Display technologies for augmented reality in medical applications].
Eck, Ulrich; Winkler, Alexander
2018-04-01
One of the main challenges for modern surgery is the effective use of the many available imaging modalities and diagnostic methods. Augmented reality systems can be used in the future to blend patient and planning information into the view of surgeons, which can improve the efficiency and safety of interventions. In this article we present five visualization methods for integrating augmented reality displays into medical procedures and explain their advantages and disadvantages. Based on an extensive literature review, the various existing approaches for integrating augmented reality displays into medical procedures are divided into five categories and the most important research results for each approach are presented. A large number of mixed and augmented reality solutions for medical interventions have been developed as research prototypes; however, only very few systems have been tested on patients. In order to integrate mixed and augmented reality displays into medical practice, highly specialized solutions need to be developed. Such systems must comply with requirements regarding accuracy, fidelity, ergonomics and seamless integration into the surgical workflow.
A Virtual Reality-Based Simulation of Abdominal Surgery
1994-06-30
SHORT TITLE: A Virtual Reality-Based Simulation of Abdominal Surgery. REPORTING PERIOD: October 31, 1993 - June 30, 1994. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, and simulate…
Invisible waves and hidden realms: augmented reality and experimental art
NASA Astrophysics Data System (ADS)
Ruzanka, Silvia
2012-03-01
Augmented reality is a way of both altering the visible and revealing the invisible. It offers new opportunities for artistic exploration through virtual interventions in real space. In this paper, the author describes the implementation of two art installations using different AR technologies, one using optical marker tracking on mobile devices and one integrating stereoscopic projections into the physical environment. The first artwork, De Ondas y Abejas (The Waves and the Bees), is based on the widely publicized (but unproven) hypothesis of a link between cellphone radiation and the phenomenon of bee colony collapse disorder. Using an Android tablet, viewers search out small fiducial markers in the shape of electromagnetic waves hidden throughout the gallery, which reveal swarms of bees scattered on the floor. The piece also creates a generative soundscape based on electromagnetic fields. The second artwork, Urban Fauna, is a series of animations in which features of the urban landscape become plants and animals. Surveillance cameras become flocks of birds while miniature cellphone towers, lampposts, and telephone poles grow like small seedlings in time-lapse animation. The animations are presented as small stereoscopic projections, integrated into the physical space of the gallery. These two pieces explore the relationship between nature and technology through the visualization of invisible forces and hidden alternate realities.
3D Tracking Based Augmented Reality for Cultural Heritage Data Management
NASA Astrophysics Data System (ADS)
Battini, C.; Landi, G.
2015-02-01
The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations; augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation, its conservation and during the restoration processes. The application of digital-imaging solutions for various feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks, allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving and complex three-dimensional models can be interactively visualised and explored on applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that will allow interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.
Virtual Reality as Innovative Approach to the Interior Designing
NASA Astrophysics Data System (ADS)
Kaleja, Pavol; Kozlovská, Mária
2017-06-01
We can observe significant potential for information and communication technologies (ICT) in the field of interior design, driven by the development of software and hardware virtual reality tools. Using ICT tools offers a realistic perception of a proposal at its initial idea (study) stage. A group of real-time visualization tools, supported by hardware such as the Oculus Rift and HTC Vive, provides free walkthrough and movement in a virtual interior with the possibility of virtual designing. By improving ICT software tools for designing in virtual reality, we can achieve an ever more realistic virtual environment. The contribution presents a proposal for an innovative approach to interior design in virtual reality, using the latest software and hardware ICT virtual reality technologies.
The Local Games Lab ABQ: Homegrown Augmented Reality
ERIC Educational Resources Information Center
Holden, Christopher
2014-01-01
Experiments in the use of augmented reality games formerly required extensive material resources and expertise to implement above and beyond what might be possible within the usual educational contexts. Currently, the more common availability of hardware in these contexts and the existence of easy-to-use, general purpose augmented reality design…
The virtues of virtual reality in exposure therapy.
Gega, Lina
2017-04-01
Virtual reality can be more effective and less burdensome than real-life exposure. Optimal virtual reality delivery should incorporate in situ direct dialogues with a therapist, discourage safety behaviours, allow for a mismatch between virtual and real exposure tasks, and encourage self-directed real-life practice between and beyond virtual reality sessions. © The Royal College of Psychiatrists 2017.
Virtual Reality in the Classroom.
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
1993-01-01
Considers the concept of virtual reality; reviews its history; describes general uses of virtual reality, including entertainment, medicine, and design applications; discusses classroom uses of virtual reality, including a software program called Virtus WalkThrough for use with a computer monitor; and suggests future possibilities. (34 references)…
Teaching Basic Field Skills Using Screen-Based Virtual Reality Landscapes
NASA Astrophysics Data System (ADS)
Houghton, J.; Robinson, A.; Gordon, C.; Lloyd, G. E. E.; Morgan, D. J.
2016-12-01
We are using screen-based virtual reality landscapes, created using the Unity 3D game engine, to augment the training geoscience students receive in preparing for fieldwork. Students explore these landscapes as they would real ones, interacting with virtual outcrops to collect data, determine location, and map the geology. Skills for conducting field geological surveys - collecting, plotting and interpreting data; time management and decision making - are introduced interactively and intuitively. As with real landscapes, the virtual landscapes are open-ended terrains with embedded data. This means the game does not structure student interaction with the information: it is through experience that the student learns the best methods to work successfully and efficiently. These virtual landscapes are not replacements for geological fieldwork; rather, they are virtual spaces between classroom and field in which to train and reinforce essential skills. Importantly, these virtual landscapes offer accessible parallel provision for students unable to visit, or fully partake in visiting, the field. The project has received positive feedback from both staff and students. Results show students find it easier to focus on learning these basic field skills in a classroom, rather than field, setting, and that they make the same mistakes as when learning in the field, validating the realistic nature of the virtual experience and providing an opportunity to learn from these mistakes. The approach also saves time, and therefore resources, in the field as basic skills are already embedded. 70% of students report increased confidence with how to map boundaries and 80% have found the virtual training a useful experience. We are also developing landscapes based on real places with 3D photogrammetric outcrops, and a virtual urban landscape in which Engineering Geology students can conduct a site investigation. This project is a collaboration between the University of Leeds and Leeds College of Art, UK, and all our virtual landscapes are freely available online at www.see.leeds.ac.uk/virtual-landscapes/.
Model-based registration of multi-rigid-body for augmented reality
NASA Astrophysics Data System (ADS)
Ikeda, Sei; Hori, Hajime; Imura, Masataka; Manabe, Yoshitsugu; Chihara, Kunihiro
2009-02-01
Geometric registration between a virtual object and the real space is the most basic problem in augmented reality. Model-based tracking methods allow us to estimate the three-dimensional (3-D) position and orientation of a real object by using a textured 3-D model instead of a visual marker. However, it is difficult to apply existing model-based tracking methods to objects that have movable parts, such as the display of a mobile phone, because these methods assume a single, rigid-body model. In this research, we propose a novel model-based registration method for multi-rigid-body objects. For each frame, the 3-D models of each rigid part of the object are first rendered according to the motion and transformation estimated from the previous frame. Second, control points are determined by detecting the edges of the rendered image and sampling pixels on these edges. Motion and transformation are then simultaneously calculated from the distances between the edges and the control points. The validity of the proposed method is demonstrated through experiments using synthetic videos.
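A minimal sketch of one step of the edge-based registration described above, sampling control points on the rendered model edges and measuring their distance to the nearest observed edge, is given below, assuming OpenCV. The per-part motion update (the least-squares step the paper solves) is deliberately omitted, and the synthetic images are illustrative.

```python
import cv2
import numpy as np

def control_point_residuals(rendered_gray, camera_gray, n_points=100):
    """Sample control points on rendered model edges and measure their
    distance (in pixels) to the nearest edge observed in the camera image.

    A simplified step of the edge-based registration described above;
    the motion/transformation update for each rigid part is omitted.
    """
    model_edges = cv2.Canny(rendered_gray, 50, 150)
    image_edges = cv2.Canny(camera_gray, 50, 150)

    # Distance from every pixel to the nearest observed edge
    dist = cv2.distanceTransform(255 - image_edges, cv2.DIST_L2, 3)

    ys, xs = np.nonzero(model_edges)
    if len(xs) == 0:
        return np.array([])
    idx = np.linspace(0, len(xs) - 1, min(n_points, len(xs))).astype(int)
    control_pts = np.stack([xs[idx], ys[idx]], axis=1)

    # Residuals that a Gauss-Newton / least-squares step would minimise
    return dist[control_pts[:, 1], control_pts[:, 0]]

if __name__ == "__main__":
    # Synthetic example: the "rendered" rectangle is offset from the "observed" one
    rendered = np.zeros((120, 160), np.uint8)
    observed = np.zeros((120, 160), np.uint8)
    cv2.rectangle(rendered, (40, 30), (110, 90), 255, 2)
    cv2.rectangle(observed, (44, 33), (114, 93), 255, 2)
    print("mean residual (px):", control_point_residuals(rendered, observed).mean())
```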
Display technologies for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Lee, Seungjae; Jang, Changwon; Hong, Jong-Young; Li, Gang
2018-02-01
By virtue of rapid progress in optics, sensors, and computer science, we are witnessing commercial products and prototypes for augmented reality (AR) penetrating the consumer markets. AR is in the spotlight because it is expected to provide a much more immersive and realistic experience than ordinary displays. However, there are several barriers to be overcome for successful commercialization of AR. Here, we explore challenging and important topics for AR such as image combiners, enhancement of display performance, and focus cue reproduction. Image combiners are essential to integrate virtual images with the real world. Display performance (e.g. field of view and resolution) is important for a more immersive experience, and focus cue reproduction may mitigate visual fatigue caused by the vergence-accommodation conflict. We also demonstrate emerging technologies to overcome these issues: an index-matched anisotropic crystal lens (IMACL), retinal projection displays, and 3D displays with focus cues. For image combiners, a novel optical element called the IMACL provides a relatively wide field of view. Retinal projection displays may enhance the field of view and resolution of AR displays. Focus cues could be reconstructed via multi-layer displays and holographic displays. Experimental results of our prototypes are explained.
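The vergence-accommodation conflict mentioned above can be expressed as the dioptric gap between the distance at which the eyes converge (the virtual object) and the distance at which they must focus (the display's fixed focal plane). A small worked sketch follows, with assumed distances and an illustrative comfort threshold, not values from the abstract.

```python
def dioptres(distance_m):
    """Optical power corresponding to a viewing distance (1 / metres)."""
    return 1.0 / distance_m

def vergence_accommodation_conflict(virtual_object_m, display_focal_m):
    """Dioptric mismatch between vergence and accommodation demand.

    Mismatches much beyond roughly +/-0.5 D are commonly associated with
    visual discomfort (illustrative threshold, not from the abstract).
    """
    return abs(dioptres(virtual_object_m) - dioptres(display_focal_m))

if __name__ == "__main__":
    # Virtual object placed 0.5 m away on a display focused at 2 m (assumed)
    print("%.2f D" % vergence_accommodation_conflict(0.5, 2.0))  # 1.50 D
```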
A review of existing and potential computer user interfaces for modern radiology.
Iannessi, Antoine; Marcy, Pierre-Yves; Clatz, Olivier; Bertrand, Anne-Sophie; Sugimoto, Maki
2018-05-16
The digitalization of modern imaging has led radiologists to become very familiar with computers and their user interfaces (UI). New options for display and command offer expanded possibilities, but the mouse and keyboard remain the most commonly utilized, for usability reasons. In this work, we review and discuss different UI and their possible application in radiology. We consider two-dimensional and three-dimensional imaging displays in the context of interventional radiology, and discuss interest in touchscreens, kinetic sensors, eye detection, and augmented or virtual reality. We show that UI design specifically for radiologists is key for future use and adoption of such new interfaces. Next-generation UI must fulfil professional needs, while considering contextual constraints. • The mouse and keyboard remain the most utilized user interfaces for radiologists. • Touchscreen, holographic, kinetic sensors and eye tracking offer new possibilities for interaction. • 3D and 2D imaging require specific user interfaces. • Holographic display and augmented reality provide a third dimension to volume imaging. • Good usability is essential for adoption of new user interfaces by radiologists.
Anastassova, Margarita; Burkhardt, Jean-Marie
2009-07-01
The paper presents an ergonomic analysis carried out in the early phases of an R&D project. The purpose was to investigate how today's Automotive Service Technicians (ASTs) are trained, in order to inform the design of an Augmented Reality (AR) teaching aid. The first part of the paper presents a literature review of some major problems encountered by ASTs today. The benefits of AR as a technological aid are also introduced. Then, the methodology and results of two case studies are presented. The first study is based on interviews with trainers and trainees; the second on observations in real training settings. The results support the assumption that today's AST training could be regarded as a community of practice (CoP). Therefore, AR could be useful as a collaboration tool, offering a shared virtual representation of real vehicle parts which are normally invisible unless dismantled (e.g. the parts of a hydraulic automatic transmission). We conclude on the methods and technologies to support the automotive CoP.
Lincoln, Peter; Blate, Alex; Singh, Montek; Whitted, Turner; State, Andrei; Lastra, Anselmo; Fuchs, Henry
2016-04-01
We describe an augmented reality, optical see-through display based on a DMD chip with an extremely fast (16 kHz) binary update rate. We combine the techniques of post-rendering 2-D offsets and just-in-time tracking updates with a novel modulation technique for turning binary pixels into perceived gray scale. These processing elements, implemented in an FPGA, are physically mounted along with the optical display elements in a head tracked rig through which users view synthetic imagery superimposed on their real environment. The combination of mechanical tracking at near-zero latency with reconfigurable display processing has given us a measured average of 80 µs of end-to-end latency (from head motion to change in photons from the display) and also a versatile test platform for extremely-low-latency display systems. We have used it to examine the trade-offs between image quality and cost (i.e. power and logical complexity) and have found that quality can be maintained with a fairly simple display modulation scheme.
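As a rough illustration of how fast binary pixels can be made to appear gray, the sketch below uses a per-pixel first-order sigma-delta accumulator to decide which binary subframes are "on". This is a generic scheme shown for illustration, not the modulation technique of the system described above.

```python
import numpy as np

def binary_subframes(gray, n_subframes=16):
    """Convert a gray-scale frame (values in [0, 1]) into a stack of binary
    subframes whose temporal average approximates the input.

    First-order sigma-delta modulation per pixel; illustrative only.
    """
    acc = np.zeros_like(gray, dtype=np.float64)
    frames = np.empty((n_subframes,) + gray.shape, dtype=bool)
    for k in range(n_subframes):
        acc += gray                 # accumulate desired intensity
        on = acc >= 1.0             # fire the pixel once enough has built up
        frames[k] = on
        acc -= on                   # subtract what was emitted
    return frames

if __name__ == "__main__":
    gray = np.array([[0.25, 0.5], [0.75, 1.0]])
    frames = binary_subframes(gray, n_subframes=16)
    print(frames.mean(axis=0))      # close to the requested gray levels
```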
Development of a virtual reality training system for endoscope-assisted submandibular gland removal.
Miki, Takehiro; Iwai, Toshinori; Kotani, Kazunori; Dang, Jianwu; Sawada, Hideyuki; Miyake, Minoru
2016-11-01
Endoscope-assisted surgery has widely been adopted as a basic surgical procedure, with various training systems using virtual reality developed for this procedure. In the present study, a basic training system comprising virtual reality for the removal of submandibular glands under endoscope assistance was developed. The efficacy of the training system was verified in novice oral surgeons. A virtual reality training system was developed using existing haptic devices. Virtual reality models were constructed from computed tomography data to ensure anatomical accuracy. Novice oral surgeons were trained using the developed virtual reality training system. The developed virtual reality training system included models of the submandibular gland and surrounding connective tissues and blood vessels entering the submandibular gland. Cutting or abrasion of the connective tissue and manipulations, such as elevation of blood vessels, were reproduced by the virtual reality system. A training program using the developed system was devised. Novice oral surgeons were trained in accordance with the devised training program. Our virtual reality training system for endoscope-assisted removal of the submandibular gland is effective in the training of novice oral surgeons in endoscope-assisted surgery. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
Virtual reality: past, present and future.
Gobbetti, E; Scateni, R
1998-01-01
This report provides a short survey of the field of virtual reality, highlighting application domains, technological requirements, and currently available solutions. The report is organized as follows: section 1 presents the background and motivation of virtual environment research and identifies typical application domains; section 2 discusses the characteristics a virtual reality system must have in order to exploit the perceptual and spatial skills of users; section 3 surveys current input/output devices for virtual reality; section 4 surveys current software approaches to support the creation of virtual reality systems; and section 5 summarizes the report.
Virtual Reality: Toward Fundamental Improvements in Simulation-Based Training.
ERIC Educational Resources Information Center
Thurman, Richard A.; Mattoon, Joseph S.
1994-01-01
Considers the role and effectiveness of virtual reality in simulation-based training. The theoretical and practical implications of verity, integration, and natural versus artificial interface are discussed; a three-dimensional classification scheme for virtual reality is described; and the relationship between virtual reality and other…
Virtual Reality in Schools: The Ultimate Educational Technology.
ERIC Educational Resources Information Center
Reid, Robert D.; Sykes, Wylmarie
1999-01-01
Discusses the use of virtual reality as an educational tool. Highlights include examples of virtual reality in public schools that lead to a more active learning process, simulated environments, integrating virtual reality into any curriculum, benefits to teachers and students, and overcoming barriers to implementation. (LRW)
LivePhantom: Retrieving Virtual World Light Data to Real Environments.
Kolivand, Hoshang; Billinghurst, Mark; Sunar, Mohd Shahrizal
2016-01-01
To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real time depth detection to exert virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map for the physical scene mixing into a single real-time transparent tacit surface. Once this is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is shown and the findings are assessed drawing upon qualitative and quantitative methods making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.
Combined virtual and real robotic test-bed for single operator control of multiple robots
NASA Astrophysics Data System (ADS)
Lee, Sam Y.-S.; Hunt, Shawn; Cao, Alex; Pandya, Abhilash
2010-04-01
Teams of heterogeneous robots with different dynamics or capabilities could perform a variety of tasks such as multipoint surveillance, cooperative transport, and exploration in hazardous environments. In this study, we work with heterogeneous teams of semi-autonomous ground and aerial robots for contaminant localization. We developed a human interface system which links every real robot to its virtual counterpart. A novel virtual interface has been integrated with Augmented Reality that can monitor the position and sensory information from the video feeds of ground and aerial robots in the 3D virtual environment, improving user situational awareness. An operator can efficiently control the real multi-robot team using the Drag-to-Move method on the virtual multi-robot team. This enables an operator to control groups of heterogeneous robots in a collaborative way, allowing more contaminant sources to be pursued simultaneously. An advanced feature of the virtual interface system is guarded teleoperation, which can be used to prevent operators from accidentally driving multiple robots into walls and other objects. Moreover, the image guidance and tracking feature is able to reduce operator workload.
The need for virtual reality simulators in dental education: A review.
Roy, Elby; Bakr, Mahmoud M; George, Roy
2017-04-01
Virtual reality simulators are becoming an essential part of modern education. The benefits of virtual reality in dentistry are constantly being assessed as a method, or an adjunct, to improve fine motor skills and hand-eye coordination in pre-clinical settings and to overcome the monetary and intellectual challenges involved in such training. This article, while providing an overview of virtual reality dental simulators, also looks at the link between virtual reality simulation and current pedagogical knowledge.
Rodrigues-Baroni, Juliana M; Nascimento, Lucas R; Ada, Louise; Teixeira-Salmela, Luci F
2014-01-01
To systematically review the available evidence on the efficacy of walking training associated with virtual reality-based training in patients with stroke. The specific questions were: Is walking training associated with virtual reality-based training effective in increasing walking speed after stroke? Is this type of intervention more effective in increasing walking speed than non-virtual reality-based walking interventions? A systematic review with meta-analysis of randomized clinical trials was conducted. Participants were adults with chronic stroke and the experimental intervention was walking training associated with virtual reality-based training to increase walking speed. The outcome data regarding walking speed were extracted from the eligible trials and were combined using a meta-analysis approach. Seven trials representing eight comparisons were included in this systematic review. Overall, the virtual reality-based training increased walking speed by 0.17 m/s (95% CI 0.08 to 0.26), compared with placebo/nothing or non-walking interventions. In addition, the virtual reality-based training increased walking speed by 0.15 m/s (95% CI 0.05 to 0.24), compared with non-virtual reality walking interventions. This review provided evidence that walking training associated with virtual reality-based training was effective in increasing walking speed after stroke, and resulted in better results than non-virtual reality interventions.
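The pooling step such a meta-analysis relies on can be sketched as an inverse-variance (fixed-effect) combination of per-trial mean differences; the trial values below are made-up illustrative numbers, not data from this review.

```python
import numpy as np

def fixed_effect_pool(mean_diffs, std_errs):
    """Inverse-variance (fixed-effect) pooled mean difference and 95% CI."""
    md = np.asarray(mean_diffs, dtype=float)
    se = np.asarray(std_errs, dtype=float)
    w = 1.0 / se**2                       # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, ci

if __name__ == "__main__":
    # Hypothetical per-trial mean differences in walking speed (m/s) and SEs
    pooled, ci = fixed_effect_pool([0.12, 0.20, 0.18], [0.05, 0.07, 0.06])
    print("pooled MD = %.2f m/s, 95%% CI %.2f to %.2f" % (pooled, *ci))
```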
NASA Astrophysics Data System (ADS)
Palestini, C.; Basso, A.
2017-11-01
In recent years, increased international investment in hardware and software technology to support programs that adopt algorithms for photomodeling or data management from laser scanners has significantly reduced the cost of operations supporting Augmented Reality and Virtual Reality, which are designed to generate real-time explorable digital environments integrated with virtual stereoscopic headsets. The research analyzes transversal methodologies related to the acquisition of these technologies in order to intervene directly on the adoption of current VR tools within a specific workflow, in light of issues related to the intensive use of such devices, and outlines a quick overview of a possible "virtual migration" phenomenon, assuming integration with new high-speed internet systems capable of triggering a massive colonization of cyberspace that would paradoxically also affect everyday life and, more generally, human spatial perception. The contribution aims at analyzing the application systems used for low-cost 3D photogrammetry by means of a precise pipeline, clarifying how a 3D model is generated, automatically retopologized, textured by color painting or photo-cloning techniques, and optimized for parametric insertion on virtual exploration platforms. The workflow analysis is followed by case studies related to photomodeling, digital retopology and "virtual 3D transfer" of some small archaeological artifacts and of an architectural compartment corresponding to the pronaos of the Aurum, a building designed in the 1940s by Michelucci. All operations are conducted on cheap or freely licensed software that today offers almost the same performance as its paid counterparts, progressively improving in data processing speed and management.
Therapists' perception of benefits and costs of using virtual reality treatments.
Segal, Robert; Bhatia, Maneet; Drapeau, Martin
2011-01-01
Research indicates that virtual reality is effective in the treatment of many psychological difficulties and is being used more frequently. However, little is known about therapists' perception of the benefits and costs related to the use of virtual therapy in treatment delivery. In the present study, 271 therapists completed an online questionnaire that assessed their perceptions about the potential benefits and costs of using virtual reality in psychotherapy. Results indicated that therapists perceived the potential benefits as outweighing the potential costs. Therapists' self-reported knowledge of virtual reality, theoretical orientation, and interest in using virtual reality were found to be associated with perceptual measures. These findings contribute to the current knowledge of the perception of virtual reality amongst psychotherapists.
Huang, Cynthia Y; Thomas, Jonathan B; Alismail, Abdullah; Cohen, Avi; Almutairi, Waleed; Daher, Noha S; Terry, Michael H; Tan, Laren D
2018-01-01
The aim of this study was to investigate the feasibility of using augmented reality (AR) glasses in central line simulation by novice operators and to compare its efficacy to standard central line simulation/teaching. This was a prospective randomized controlled study enrolling 32 novice operators. Subjects were randomized on a 1:1 basis to either simulation using the AR glasses or simulation using conventional instruction. The study was conducted in a tertiary-care urban teaching hospital. A total of 32 adult novice central line operators with no visual or auditory impairments were enrolled. Medical doctors, respiratory therapists, and sleep technicians were recruited from the medical field. The mean time for AR placement in the AR group was 71±43 s, and the time to internal jugular (IJ) cannulation was 316±112 s. There was no significant difference in median (minimum, maximum) time (seconds) to IJ cannulation between those who were in the AR group and those who were not (339 [130, 550] vs 287 [35, 475], p=0.09), respectively. There was also no significant difference between the two groups in median total procedure time (524 [329, 792] vs 469 [198, 781], p=0.29), respectively. There was a significant difference in the adherence level between the two groups, favoring the AR group (p=0.003). AR simulation of central venous catheter placement in manikins is feasible and efficacious in novice operators as an educational tool. Future studies are recommended in this area as it is a promising area of medical education.
Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load.
Küçük, Sevda; Kapakin, Samet; Göktaş, Yüksel
2016-10-01
Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the environment using mobile devices. The purpose of this study was to determine the effects of learning anatomy via mAR on medical students' academic achievement and cognitive load. A mixed method was applied in the study. The random sample consisted of 70 second-year undergraduate medical students: 34 in an experimental group and 36 in a control group. An academic achievement test and a cognitive load scale were used as data collection tools. A one-way MANOVA test was used for analysis. The experimental group, which used mAR applications, reported higher achievement and lower cognitive load. The use of mAR applications in anatomy education contributed to the formation of an effective and productive learning environment. Student cognitive load decreased as abstract information became concrete in printed books via multimedia materials in mAR applications. Additionally, students were able to access the materials in the MagicBook anytime and anywhere they wanted. The mobile learning approach helped students learn better by exerting less cognitive effort. Moreover, the sensory experience and real-time interaction with the environment may provide learning satisfaction and enable students to structure their knowledge to complete the learning tasks. Anat Sci Educ 9: 411-421. © 2016 American Association of Anatomists.
On Location Learning: Authentic Applied Science with Networked Augmented Realities
ERIC Educational Resources Information Center
Rosenbaum, Eric; Klopfer, Eric; Perry, Judy
2007-01-01
The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is…
Social Augmented Reality: Enhancing Context-Dependent Communication and Informal Learning at Work
ERIC Educational Resources Information Center
Pejoska, Jana; Bauters, Merja; Purma, Jukka; Leinonen, Teemu
2016-01-01
Our design proposal of social augmented reality (SoAR) grows from the observed difficulties of practical applications of augmented reality (AR) in workplace learning. In our research we investigated construction workers doing physical work in the field and analyzed the data using qualitative methods in various workshops. The challenges related to…
The Effect of Augmented Reality Applications in the Learning Process: A Meta-Analysis Study
ERIC Educational Resources Information Center
Ozdemir, Muzaffer; Sahin, Cavus; Arcagok, Serdar; Demir, M. Kaan
2018-01-01
Purpose: The aim of this research is to investigate the effect of Augmented Reality (AR) applications in the learning process. Problem: Research that determines the effectiveness of Augmented Reality (AR) applications in the learning process with different variables has not been encountered in national or international literature. Research…
Augmenting a Child's Reality: Using Educational Tablet Technology
ERIC Educational Resources Information Center
Tanner, Patricia; Karas, Carly; Schofield, Damian
2014-01-01
This study investigates the classroom integration of an innovative technology, augmented reality. Although the process of adding new technologies into a classroom setting can be daunting, the concept of augmented reality has demonstrated the ability to educate students and to assist with their comprehension of a procedural task. One half of the…
Augmented Reality Learning Experiences: Survey of Prototype Design and Evaluation
ERIC Educational Resources Information Center
Santos, Marc Ericson C.; Chen, Angie; Taketomi, Takafumi; Yamamoto, Goshiro; Miyazaki, Jun; Kato, Hirokazu
2014-01-01
Augmented reality (AR) technology is mature for creating learning experiences for K-12 (pre-school, grade school, and high school) educational settings. We reviewed the applications intended to complement traditional curriculum materials for K-12. We found 87 research articles on augmented reality learning experiences (ARLEs) in the IEEE Xplore…
Augmented Reality Trends in Education: A Systematic Review of Research and Applications
ERIC Educational Resources Information Center
Bacca, Jorge; Baldiris, Silvia; Fabregat, Ramon; Graf, Sabine; Kinshuk
2014-01-01
In recent years, there has been an increasing interest in applying Augmented Reality (AR) to create unique educational settings. So far, however, there is a lack of review studies with focus on investigating factors such as: the uses, advantages, limitations, effectiveness, challenges and features of augmented reality in educational settings.…
Enhancing Education through Mobile Augmented Reality
ERIC Educational Resources Information Center
Joan, D. R. Robert
2015-01-01
In this article, the author has discussed about the Mobile Augmented Reality and enhancing education through it. The aim of the present study was to give some general information about mobile augmented reality which helps to boost education. Purpose of the current study reveals the mobile networks which are used in the institution campus as well…
ERIC Educational Resources Information Center
Lan, Chung-Hsien; Chao, Stefan; Kinshuk; Chao, Kuo-Hung
2013-01-01
This study presents a conceptual framework for supporting mobile peer assessment by incorporating augmented reality technology to eliminate limitation of reviewing and assessing. According to the characteristics of mobile technology and augmented reality, students' work can be shown in various ways by considering the locations and situations. This…
ERIC Educational Resources Information Center
Rattanarungrot, Sasithorn; White, Martin; Newbury, Paul
2014-01-01
This paper describes the design of our service-oriented architecture to support mobile multiple object tracking augmented reality applications applied to education and learning scenarios. The architecture is composed of a mobile multiple object tracking augmented reality client, a web service framework, and dynamic content providers. Tracking of…
Yoo, Ha-Na; Chung, Eunjung; Lee, Byoung-Hee
2013-07-01
[Purpose] The purpose of this study was to determine the effects of augmented reality-based Otago exercise on balance, gait, and falls efficacy of elderly women. [Subjects] The subjects were 21 elderly women, who were randomly divided into two groups: an augmented reality-based Otago exercise group of 10 subjects and an Otago exercise group of 11 subjects. [Methods] All subjects were evaluated for balance (Berg Balance Scale, BBS), gait parameters (velocity, cadence, step length, and stride length), and falls efficacy. Within 12 weeks, Otago exercise for muscle strengthening and balance training was conducted three times, for a period of 60 minutes each, and subjects in the experimental group performed augmented reality-based Otago exercise. [Results] Following intervention, the augmented reality-based Otago exercise group showed significant increases in BBS, velocity, cadence, step length (right side), stride length (right side and left side) and falls efficacy. [Conclusion] The results of this study suggest the feasibility and suitability of this augmented reality-based Otago exercise for elderly women.
NASA Astrophysics Data System (ADS)
Yang, W. B.; Yen, Y. N.; Cheng, H. M.
2015-08-01
The integration of heritage preservation and digital technology is an important international trend in the 21st century. Digital technology is not only able to record and preserve detailed documents and information about heritage completely, but also brings effective value-added features. In this study, 3D laser scanning is used to produce digital archives of the interior and exterior of the building, integrating 3D scanner technology, mobile scanning collaboration, and multisystem reverse modeling and integration technology. The 3D model is built by combining multimedia presentations with reverse modeling at real scale to perform a virtual reality (VR) simulation, and interactive teaching and presentation in augmented reality are used to continuously update traditional architectural information. With upgraded technology and value-added digitalization, the value of the cultural asset can be experienced through 3D virtual reality, moving information presentation from traditional reading toward user operation with sensory experience, and continuing to explore the possibilities of cultural asset preservation using digital technology, making the presentation and learning of cultural asset information more diverse.
An optical tracking system for virtual reality
NASA Astrophysics Data System (ADS)
Hrimech, Hamid; Merienne, Frederic
2009-03-01
In this paper we present a low-cost 3D tracking system which we have developed and tested in order to move away from traditional 2D interaction techniques (keyboard and mouse) in an attempt to improve the user's experience while using a collaborative virtual environment (CVE). Such a tracking system is used to implement 3D interaction techniques that augment the user experience, promote the user's sense of transportation into the virtual world, and increase the user's awareness of their partners. The tracking system is a passive optical tracking system using stereoscopy, a technique allowing the reconstruction of three-dimensional information from a pair of images. We have currently deployed our 3D tracking system on a collaborative research platform for investigating 3D interaction techniques in CVEs.
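The stereoscopic reconstruction step described above, recovering a 3D marker position from a pair of calibrated images, can be sketched with OpenCV's triangulation routine; the camera matrices, baseline, and pixel coordinates below are assumed example values, not parameters of the system in the paper.

```python
import cv2
import numpy as np

def triangulate_marker(P_left, P_right, uv_left, uv_right):
    """Reconstruct a 3D point from its pixel coordinates in two calibrated views.

    P_left, P_right   : 3x4 camera projection matrices.
    uv_left, uv_right : (u, v) pixel coordinates of the same marker.
    """
    pts_l = np.asarray(uv_left, dtype=float).reshape(2, 1)
    pts_r = np.asarray(uv_right, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # homogeneous 4x1
    return (X_h[:3] / X_h[3]).ravel()

if __name__ == "__main__":
    # Assumed rectified stereo rig: identical intrinsics, 10 cm baseline
    K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
    P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_right = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
    # Marker seen at these pixel positions in the left and right images
    print(triangulate_marker(P_left, P_right, (400.0, 240.0), (384.0, 240.0)))
```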