Mass production of holographic transparent components for augmented and virtual reality applications
NASA Astrophysics Data System (ADS)
Russo, Juan Manuel; Dimov, Fedor; Padiyar, Joy; Coe-Sullivan, Seth
2017-06-01
Diffractive optics such as holographic optical elements (HOEs) can provide transparent, narrow-band components with arbitrary incident and diffracted angles for near-to-eye commercial electronic products for augmented reality (AR), virtual reality (VR), and smart glass applications. In this paper, we summarize the operational parameters and general optical geometries relevant to near-to-eye displays, the holographic substrates available for these applications, and their performance characteristics and ease of manufacture. We compare the available holographic substrates in terms of fabrication, manufacturability, and end-user performance characteristics. Luminit is currently establishing the manufacturing capacity to serve this market, and this paper discusses the capabilities and limitations of this unique facility.
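The "arbitrary incident and diffracted angles" of such elements are governed, to first order, by the grating equation. The sketch below (plain Python; the sign convention and example values are illustrative and not taken from the paper) solves it for the diffracted angle:

```python
import math

def diffracted_angle_deg(wavelength_nm, period_nm, incident_deg, order=1):
    """Grating equation sin(theta_d) = sin(theta_i) + m * lambda / d,
    solved for the diffracted angle in degrees. Returns None when the
    requested order is evanescent (no propagating diffracted beam)."""
    s = math.sin(math.radians(incident_deg)) + order * wavelength_nm / period_nm
    if abs(s) > 1.0:
        return None  # evanescent order
    return math.degrees(math.asin(s))
```

For example, a 1064 nm-period grating diffracts normally incident 532 nm light to 30 degrees in first order, while a 400 nm period supports no propagating first order at that wavelength.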
Architecture and Key Techniques of Augmented Reality Maintenance Guiding System for Civil Aircrafts
NASA Astrophysics Data System (ADS)
Hong, Zhou; Wenhua, Lu
2017-01-01
Augmented reality technology is introduced into the maintenance field to enrich real-world scenarios with virtual maintenance-assistance information. This can lower the difficulty of maintenance, reduce maintenance errors, and improve the maintenance efficiency and quality of civil aviation maintenance crews. An architecture for an augmented reality virtual maintenance guiding system is proposed on the basis of defining augmented reality and analyzing the characteristics of augmented reality virtual maintenance. Key techniques involved, such as the standardization and organization of maintenance data, 3D registration, modeling of maintenance guidance information, and virtual-maintenance man-machine interaction, are elaborated, and solutions are given.
ERIC Educational Resources Information Center
Taçgin, Zeynep; Arslan, Ahmet
2017-01-01
The purpose of this study is to determine perception of postgraduate Computer Education and Instructional Technologies (CEIT) students regarding the concepts of Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), Augmented Virtuality (AV) and Mirror Reality; and to offer a table that includes differences and similarities between…
Transduction between worlds: using virtual and mixed reality for earth and planetary science
NASA Astrophysics Data System (ADS)
Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.
2017-12-01
Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization, with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds. Taking a mixed reality approach to this, we can link real and virtual worlds. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.
NASA Astrophysics Data System (ADS)
Chao, Jie; Chiu, Jennifer L.; DeJaegher, Crystal J.; Pan, Edward A.
2016-02-01
Deep learning of science involves integration of existing knowledge and normative science concepts. Past research demonstrates that combining physical and virtual labs sequentially or side by side can take advantage of the unique affordances each provides for helping students learn science concepts. However, providing simultaneously connected physical and virtual experiences has the potential to promote connections among ideas. This paper explores the effect of augmenting a virtual lab with physical controls on high school chemistry students' understanding of gas laws. We compared students using the augmented virtual lab to students using a similar sensor-based physical lab with teacher-led discussions. Results demonstrate that students in the augmented virtual lab condition made significant gains from pretest to posttest and outperformed traditional students on some but not all concepts. Results provide insight into incorporating mixed-reality technologies into authentic classroom settings.
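As a rough illustration of the model such a virtual gas-law lab might evaluate behind its physical controls (the function name and unit choices are ours, not the study's), the ideal-gas law can be computed directly:

```python
R = 8.314  # molar gas constant, J/(mol*K)

def pressure_kpa(n_mol, temp_k, volume_l):
    """Ideal-gas pressure in kPa: P = nRT/V, with volume given in litres."""
    volume_m3 = volume_l * 1e-3  # 1 L = 1e-3 m^3
    return n_mol * R * temp_k / volume_m3 / 1000.0
```

One mole at 273.15 K in 22.414 L yields roughly standard atmospheric pressure (about 101.3 kPa), the sanity check a student would see when sliding the physical controls to standard conditions.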
Methods and systems relating to an augmented virtuality environment
Nielsen, Curtis W; Anderson, Matthew O; McKay, Mark D; Wadsworth, Derek C; Boyce, Jodie R; Hruska, Ryan C; Koudelka, John A; Whetten, Jonathan; Bruemmer, David J
2014-05-20
Systems and methods relating to an augmented virtuality system are disclosed. A method of operating an augmented virtuality system may comprise displaying imagery of a real-world environment in an operating picture. The method may further include displaying a plurality of virtual icons in the operating picture representing at least some assets of a plurality of assets positioned in the real-world environment. Additionally, the method may include displaying at least one virtual item in the operating picture representing data sensed by one or more of the assets of the plurality of assets and remotely controlling at least one asset of the plurality of assets by interacting with a virtual icon associated with the at least one asset.
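Placing a virtual icon for a real-world asset into the operating picture amounts to mapping the asset's world position through the view geometry. A minimal sketch (hypothetical names; a flat-ground, single-axis camera model, not the patented system):

```python
import math

def icon_screen_x(cam_xy, cam_heading_deg, fov_deg, screen_w, asset_xy):
    """Map a world-space asset position to a horizontal pixel coordinate in
    the operating picture; returns None if the asset lies outside the view."""
    dx = asset_xy[0] - cam_xy[0]
    dy = asset_xy[1] - cam_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))           # 0 deg = +y axis
    rel = (bearing - cam_heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
    half = fov_deg / 2
    if abs(rel) > half:
        return None  # asset not in the current field of view
    return (rel + half) / fov_deg * screen_w
```

An asset straight ahead lands at the horizontal center of the picture; assets outside the field of view simply get no icon.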
Davidson, Dennisa; Evans, Lois
2018-03-01
To explore online study groups as augmentation tools in preparing for the Royal Australian and New Zealand College of Psychiatrists Observed Structured Clinical Examinations (OSCE) for fellowship. An online survey of New Zealand trainees was carried out to assess exam preparedness and openness to virtual study groups, and the results were analysed. Relevant material on virtual study groups for fellowship examinations was reviewed and used to inform a pilot virtual study group. Four New Zealand trainees took part in the pilot project, which looked at using a virtual platform to augment OSCE preparation. Of the 50 respondents, 36% felt adequately prepared for the OSCE. Sixty-four per cent were interested in using a virtual platform to augment their study. Virtual study groups were noted to be especially important for rural trainees, none of whom felt able to form study groups for themselves. The pilot virtual study group was trialled successfully. All four trainees reported the experience as subjectively beneficial to their examination preparation. Virtual platforms hold promise as an augmentation strategy for exam preparation, especially for rural trainees, who are more geographically isolated and less likely to have peers preparing for the same examinations.
Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A
2016-11-01
To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinary students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined from hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessment in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion of each simulator were recorded. None of the motion metrics in the virtual reality simulator correlated with experience or with the basic laparoscopic skills score. All metrics in augmented reality (time, instrument path, and economy of movement) were significantly correlated with experience, except for the hand dominance metric. The basic laparoscopic skills score was correlated with all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas the basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct and concurrent validity for motion analysis metrics in an augmented reality system, whereas the virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
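Metrics such as instrument path length and economy of motion can be computed directly from sampled instrument-tip positions. A minimal sketch (our own definitions, which may differ from the simulators' proprietary formulas):

```python
import math

def path_length(points):
    """Total distance travelled by an instrument tip over sampled 3D positions."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def economy_of_motion(points):
    """Ratio of straight-line displacement to actual path length (1.0 = ideal)."""
    total = path_length(points)
    return math.dist(points[0], points[-1]) / total if total else 1.0
```

An L-shaped trajectory of two unit moves has path length 2 but displacement sqrt(2), so its economy of motion is about 0.71, capturing the "wasted" travel that distinguishes novices from experienced operators.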
Virtual interactive presence and augmented reality (VIPAR) for remote surgical assistance.
Shenai, Mahesh B; Dillavou, Marcus; Shum, Corey; Ross, Douglas; Tubbs, Richard S; Shih, Alan; Guthrie, Barton L
2011-03-01
Surgery is a highly technical field that combines continuous decision-making with the coordination of spatiovisual tasks. We designed a virtual interactive presence and augmented reality (VIPAR) platform that allows a remote surgeon to deliver real-time virtual assistance to a local surgeon, over a standard Internet connection. The VIPAR system consisted of a "local" and a "remote" station, each situated over a surgical field and a blue screen, respectively. Each station was equipped with a digital viewpiece, composed of 2 cameras for stereoscopic capture, and a high-definition viewer displaying a virtual field. The virtual field was created by digitally compositing selected elements within the remote field into the local field. The viewpieces were controlled by workstations mutually connected by the Internet, allowing virtual remote interaction in real time. Digital renderings derived from volumetric MRI were added to the virtual field to augment the surgeon's reality. For demonstration, a fixed-formalin cadaver head and neck were obtained, and a carotid endarterectomy (CEA) and pterional craniotomy were performed under the VIPAR system. The VIPAR system allowed for real-time, virtual interaction between a local (resident) and remote (attending) surgeon. In both carotid and pterional dissections, major anatomic structures were visualized and identified. Virtual interaction permitted remote instruction for the local surgeon, and MRI augmentation provided spatial guidance to both surgeons. Camera resolution, color contrast, time lag, and depth perception were identified as technical issues requiring further optimization. Virtual interactive presence and augmented reality provide a novel platform for remote surgical assistance, with multiple applications in surgical training and remote expert assistance.
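The core compositing step, lifting the remote surgeon's hands off the blue screen and into the local field, is essentially chroma keying. A toy sketch on (R, G, B) pixel tuples (the threshold values are illustrative, not from the VIPAR implementation):

```python
def composite(remote, local, is_key=lambda p: p[2] > 200 and p[0] < 80 and p[1] < 80):
    """Chroma-key composite: pixels of the remote frame that match the blue
    backdrop are replaced by the local frame, merging both surgical fields."""
    return [[loc if is_key(rem) else rem
             for rem, loc in zip(rrow, lrow)]
            for rrow, lrow in zip(remote, local)]
```

Non-blue remote pixels (the remote surgeon's instruments and hands) survive into the virtual field, while blue-backdrop pixels show the local surgical scene through.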
Virtual and Augmented Reality Systems for Renal Interventions: A Systematic Review.
Detmer, Felicitas J; Hettig, Julian; Schindele, Daniel; Schostak, Martin; Hansen, Christian
2017-01-01
Many virtual and augmented reality systems have been proposed to support renal interventions. This paper reviews such systems employed in the treatment of renal cell carcinoma and renal stones. A systematic literature search was performed. Inclusion criteria were virtual and augmented reality systems for radical or partial nephrectomy and renal stone treatment, excluding systems developed or evaluated solely for training purposes. In total, 52 research papers were identified and analyzed. Most of the identified literature (87%) deals with systems for renal cell carcinoma treatment. About 44% of the systems have already been employed in clinical practice, but only 20% in studies with ten or more patients. The main challenges remaining for future research include the consideration of organ movement and deformation, human factors issues, and the conduct of large clinical studies. Augmented and virtual reality systems have the potential to improve the safety and outcomes of renal interventions. In the last ten years, many technical advances have led to more sophisticated systems, which are already applied in clinical practice. Further research is required to cope with the current limitations of virtual and augmented reality assistance in clinical environments.
Virtual Reality-Based Center of Mass-Assisted Personalized Balance Training System.
Kumar, Deepesh; González, Alejandro; Das, Abhijit; Dutta, Anirban; Fraisse, Philippe; Hayashibe, Mitsuhiro; Lahiri, Uttama
2017-01-01
Poststroke hemiplegic patients often show altered weight distribution with balance disorders, increasing their risk of falls. Conventional balance training, though powerful, suffers from a scarcity of trained therapists, frequent clinic visits, one-on-one therapy sessions, and the monotony of repetitive exercise tasks. Thus, technology-assisted balance rehabilitation can be an alternative solution. Here, we chose virtual reality as a technology-based platform to develop motivating balance tasks. This platform was augmented with off-the-shelf sensors such as the Nintendo Wii balance board and Kinect to estimate one's center of mass (CoM). The virtual reality-based CoM-assisted balance task platform (Virtual CoMBaT) was designed to adapt to one's individualized weight-shifting capability, quantified through CoM displacement. Participants were asked to interact with Virtual CoMBaT, which offered tasks of varying challenge levels while requiring adherence to the ankle strategy for weight shifting. To encourage patients to use the ankle strategy during weight shifting, we designed a heel lift detection module. A usability study was carried out with 12 hemiplegic patients. Results indicate the potential of our system to improve one's overall performance in balance-related tasks across different difficulty levels.
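A centre-of-pressure estimate from a four-load-cell balance board is one common proxy used when tracking weight shifts of this kind. A sketch (the sensor-spacing defaults are approximate published figures for the Wii Balance Board; the function is ours, not the authors' code):

```python
def center_of_pressure(tl, tr, bl, br, width=43.3, depth=23.8):
    """Centre of pressure (cm) from the four corner load cells of a balance
    board (top-left, top-right, bottom-left, bottom-right loads, in kg).
    Origin at the board centre; +x right, +y forward."""
    total = tl + tr + bl + br
    if total == 0:
        return (0.0, 0.0)
    x = (width / 2) * ((tr + br) - (tl + bl)) / total
    y = (depth / 2) * ((tl + tr) - (bl + br)) / total
    return (x, y)
```

Equal loads on all four cells give the board centre; shifting all weight onto the right-hand cells pushes the estimate to the right edge, which is the displacement signal a CoM-driven balance task would respond to.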
AR Feels "Softer" than VR: Haptic Perception of Stiffness in Augmented versus Virtual Reality.
Gaffary, Yoren; Le Gouis, Benoit; Marchal, Maud; Argelaguet, Ferran; Arnaldi, Bruno; Lecuyer, Anatole
2017-11-01
Does it feel the same when you touch an object in Augmented Reality (AR) as in Virtual Reality (VR)? In this paper we study and compare the haptic perception of stiffness of a virtual object in two situations: (1) a purely virtual environment versus (2) a real, augmented environment. We designed an experimental setup based on a Microsoft HoloLens and a haptic force-feedback device, enabling participants to press a virtual piston and compare its stiffness successively in either Augmented Reality (the virtual piston is surrounded by several real objects, all located inside a cardboard box) or Virtual Reality (the same virtual piston is displayed in a fully virtual scene composed of the same other objects). We conducted a psychophysical experiment with 12 participants. Our results show a surprising bias in perception between the two conditions: the virtual piston is on average perceived as stiffer in the VR condition than in the AR condition. For instance, when the piston had the same stiffness in AR and VR, participants selected the VR piston as the stiffer one in 60% of cases. This suggests a psychological effect, as if objects in AR feel "softer" than in pure VR. Taken together, our results open new perspectives on perception in AR versus VR and pave the way for future studies aiming to characterize potential perceptual biases.
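The reported bias (the VR piston chosen as stiffer in 60% of equal-stiffness trials) is the kind of shift typically summarized by a point of subjective equality (PSE). A minimal interpolation sketch (our simplification; such studies usually fit a full psychometric function instead):

```python
def point_of_subjective_equality(ratios, p_vr_stiffer):
    """Linearly interpolate the stiffness ratio (VR/AR) at which the VR piston
    is chosen as stiffer 50% of the time, i.e. the PSE of a 2AFC comparison."""
    pairs = list(zip(ratios, p_vr_stiffer))
    for (r0, p0), (r1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1 or p1 <= 0.5 <= p0:   # 50% point crossed here
            if p1 == p0:
                return (r0 + r1) / 2
            return r0 + (0.5 - p0) * (r1 - r0) / (p1 - p0)
    return None  # 50% point never crossed in the sampled range
```

A PSE below 1.0 would mean the VR piston must be physically softer than the AR piston to feel equally stiff, matching the direction of the bias the abstract describes.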
Zhu, Ming; Liu, Fei; Chai, Gang; Pan, Jun J.; Jiang, Taoran; Lin, Li; Xin, Yu; Zhang, Yan; Li, Qingfeng
2017-01-01
Augmented reality systems can combine virtual images with a real environment to ensure accurate surgery with lower risk. This study aimed to develop a novel registration and tracking technique to establish a navigation system based on augmented reality for maxillofacial surgery. Specifically, a virtual image is reconstructed from CT data using 3D software. The real environment is tracked by the augmented reality (AR) software. The novel registration strategy that we created uses an occlusal splint compounded with a fiducial marker (OSM) to establish a relationship between the virtual image and the real object. After the fiducial marker is recognized, the virtual image is superimposed onto the real environment, forming the “integrated image” on semi-transparent glass. Via the registration process, the integrated image, which combines the virtual image with the real scene, is successfully presented on the semi-transparent helmet. The position error of this navigation system is 0.96 ± 0.51 mm. This augmented reality system was applied in the clinic and good surgical outcomes were obtained. The augmented reality system that we established for maxillofacial surgery has the advantages of easy manipulation and high accuracy, which can improve surgical outcomes. Thus, this system exhibits significant potential in clinical applications. PMID:28198442
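Once the fiducial marker is recognized, superimposing the CT-derived image reduces to composing homogeneous transforms: the tracked camera-to-marker pose chained with a calibrated marker-to-CT offset. A bare-bones sketch (4x4 row-major nested lists; the transform names are our illustrative choices, not the system's API):

```python
def mat_mul(a, b):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    x, y, z = p
    v = [x, y, z, 1.0]
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))
```

Chaining a camera-to-marker pose with the marker-to-CT calibration maps any CT voxel coordinate into camera space, which is where the overlay is drawn on the semi-transparent glass.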
NASA Astrophysics Data System (ADS)
De Mauro, Alessandro; Ardanza, Aitor; Monge, Esther; Molina Rueda, Francisco
2013-03-01
Several studies have shown that both virtual and augmented reality are suitable technologies for rehabilitation therapy, due to their inherent ability to simulate real daily-life activities while improving patient motivation. In this paper we first present the state of the art in the use of virtual and augmented reality applications for rehabilitation of motor disorders, and second we focus on an analysis of the results of our project. In particular, the requirements of patients with cerebrovascular accidents, spinal cord injuries, and cerebral palsy for the use of virtual and augmented reality systems will be detailed.
A novel augmented reality system of image projection for image-guided neurosurgery.
Mahvash, Mehran; Besharati Tabrizi, Leila
2013-05-01
Augmented reality systems combine virtual images with a real environment. To design and develop an augmented reality system for image-guided surgery of brain tumors using image projection. A virtual image was created in two ways: (1) an MRI-based 3D model of the head matched with the segmented lesion of a patient using MRIcro software (version 1.4, freeware, Chris Rorden) and (2) a digital photograph-based model in which the tumor region was drawn using image-editing software. The real environment was simulated with a head phantom. For direct projection of the virtual image onto the head phantom, a commercially available video projector (PicoPix 1020, Philips) was used. The position and size of the virtual image were adjusted manually for registration, which was performed using anatomical landmarks and fiducial marker positions. An augmented reality system for image-guided neurosurgery using direct image projection was designed successfully and implemented in a first evaluation with promising results. The virtual image could be projected onto the head phantom and was registered manually. Accurate registration (mean projection error: 0.3 mm) was achieved using anatomical landmarks and fiducial marker positions. The direct projection of a virtual image onto the patient's head, skull, or brain surface in real time is an augmented reality technique that can be used for image-guided neurosurgery. In this paper, the first evaluation of the system is presented. The encouraging first visualization results indicate that the presented augmented reality system might be an important enhancement of image-guided neurosurgery.
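The accuracy figure quoted (mean projection error of 0.3 mm) is, in its simplest form, the average distance between each projected fiducial and its measured landmark. A sketch (hypothetical helper; 2D points in mm):

```python
import math

def mean_projection_error(projected, measured):
    """Mean Euclidean distance (same units as the input, e.g. mm) between
    projected fiducial positions and their measured landmark positions."""
    assert len(projected) == len(measured) and projected
    return sum(math.dist(p, m) for p, m in zip(projected, measured)) / len(projected)
```

With every projected fiducial landing 0.3 mm from its landmark, the metric reports exactly that figure; in practice it averages over a mix of per-fiducial errors after manual registration.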
Reality Check: Basics of Augmented, Virtual, and Mixed Reality.
Brigham, Tara J
2017-01-01
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
ERIC Educational Resources Information Center
Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao
2013-01-01
virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…
Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room.
Tepper, Oren M; Rudy, Hayeem L; Lefkowitz, Aaron; Weimer, Katie A; Marks, Shelby M; Stern, Carrie S; Garfein, Evan S
2017-11-01
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented reality features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons with intraoperative, hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
Augmenting the Thermal Flux Experiment: A Mixed Reality Approach with the HoloLens
ERIC Educational Resources Information Center
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-01-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress during the last years and also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies like head-mounted…
Al-Ardah, Aladdin; Alqahtani, Nasser; AlHelal, Abdulaziz; Goodacre, Brian; Swamidass, Rajesh; Garbacea, Antoanela; Lozada, Jaime
2018-05-02
This technique describes a novel approach for planning and augmenting a large bony defect using a titanium mesh (TiMe). A 3-dimensional (3D) surgical model was virtually created from cone beam computed tomography (CBCT) and a wax pattern of the final prosthetic outcome. The required bone volume (horizontal and vertical) was digitally augmented and then 3D printed to create a bone model. The 3D model was then used to contour the TiMe in accordance with the digital augmentation. With the contoured/preformed TiMe on the 3D printed model, a positioning jig was made to aid placement of the TiMe as planned during surgery. Although this technique does not impact the final outcome of the augmentation procedure, it allows the clinician to virtually design the augmentation, preform and contour the TiMe, and create a positioning jig, reducing surgical time and error.
Encarnação, L Miguel; Bimber, Oliver
2002-01-01
Collaborative virtual environments for diagnosis and treatment planning are increasingly gaining importance in our global society. Virtual and Augmented Reality approaches promise to provide valuable means for the interactive data analysis involved, but the underlying technologies still create a cumbersome work environment that is inadequate for clinical employment. This paper addresses two of the shortcomings of such technology: intuitive interaction with multi-dimensional data in immersive and semi-immersive environments, as well as stereoscopic multi-user displays combining the advantages of Virtual and Augmented Reality technology.
Hybrid Reality Lab Capabilities - Video 2
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2016-01-01
Our Hybrid Reality and Advanced Operations Lab is developing highly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the video/glass information. In other words, Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects.
The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in a virtual reality environment can be a game changer for training, planning, engineering analysis, science, entertainment, and more. Our project is developing such capabilities for various types of environments. The video accompanying this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond standard Virtual Reality or Augmented Reality.
Media-Augmented Exercise Machines
NASA Astrophysics Data System (ADS)
Krueger, T.
2002-01-01
Cardio-vascular exercise has been used to mitigate the muscle and cardiac atrophy associated with adaptation to micro-gravity environments. Several hours per day may be required. In confined spaces and on long-duration missions this kind of exercise is inevitably repetitive and rapidly becomes uninteresting. At the same time, there are pressures to accomplish as much as possible given the cost per hour for humans occupying orbiting or interplanetary vehicles. Media augmentation provides a means to overlap activities in time by supplementing the exercise with social, recreational, training, or collaborative activities, thereby reducing time pressures. In addition, the machine functions as an interface to a wide range of digital environments, allowing for spatial variety in an otherwise confined environment. We hypothesize that the adoption of media-augmented exercise machines will have a positive effect on psycho-social well-being on long-duration missions. By organizing exercise machines, data acquisition hardware, computers, and displays into an interacting system, this proposal increases functionality with limited additional mass. This paper reviews preliminary work on a project to augment exercise equipment in a manner that addresses these issues and at the same time opens possibilities for additional benefits. A testbed augmented exercise machine uses a specially built cycle trainer as both an input to a virtual environment and an output device from it, using spatialized sound, visual displays, vibration transducers, and variable resistance. The resulting interactivity increases a sense of engagement in the exercise and provides a rich experience of the digital environments. Activities in the virtual environment and accompanying physiological and psychological indicators may be correlated to track and evaluate the health of the crew.
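Using the cycle trainer as an input device means mapping pedalling state into the virtual environment. A toy model (fixed-gear kinematics with an assumed wheel radius and gear ratio, not the testbed's actual calibration):

```python
import math

def virtual_speed_mps(cadence_rpm, wheel_radius_m=0.34, gear_ratio=3.0):
    """Translate cycle-trainer cadence into an avatar speed for the virtual
    environment (simple fixed-gear model; aerodynamic drag ignored)."""
    wheel_rps = cadence_rpm / 60.0 * gear_ratio
    return 2 * math.pi * wheel_radius_m * wheel_rps

def pedal_power_w(cadence_rpm, resistance_nm):
    """Mechanical power at the crank: P = torque * angular velocity."""
    return resistance_nm * cadence_rpm * 2 * math.pi / 60.0
```

The power figure could drive both the variable-resistance output channel and the physiological data log, while the speed figure moves the rider through the digital environment.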
Nifakos, Sokratis; Zary, Nabil
2014-01-01
The research community has called for the development of effective educational interventions to address prescription behaviour, since antimicrobial resistance remains a global health issue. Examining the potential to move the educational process from personal computers to mobile devices, in this paper we investigated a new method of integrating Virtual Patients into mobile devices with augmented reality technology, enriching the practitioner's education in prescription behaviour. Moreover, we also explored which information is critical during prescription-behaviour education and visualized this information in a real context with augmented reality technology, simultaneously with a running Virtual Patient scenario. Following this process, we set the educational frame of experiential knowledge in a mixed (virtual and real) environment.
Virtual Reality-Based Center of Mass-Assisted Personalized Balance Training System
Kumar, Deepesh; González, Alejandro; Das, Abhijit; Dutta, Anirban; Fraisse, Philippe; Hayashibe, Mitsuhiro; Lahiri, Uttama
2018-01-01
Poststroke hemiplegic patients often show altered weight distribution with balance disorders, increasing their risk of fall. Conventional balance training, though powerful, suffers from scarcity of trained therapists, frequent visits to clinics to get therapy, one-on-one therapy sessions, and monotony of repetitive exercise tasks. Thus, technology-assisted balance rehabilitation can be an alternative solution. Here, we chose virtual reality as a technology-based platform to develop motivating balance tasks. This platform was augmented with off-the-shelf available sensors such as Nintendo Wii balance board and Kinect to estimate one’s center of mass (CoM). The virtual reality-based CoM-assisted balance tasks (Virtual CoMBaT) was designed to be adaptive to one’s individualized weight-shifting capability quantified through CoM displacement. Participants were asked to interact with Virtual CoMBaT that offered tasks of varying challenge levels while adhering to ankle strategy for weight shifting. To facilitate the patients to use ankle strategy during weight-shifting, we designed a heel lift detection module. A usability study was carried out with 12 hemiplegic patients. Results indicate the potential of our system to contribute to improving one’s overall performance in balance-related tasks belonging to different difficulty levels. PMID:29359128
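The abstract above estimates the center of mass from off-the-shelf sensors such as the Nintendo Wii balance board. As a rough illustration of how such an estimate is commonly derived, the sketch below computes the center of pressure from the four corner load-cell readings a Wii-style board exposes; the function name, default board dimensions, and sensor layout are illustrative assumptions, not details taken from the paper.

```python
def center_of_pressure(top_left, top_right, bottom_left, bottom_right,
                       board_width=433.0, board_length=228.0):
    """Estimate the centre of pressure (in mm, relative to the board
    centre) from four corner load-cell readings. The readings may be in
    kg or N; the units cancel in the ratio. Board dimensions default to
    the Wii Balance Board's approximate size (an assumption)."""
    total = top_left + top_right + bottom_left + bottom_right
    if total <= 0:
        return (0.0, 0.0)  # nobody is standing on the board
    # Right-minus-left imbalance maps to the x (medio-lateral) axis,
    # top-minus-bottom imbalance to the y (antero-posterior) axis.
    cop_x = (board_width / 2.0) * ((top_right + bottom_right) -
                                   (top_left + bottom_left)) / total
    cop_y = (board_length / 2.0) * ((top_left + top_right) -
                                    (bottom_left + bottom_right)) / total
    return (cop_x, cop_y)
```

A weight-shifting task like the one described would then compare successive center-of-pressure samples against the challenge level's target displacement.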
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.
2006-01-01
The visual requirements for augmented reality or virtual environment displays that might be used in real or virtual towers are reviewed with respect to similar displays already used in aircraft. As an example of the type of human performance studies needed to determine the useful specifications of augmented reality displays, an optical see-through display was used in an ATC Tower simulation. Three different binocular fields of view (14°, 28°, and 47°) were examined to determine their effect on subjects' ability to detect aircraft maneuvering and landing. The results suggest that binocular fields of view much greater than 47° are unlikely to dramatically improve search performance and that partial binocular overlap is a feasible display technique for augmented reality Tower applications.
Augmented Virtual Reality: How to Improve Education Systems
ERIC Educational Resources Information Center
Fernandez, Manuel
2017-01-01
This essay presents and discusses the developing role of virtual and augmented reality technologies in education. Addressing the challenges in adapting such technologies to focus on improving students' learning outcomes, the author discusses the inclusion of experiential modes as a vehicle for improving students' knowledge acquisition.…
Meldrum, Dara; Herdman, Susan; Moloney, Roisin; Murray, Deirdre; Duffy, Douglas; Malone, Kareena; French, Helen; Hone, Stephen; Conroy, Ronan; McConn-Walsh, Rory
2012-03-26
Unilateral peripheral vestibular loss results in gait and balance impairment, dizziness and oscillopsia. Vestibular rehabilitation benefits patients but optimal treatment remains unknown. Virtual reality is an emerging tool in rehabilitation and provides opportunities to improve both outcomes and patient satisfaction with treatment. The Nintendo Wii Fit Plus® (NWFP) is a low-cost virtual reality system that challenges balance and provides visual and auditory feedback. It may augment the motor learning that is required to improve balance and gait, but no trials to date have investigated efficacy. In a single (assessor) blind, two-centre randomised controlled superiority trial, 80 patients with unilateral peripheral vestibular loss will be randomised to either conventional or virtual reality based (NWFP) vestibular rehabilitation for 6 weeks. The primary outcome measure is gait speed (measured with three-dimensional gait analysis). Secondary outcomes include computerised posturography, dynamic visual acuity, and validated questionnaires on dizziness, confidence and anxiety/depression. Outcome will be assessed post treatment (8 weeks) and at 6 months. Advances in the gaming industry have allowed mass production of highly sophisticated, low-cost virtual reality systems that incorporate technology previously not accessible to most therapists and patients. Importantly, they are not confined to rehabilitation departments, can be used at home and provide an accurate record of adherence to exercise. The benefits of providing augmented feedback, increasing intensity of exercise and accurately measuring adherence may improve conventional vestibular rehabilitation, but efficacy must first be demonstrated. ClinicalTrials.gov identifier: NCT01442623.
ERIC Educational Resources Information Center
Chen, Jingjing; Xu, Jianliang; Tang, Tao; Chen, Rongchao
2017-01-01
Interaction is critical for successful teaching and learning in a virtual learning environment (VLE). This paper presents a web-based interaction-aware VLE--WebIntera-classroom--which aims to augment learning interactions by increasing the learner-to-content and learner-to-instructor interactions. We design a ubiquitous interactive interface that…
ERIC Educational Resources Information Center
Orman, Evelyn K.; Price, Harry E.; Russell, Christine R.
2017-01-01
Acquiring nonverbal skills necessary to appropriately communicate and educate members of performing ensembles is essential for wind band conductors. Virtual reality learning environments (VRLEs) provide a unique setting for developing these proficiencies. For this feasibility study, we used an augmented immersive VRLE to enhance eye contact, torso…
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Mandal, Avikarsha; Javahiraly, Nicolas; Curticapean, Dan
2016-09-01
Practical exercises are a crucial part of many curricula. Even simple exercises can improve the understanding of the underlying subject. Most experimental setups require special hardware: to carry out, for example, a lens experiment, the students need access to an optical bench, various lenses, light sources, apertures and a screen. In our previous publication we demonstrated the use of augmented reality visualization techniques to let the students prepare with a simulated experimental setup. Within the context of our intended blended-learning concept we want to utilize augmented or virtual reality techniques for stationary laboratory exercises. Unlike applications running on mobile devices, stationary setups can be extended more easily with additional interfaces and thus allow for more complex interactions and simulations in virtual reality (VR) and augmented reality (AR). The most significant difference is the possibility to allow interactions beyond touching a screen. The LEAP Motion controller is a small, inexpensive device that tracks the user's hands and fingers in three dimensions. It is conceivable to let the user interact with the simulation's virtual elements through hand position, movement and gesture. In this paper we evaluate possible applications of the LEAP Motion controller for simulated experiments in augmented and virtual reality. We pay particular attention to the device's strengths and weaknesses and point out useful and less useful application scenarios.
Cabrilo, Ivan; Bijlenga, Philippe; Schaller, Karl
2014-09-01
Augmented reality technology has been used for intraoperative image guidance through the overlay of virtual images, from preoperative imaging studies, onto the real-world surgical field. Although setups based on augmented reality have been used for various neurosurgical pathologies, very few cases have been reported for the surgery of arteriovenous malformations (AVMs). We present our experience with AVM surgery using a system designed to inject virtual images into the operating microscope's eyepiece, and discuss why augmented reality may be less appealing in this form of surgery. Five patients underwent AVM resection assisted by augmented reality. Virtual three-dimensional models of the patients' heads, skulls, AVM nidi, and feeder and drainage vessels were selectively segmented and injected into the microscope's eyepiece for intraoperative image guidance, and their usefulness was assessed in each case. Although the setup helped in performing tailored craniotomies, in guiding dissection and in localizing drainage veins, it did not provide the surgeon with useful information concerning feeder arteries, owing to the complexity of AVM angioarchitecture. The difficulty of intraoperatively conveying useful information on feeder vessels may make augmented reality a less engaging tool in this form of surgery, and might explain its underrepresentation in the literature. Integrating an AVM's hemodynamic characteristics into the augmented rendering could make it better suited to AVM surgery.
Augmented virtuality for arthroscopic knee surgery.
Li, John M; Bardana, Davide D; Stewart, A James
2011-01-01
This paper describes a computer system to visualize the location and alignment of an arthroscope using augmented virtuality. A 3D computer model of the patient's joint (from CT) is shown, along with a model of the tracked arthroscopic probe and the projection of the camera image onto the virtual joint. A user study, using plastic bones instead of live patients, was conducted to determine the effectiveness of this navigated display; the study showed that the navigated display improves target localization in novice residents.
The Influences of the 2D Image-Based Augmented Reality and Virtual Reality on Student Learning
ERIC Educational Resources Information Center
Liou, Hsin-Hun; Yang, Stephen J. H.; Chen, Sherry Y.; Tarng, Wernhuar
2017-01-01
Virtual reality (VR) learning environments can provide students with concepts of the simulated phenomena, but users are not allowed to interact with real elements. Conversely, augmented reality (AR) learning environments blend virtual objects with real-world environments, so AR could enhance the effects of computer simulation and promote students' realistic experience.…
ERIC Educational Resources Information Center
Gavish, Nirit; Gutiérrez, Teresa; Webel, Sabine; Rodríguez, Jorge; Peveri, Matteo; Bockholt, Uli; Tecchia, Franco
2015-01-01
The current study evaluated the use of virtual reality (VR) and augmented reality (AR) platforms, developed within the scope of the SKILLS Integrated Project, for industrial maintenance and assembly (IMA) tasks training. VR and AR systems are now widely regarded as promising training platforms for complex and highly demanding IMA tasks. However,…
ERIC Educational Resources Information Center
Yang, Mau-Tsuen; Liao, Wan-Che
2014-01-01
The physical-virtual immersion and real-time interaction play an essential role in cultural and language learning. Augmented reality (AR) technology can be used to seamlessly merge virtual objects with real-world images to realize immersions. Additionally, computer vision (CV) technology can recognize free-hand gestures from live images to enable…
Virtually-augmented interfaces for tactical aircraft.
Haas, M W
1995-05-01
The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and non-virtual concepts and devices across the visual, auditory and haptic sensory modalities. A fusion interface is a multi-sensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion-interface concepts. One of the virtual concepts to be investigated in the Fusion Interfaces for Tactical Environments facility (FITE) is the application of EEG and other physiological measures for virtual control of functions within the flight environment. FITE is a specialized flight simulator which allows efficient concept development through the use of rapid prototyping followed by direct experience of new fusion concepts. The FITE facility also supports evaluation of fusion concepts by operational fighter pilots in a high fidelity simulated air combat environment. The facility was utilized by a multi-disciplinary team composed of operational pilots, human-factors engineers, electronics engineers, computer scientists, and experimental psychologists to prototype and evaluate the first multi-sensory, virtually-augmented cockpit. The cockpit employed LCD-based head-down displays, a helmet-mounted display, three-dimensionally localized audio displays, and a haptic display. This paper will endeavor to describe the FITE facility architecture, some of the characteristics of the FITE virtual display and control devices, and the potential application of EEG and other physiological measures within the FITE facility.
NASA Astrophysics Data System (ADS)
Simonetto, E.; Froment, C.; Labergerie, E.; Ferré, G.; Séchet, B.; Chédorge, H.; Cali, J.; Polidori, L.
2013-07-01
Terrestrial Laser Scanning (TLS), 3-D modeling and Web visualization are the three key steps needed to store cultural heritage and grant free, wide access to it, as highlighted in many recent examples. The goal of this study is to set up 3-D Web resources for "virtually" visiting the exterior of the Abbaye de l'Epau, an old French abbey which has both a rich history and delicate architecture. The virtuality is considered in two ways: flowing navigation around the abbey in a virtual reality environment, and a game activity using augmented reality. First of all, the data acquisition consists of a GPS and tacheometry survey, terrestrial laser scanning and photograph acquisition. After data pre-processing, the meshed and textured 3-D model is generated using the 3-D Reshaper commercial software. The virtual reality visit and the augmented reality animation are then created using the Unity software. This work shows the interest of such tools in bringing out the regional cultural heritage and making it attractive to the public.
NASA Astrophysics Data System (ADS)
Rankin, Adam; Moore, John; Bainbridge, Daniel; Peters, Terry
2016-03-01
In the past ten years, numerous new surgical and interventional techniques have been developed for treating heart valve disease without the need for cardiopulmonary bypass. Heart valve repair is now being performed in a blood-filled environment, reinforcing the need for accurate and intuitive imaging techniques. Previous work has demonstrated how augmenting ultrasound with virtual representations of specific anatomical landmarks can greatly simplify interventional navigation challenges and increase patient safety. However, these techniques often complicate interventions by requiring additional steps to manually define and initialize the virtual models. Furthermore, overlaying virtual elements onto real-time image data can obstruct the view of salient image information. To address these limitations, a system was developed that uses real-time volumetric ultrasound alongside magnetically tracked tools presented in an augmented virtuality environment to provide a streamlined navigation guidance platform. In phantom studies simulating a beating-heart navigation task, procedure duration and tool path metrics achieved performance comparable to previous work on augmented virtuality techniques, and considerable improvement over standard-of-care ultrasound guidance.
NASA Astrophysics Data System (ADS)
Starodubtsev, Illya
2017-09-01
The paper describes the implementation of a system for gesture-based interaction with virtual objects. It discusses the common problems of interaction with virtual objects and the specific interface requirements of virtual and augmented reality.
Markerless client-server augmented reality system with natural features
NASA Astrophysics Data System (ADS)
Ning, Shuangning; Sang, Xinzhu; Chen, Duo
2017-10-01
A markerless client-server augmented reality system is presented. In this research, the more extensive and mature virtual reality head-mounted display is adopted to assist the implementation of augmented reality. The viewer is presented an image in front of their eyes with the head-mounted display. The front-facing camera is used to capture video signals into the workstation. The generated virtual scene is merged with the outside-world information received from the camera, and the integrated video is sent to the helmet display system. The distinguishing feature and novelty is the realization of augmented reality with natural features instead of a marker, which addresses the limitations of markers: they are restricted to black and white, are inapplicable under varied environmental conditions, and in particular fail when the marker is partially blocked. Further, 3D stereoscopic perception of the virtual animation model is achieved. A high-speed and stable native socket communication method is adopted for transmission of the key video-stream data, which reduces the calculation burden of the system.
Augmented kinematic feedback from haptic virtual reality for dental skill acquisition.
Suebnukarn, Siriwan; Haddawy, Peter; Rhienmora, Phattanapon; Jittimanee, Pannapa; Viratket, Piyanuch
2010-12-01
We have developed a haptic virtual reality system for dental skill training. In this study we examined whether several kinds of kinematic information about the movement, provided by the system, can supplement knowledge of results (KR) in dental skill acquisition. The kinematic variables examined involved force utilization (F) and mirror view (M). This created three experimental conditions that received augmented kinematic feedback (F, M, FM) and one control condition that did not (KR-only). Thirty-two dental students were randomly assigned to the four groups. Their task was to perform access opening on the upper first molar with the haptic virtual reality system. An acquisition session consisted of two days of ten trials of practice, in which augmented kinematic feedback was provided for the appropriate experimental conditions after each trial. One week later, a retention test consisting of two trials without augmented feedback was completed. The results showed that the augmented kinematic feedback groups had larger mean performance scores than the KR-only group on Day 1 of the acquisition session and in the retention session (ANOVA, p<0.05). The apparent differences among feedback groups were not significant on Day 2 of the acquisition session (ANOVA, p>0.05). The trends in the acquisition and retention sessions suggest that augmented kinematic feedback can enhance performance early in skill acquisition and at retention.
Corrêa, Ana Grasielle Dionísio; de Assis, Gilda Aparecida; do Nascimento, Marilena; de Deus Lopes, Roseli
2017-04-01
Augmented Reality musical software (GenVirtual) is a technology, which primarily allows users to develop music activities for rehabilitation. This study aimed to analyse the perceptions of health care professionals regarding the clinical utility of GenVirtual. A second objective was to identify improvements to GenVirtual software and similar technologies. Music therapists, occupational therapists, physiotherapists and speech and language therapist who assist people with physical and cognitive disabilities were enrolled in three focus groups. The quantitative and qualitative data were collected through inductive thematic analysis. Three main themes were identified: the use of GenVirtual in health care areas; opportunities for realistic application of GenVirtual; and limitations in the use of GenVirtual. The registration units identified were: motor stimulation, cognitive stimulation, verbal learning, recreation activity, musicality, accessibility, motivation, sonic accuracy, interference of lighting, poor sound, children and adults. This research suggested that the GenVirtual is a complementary tool to conventional clinical practice and has great potential to motor and cognitive rehabilitation of children and adults. Implications for Rehabilitation Gaining health professional' perceptions of the Augmented Reality musical game (GenVirtual) give valuable information as to the clinical utility of the software. GenVirtual was perceived as a tool that could be used as enhancing the motor and cognitive rehabilitation process. GenVirtual was viewed as a tool that could enhance clinical practice and communication among various agencies, but it was suggested that it should be used with caution to avoid confusion and replacement of important services.
The virtual mirror: a new interaction paradigm for augmented reality environments.
Bichlmeier, Christoph; Heining, Sandro Michael; Feuerstein, Marco; Navab, Nassir
2009-09-01
Medical augmented reality (AR) has been widely discussed within the medical imaging as well as computer-aided surgery communities. Different systems for exemplary medical applications have been proposed, and some of them have produced promising results. One major issue still hindering AR technology from being regularly used in medical applications is the interaction between the physician and the superimposed 3-D virtual data. Classical interaction paradigms, for instance with keyboard and mouse, are not adequate for interacting with visualized medical 3-D imaging data in an AR environment. This paper introduces the concept of a tangible/controllable Virtual Mirror for medical AR applications. This concept intuitively augments the surgeon's direct view with all desired views on volumetric medical imaging data registered with the operation site, without moving around the operating table or displacing the patient. We selected two medical procedures to demonstrate and evaluate the potential of the Virtual Mirror for the surgical workflow. Results confirm the intuitiveness of this new paradigm and its perceptive advantages for AR-based computer-aided interventions.
Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory
ERIC Educational Resources Information Center
Andujar, J. M.; Mejias, A.; Marquez, M. A.
2011-01-01
Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and are unsuited to completely nonclassroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…
Baus, Oliver; Bouchard, Stéphane
2014-01-01
This paper reviews the move from virtual reality exposure-based therapy to augmented reality exposure-based therapy (ARET). Unlike virtual reality (VR), which entails a complete virtual environment (VE), augmented reality (AR) limits itself to producing certain virtual elements and merging them into the view of the physical world. Although the general public may only have become aware of AR in the last few years, AR-type applications have been around since the beginning of the twentieth century. Since then, technological developments have enabled an ever-increasing level of seamless integration of virtual and physical elements into one view. Like VR, AR allows exposure to stimuli which, for various reasons, may not be suitable for real-life scenarios. As such, AR has proven itself to be a medium through which individuals suffering from specific phobia can be exposed “safely” to the object(s) of their fear, without the costs associated with programming complete VEs. Thus, ARET can offer an efficacious alternative to some less advantageous exposure-based therapies. Above and beyond presenting what has been accomplished in ARET, this paper covers some less well-known aspects of the history of AR, raises some ARET-related issues, and proposes potential avenues to be followed. These include the type of measures to be used to qualify the user’s experience in an augmented reality environment, the exclusion of certain AR-type functionalities from the definition of AR, and the potential use of ARET to treat non-small-animal phobias, such as social phobia. PMID:24624073
Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach
Tian, Yuan; Guan, Tao; Wang, Cheng
2010-01-01
To produce a realistic augmentation in Augmented Reality, the correct relative positions of real objects and virtual objects are very important. In this paper, we propose a novel real-time occlusion handling method based on an object tracking approach. Our method is divided into three steps: selection of the occluding object, object tracking and occlusion handling. The user selects the occluding object using an interactive segmentation method. The contour of the selected object is then tracked in the subsequent frames in real-time. In the occlusion handling step, all the pixels on the tracked object are redrawn on the unprocessed augmented image to produce a new synthesized image in which the relative position between the real and virtual object is correct. The proposed method has several advantages. First, it is robust and stable, since it remains effective when the camera is moved through large changes of viewing angles and volumes or when the object and the background have similar colors. Second, it is fast, since the real object can be tracked in real-time. Last, a smoothing technique provides seamless merging between the augmented and virtual object. Several experiments are provided to validate the performance of the proposed method. PMID:22319278
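The final occlusion-handling step above, redrawing the tracked object's pixels over the augmented image, can be sketched in a few lines. The array-based formulation below (RGB frames plus a boolean object mask) is an illustrative assumption about the data layout, not the authors' implementation.

```python
import numpy as np

def composite_with_occlusion(camera_frame, augmented_frame, occluder_mask):
    """Redraw the pixels of the tracked real object on top of the
    augmented image, so the real object correctly occludes any virtual
    object rendered behind it.

    camera_frame, augmented_frame: HxWx3 uint8 images.
    occluder_mask: HxW boolean array, True where the tracked object is.
    """
    result = augmented_frame.copy()
    # Wherever the mask is True, take the real camera pixel instead of
    # the (possibly virtual-object-covered) augmented pixel.
    result[occluder_mask] = camera_frame[occluder_mask]
    return result
```

Given a per-frame contour from the tracker, the mask would be rasterized once per frame; the composite itself is a single vectorized copy, which is consistent with the real-time claim.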
Improved analytic extreme-mass-ratio inspiral model for scoping out eLISA data analysis
NASA Astrophysics Data System (ADS)
Chua, Alvin J. K.; Gair, Jonathan R.
2015-12-01
The space-based gravitational-wave detector eLISA has been selected as the ESA L3 mission, and the mission design will be finalized by the end of this decade. To prepare for mission formulation over the next few years, several outstanding and urgent questions in data analysis will be addressed using mock data challenges, informed by instrument measurements from the LISA Pathfinder satellite launching at the end of 2015. These data challenges will require accurate and computationally affordable waveform models for anticipated sources such as the extreme-mass-ratio inspirals (EMRIs) of stellar-mass compact objects into massive black holes. Previous data challenges have made use of the well-known analytic EMRI waveforms of Barack and Cutler, which are extremely quick to generate but dephase relative to more accurate waveforms within hours, due to their mismatched radial, polar and azimuthal frequencies. In this paper, we describe an augmented Barack-Cutler model that uses a frequency map to the correct Kerr frequencies, along with updated evolution equations and a simple fit to a more accurate model. The augmented waveforms stay in phase for months and may be generated with virtually no additional computational cost.
Mobile devices, Virtual Reality, Augmented Reality, and Digital Geoscience Education.
NASA Astrophysics Data System (ADS)
Crompton, H.; De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.
2016-12-01
Mobile devices are playing an increasing role in geoscience education. Affordances include instructor-student communication and class management in large classrooms, virtual and augmented reality applications, digital mapping, and crowd-sourcing. Mobile technologies have spawned the subfield of mobile learning, or m-learning, defined as learning across multiple contexts through social and content interactions. Geoscientists have traditionally engaged in non-digital mobile learning via fieldwork, but digital devices are greatly extending the possibilities, especially for non-traditional students. Smartphones and tablets are the most common devices, but smart glasses such as Pivothead enable live streaming of a first-person view (see for example, https://youtu.be/gWrDaYP5w58). Virtual reality headsets such as Google Cardboard create an immersive virtual field experience, and digital imagery such as GigaPan and Structure from Motion enables instructors and/or students to create virtual specimens and outcrops that are sharable across the globe. Whereas virtual reality (VR) replaces the real world with a virtual representation, augmented reality (AR) overlays digital data on the live scene visible to the user in real time. We have previously reported on our use of the AR application called FreshAiR for geoscientific "egg hunts." The popularity of Pokémon Go demonstrates the potential of AR for mobile learning in the geosciences.
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of 3-D virtual reality through interactive programs. Based on the Virtual Reality Modeling Language (VRML), building a virtual museum and achieving effective interaction with the offline museum lie in making full use of 3-D panorama, virtual reality and augmented reality techniques, and in innovatively applying dynamic environment modeling, real-time 3-D graphics generation, system integration and other key virtual reality techniques to ensure the overall design of the virtual museum. 3-D panorama, also known as panoramic photography or virtual reality, is a technique based on static images of reality. Virtual reality is a computer simulation system that can create an interactive 3-D dynamic visual world for the user to experience. Augmented reality, also known as mixed reality, simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience directly in reality. These techniques make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
Kiryu, Tohru; So, Richard H Y
2007-09-25
Around three years ago, in the special issue on augmented and virtual reality in rehabilitation, the topic of simulator sickness was briefly discussed in relation to vestibular rehabilitation. Simulator sickness with virtual reality applications has also been referred to as visually induced motion sickness or cybersickness. Recently, studies on cybersickness have been reported in entertainment, training, gaming, and medical environments in several journals. Virtual stimuli can heighten the sensation of presence, but they sometimes also evoke unpleasant sensations. In order to safely apply augmented and virtual reality in long-term rehabilitation treatment, the sensation of presence and cybersickness should be appropriately controlled. This issue presents the results of five studies conducted to evaluate visually induced effects and to speculate on the influences of virtual rehabilitation. In particular, the influence of visual and vestibular stimuli on cardiovascular responses is reported.
Hung, Andrew J; Shah, Swar H; Dalag, Leonard; Shin, Daniel; Gill, Inderbir S
2015-08-01
We developed a novel procedure-specific simulation platform for robotic partial nephrectomy. In this study we prospectively evaluate its face, content, construct and concurrent validity. This hybrid platform features augmented reality and virtual reality. Augmented reality involves 3-dimensional robotic partial nephrectomy surgical videos overlaid with virtual instruments to teach surgical anatomy, technical skills and operative steps. Advanced technical skills are assessed with an embedded full virtual reality renorrhaphy task. Participants were classified as novice (no surgical training, n=15), intermediate (less than 100 robotic cases, n=13) or expert (100 or more robotic cases, n=14) and prospectively assessed. Cohort performance was compared with the Kruskal-Wallis test (construct validity). A post-study questionnaire was used to assess the realism of the simulation (face validity) and its usefulness for training (content validity). Concurrent validity was evaluated as the correlation between the virtual reality renorrhaphy task and live porcine robotic partial nephrectomy performance (Spearman's analysis). Experts rated the augmented reality content as realistic (median 8/10) and helpful for resident/fellow training (8.0-8.2/10). Experts rated the platform highly for teaching anatomy (9/10) and operative steps (8.5/10) but moderately for technical skills (7.5/10). Experts and intermediates outperformed novices (construct validity) in efficiency (p=0.0002) and accuracy (p=0.002). For virtual reality renorrhaphy, experts outperformed intermediates on GEARS metrics (p=0.002). Virtual reality renorrhaphy and in vivo porcine robotic partial nephrectomy performance correlated significantly (r=0.8, p<0.0001) (concurrent validity). This augmented reality simulation platform displayed face, content and construct validity. Performance in the procedure-specific virtual reality task correlated highly with a porcine model (concurrent validity).
Future efforts will integrate procedure-specific virtual reality tasks and their global assessment. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
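The two validity analyses described above, a Kruskal-Wallis test across the novice/intermediate/expert cohorts (construct validity) and a Spearman rank correlation between simulator and live performance (concurrent validity), can be sketched with scipy. The scores below are hypothetical placeholders, not the study's data.

```python
# Sketch of the study's statistical approach with made-up scores.
from scipy.stats import kruskal, spearmanr

# Hypothetical efficiency scores for the three cohorts.
novice       = [42, 48, 45, 50, 44, 47]
intermediate = [61, 66, 63, 68, 64, 62]
expert       = [70, 74, 72, 76, 71, 75]

# Construct validity: do the three groups differ? (Kruskal-Wallis H-test)
h_stat, p_groups = kruskal(novice, intermediate, expert)
print(f"Kruskal-Wallis H={h_stat:.2f}, p={p_groups:.4f}")

# Concurrent validity: does simulator performance track live performance?
vr_scores   = [55, 60, 64, 70, 73, 78]   # VR renorrhaphy scores
live_scores = [50, 58, 66, 69, 75, 80]   # live porcine task scores
rho, p_corr = spearmanr(vr_scores, live_scores)
print(f"Spearman rho={rho:.2f}, p={p_corr:.4f}")
```

Both tests are non-parametric (rank-based), which suits small cohorts and ordinal performance metrics such as GEARS ratings.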
NASA Astrophysics Data System (ADS)
Barrile, V.; Bilotta, G.; Meduri, G. M.; De Carlo, D.; Nunnari, A.
2017-11-01
In this study we explored the potential of technologies such as laser scanning and ground-penetrating radar (GPR) for cultural heritage. For the processing stage, we compared the results obtained with various commercial software packages against algorithms developed and implemented in Matlab. Moreover, Virtual Reality and Augmented Reality allow the real world to be integrated with historical-artistic information, laser scanner and georadar (GPR) data, and virtual objects, virtually enriching it with multimedia elements and graphic and textual information accessible through smartphones and tablets.
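The abstract does not specify which GPR algorithms were implemented, but a common step in this kind of custom-versus-commercial comparison is background (mean-trace) removal, which suppresses the horizontal banding caused by system ringing. A sketch on synthetic data (not the study's algorithms):

```python
# Mean-trace background removal, a standard GPR processing step,
# demonstrated on a synthetic radargram (samples x traces).
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_traces = 256, 100

radargram = np.zeros((n_samples, n_traces))
radargram[50, :] += 5.0               # ringing band present in every trace
radargram[120, 40:60] += 3.0          # localized buried-target reflection
radargram += 0.1 * rng.standard_normal((n_samples, n_traces))

# Subtract the average trace from every trace: flat bands vanish,
# localized reflections survive (slightly attenuated).
background = radargram.mean(axis=1, keepdims=True)
cleaned = radargram - background

print(f"band mean before: {radargram[50].mean():.2f}, "
      f"after: {cleaned[50].mean():.2f}")
```

The same few lines translate directly to Matlab (`mean(radargram, 2)` and implicit expansion), which is one reason Matlab is a popular prototyping environment for GPR work.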
The NASA Augmented/Virtual Reality Lab: The State of the Art at KSC
NASA Technical Reports Server (NTRS)
Little, William
2017-01-01
The NASA Augmented Virtual Reality (AVR) Lab at Kennedy Space Center is dedicated to the investigation of Augmented Reality (AR) and Virtual Reality (VR) technologies, with the goal of determining potential uses of these technologies as human-computer interaction (HCI) devices in an aerospace engineering context. Begun in 2012, the AVR Lab has concentrated on commercially available AR and VR devices that are gaining in popularity and use in a number of fields such as gaming, training, and telepresence. We are working with such devices as the Microsoft Kinect, the Oculus Rift, the Leap Motion, the HTC Vive, motion capture systems, and the Microsoft Hololens. The focus of our work has been on human interaction with the virtual environment, which in turn acts as a communications bridge to remote physical devices and environments which the operator cannot or should not control or experience directly. Particularly in reference to dealing with spacecraft and the oftentimes hazardous environments they inhabit, it is our hope that AR and VR technologies can be utilized to increase human safety and mission success by physically removing humans from those hazardous environments while virtually putting them right in the middle of those environments.
NASA Astrophysics Data System (ADS)
Ribeiro, Allan; Santos, Helen
With the advent of new information and communication technologies (ICTs), communicative interaction is changing the way people live and act, and at the same time changing work activities related to education. Among the possibilities provided by the advancement of computational resources, virtual reality (VR) and augmented reality (AR) stand out as new forms of information visualization in computer applications. While VR allows user interaction with a virtual environment generated entirely by computer, in AR the virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as systems at the nanoscale and of low dimensionality, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications developed with the use of modeling and simulation in computational approaches to topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.
Custom Titanium Ridge Augmentation Matrix (CTRAM): A Case Report.
Connors, Christopher A; Liacouras, Peter C; Grant, Gerald T
2016-01-01
This is a case report of a custom titanium ridge augmentation matrix (CTRAM). Using cone beam computed tomography (CBCT), a custom titanium space-maintaining device was developed. Alveolar ridges were virtually augmented, a matrix was virtually designed, and the CTRAM was additively manufactured with titanium (Ti6Al4V). Two cases are presented that resulted in sufficient increased horizontal bone volume with successful dental implant placement. The CTRAM design allows for preoperative planning for increasing alveolar ridge dimensions to support dental implants, reduces surgical time, and prevents the need for a second surgical site to gain sufficient alveolar ridge bone volume for dental implant therapy.
Three-Dimensional Sensor Common Operating Picture (3-D Sensor COP)
2017-01-01
created. Additionally, a 3-D model of the sensor itself can be created. Using these 3-D models, along with emerging virtual and augmented reality tools...
An Analysis of Engagement in a Combination Indoor/Outdoor Augmented Reality Educational Game
ERIC Educational Resources Information Center
Folkestad, James; O'Shea, Patrick
2011-01-01
This paper describes the results of a qualitative analysis of video captured during a dual indoor/outdoor Augmented Reality experience. Augmented Reality is the layering of virtual information on top of the physical world. This Augmented Reality experience asked students to interact with the San Diego Museum of Art and the Botanical Gardens in San…
ARSC: Augmented Reality Student Card--An Augmented Reality Solution for the Education Field
ERIC Educational Resources Information Center
El Sayed, Neven A. M.; Zayed, Hala H.; Sharawy, Mohamed I.
2011-01-01
Augmented Reality (AR) is the technology of adding virtual objects to real scenes, enabling the addition of missing information in real life. As the lack of resources is a problem that can be solved through AR, this paper presents and explains the usage of AR technology; we introduce the Augmented Reality Student Card (ARSC) as an application of…
Cranial implant design using augmented reality immersive system.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2007-01-01
Software tools that utilize haptics for sculpting precise-fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient-specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer-centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, a fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time-consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.
Virtual reality, augmented reality…I call it i-Reality.
Grossmann, Rafael J
2015-01-01
The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients.
Augmentation of Cognition and Perception Through Advanced Synthetic Vision Technology
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.; Arthur, Jarvis J.; Williams, Steve P.; McNabb, Jennifer
2005-01-01
Synthetic Vision System technology augments reality and creates a virtual visual meteorological condition that extends a pilot's cognitive and perceptual capabilities during flight operations when outside visibility is restricted. The paper describes the NASA Synthetic Vision System for commercial aviation with an emphasis on how the technology achieves Augmented Cognition objectives.
Teaching professionalism through virtual means.
McEvoy, Michelle; Butler, Bryan; MacCarrick, Geraldine
2012-02-01
Virtual patients are used across a variety of clinical disciplines for both teaching and assessment, but are they an appropriate environment in which to develop professional skills? This study aimed to evaluate students' perceived effectiveness of an online interactive virtual patient developed to augment a personal professional development curriculum, and to identify factors that would maximise the associated educational benefits. Student focus group discussions were conducted to explore students' views on the usefulness and acceptability of the virtual patient as an educational tool to teach professionalism, and to identify factors for improvement. A thematic content analysis was used to capture content and synthesise the range of opinions expressed. Overall there was a positive response to the virtual patient. The students recognised the need to teach and assess professionalism throughout their curriculum, and viewed the virtual patient as a potentially engaging and valuable addition to their curriculum. We identified factors for improvement to guide the development of future virtual patients. It is possible to improve approaches to teaching and learning professionalism by exploring students' views on innovative teaching developments designed to augment personal professional development curricula. © Blackwell Publishing Ltd 2012.
Virtual Technologies Trends in Education
ERIC Educational Resources Information Center
Martín-Gutiérrez, Jorge; Mora, Carlos Efrén; Añorbe-Díaz, Beatriz; González-Marrero, Antonio
2017-01-01
Virtual reality captures people's attention. This technology has been applied in many sectors such as medicine, industry, education, video games, or tourism. Perhaps its biggest area of interest has been leisure and entertainment. Regardless the sector, the introduction of virtual or augmented reality had several constraints: it was expensive, it…
Immersive Education, an Annotated Webliography
ERIC Educational Resources Information Center
Pricer, Wayne F.
2011-01-01
In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…
Streepey, Jefferson W; Kenyon, Robert V; Keshner, Emily A
2007-01-01
We previously reported responses to induced postural instability in young healthy individuals viewing visual motion with a narrow (25 degrees in both directions) and wide (90 degrees and 55 degrees in the horizontal and vertical directions) field of view (FOV) as they stood on different sized blocks. Visual motion was achieved using an immersive virtual environment that moved realistically with head motion (natural motion) and translated sinusoidally at 0.1 Hz in the fore-aft direction (augmented motion). We observed that a subset of the subjects (steppers) could not maintain continuous stance on the smallest block when the virtual environment was in motion. We completed a posteriori analyses on the postural responses of the steppers and non-steppers that may inform us about the mechanisms underlying these differences in stability. We found that when viewing augmented motion with a wide FOV, there was a greater effect on the head and whole body center of mass and ankle angle root mean square (RMS) values of the steppers than of the non-steppers. FFT analyses revealed greater power at the frequency of the visual stimulus in the steppers compared to the non-steppers. Whole body COM time lags relative to the augmented visual scene revealed that the time-delay between the scene and the COM was significantly increased in the steppers. The increased responsiveness to visual information suggests a greater visual field-dependency of the steppers and suggests that the thresholds for shifting from a reliance on visual information to somatosensory information can differ even within a healthy population.
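The sway measures reported above, root-mean-square (RMS) displacement and FFT power at the 0.1 Hz visual-stimulus frequency, can be illustrated on a synthetic center-of-mass signal. The sampling rate and amplitudes below are assumptions for illustration, not the study's parameters.

```python
# RMS sway and spectral power at the stimulus frequency, on synthetic data.
import numpy as np

fs = 20.0                      # assumed sampling rate, Hz
t = np.arange(0, 100, 1 / fs)  # a 100 s trial
stim_freq = 0.1                # fore-aft scene oscillation, Hz

# Synthetic COM displacement: entrained response at the stimulus
# frequency plus broadband noise.
rng = np.random.default_rng(0)
com = 1.5 * np.sin(2 * np.pi * stim_freq * t) + 0.2 * rng.standard_normal(t.size)

rms = np.sqrt(np.mean(com ** 2))          # root-mean-square sway

spectrum = np.abs(np.fft.rfft(com)) ** 2  # power spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power_at_stim = spectrum[np.argmin(np.abs(freqs - stim_freq))]

print(f"RMS sway: {rms:.2f}, power at {stim_freq} Hz: {power_at_stim:.0f}")
```

A "stepper" in the study's terms would show a larger peak at the stimulus frequency relative to the rest of the spectrum, i.e. stronger entrainment to the visual scene.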
Utilizing virtual and augmented reality for educational and clinical enhancements in neurosurgery.
Pelargos, Panayiotis E; Nagasawa, Daniel T; Lagman, Carlito; Tenn, Stephen; Demos, Joanna V; Lee, Seung J; Bui, Timothy T; Barnette, Natalie E; Bhatt, Nikhilesh S; Ung, Nolan; Bari, Ausaf; Martin, Neil A; Yang, Isaac
2017-01-01
Neurosurgery has undergone a technological revolution over the past several decades, from trephination to image-guided navigation. Advancements in virtual reality (VR) and augmented reality (AR) represent some of the newest modalities being integrated into neurosurgical practice and resident education. In this review, we present a historical perspective of the development of VR and AR technologies, analyze its current uses, and discuss its emerging applications in the field of neurosurgery. Copyright © 2016 Elsevier Ltd. All rights reserved.
Augmented reality for breast imaging.
Rancati, Alberto; Angrigiani, Claudio; Nava, Maurizio B; Catanuto, Giuseppe; Rocco, Nicola; Ventrice, Fernando; Dorr, Julio
2018-06-01
Augmented reality (AR) enables the superimposition of virtual reality reconstructions onto clinical images of a real patient, in real time. This allows visualization of internal structures through overlying tissues, thereby providing a virtually transparent view of surgical anatomy. AR has been applied to neurosurgery, which utilizes a relatively fixed space, frames, and bony references; these facilitate the relationship between virtual and real data. Augmented breast imaging (ABI) is described. Breast MRI studies of breast implant patients with seroma were performed using a Siemens 3T system with a body coil and a four-channel bilateral phased-array breast coil as the transmitter and receiver, respectively. Gadolinium was injected as a contrast agent (0.1 mmol/kg at 2 mL/s) using a programmable power injector. DICOM-formatted image data from 10 MRI cases of breast implant seroma and 10 MRI cases with T1-2 N0 M0 breast cancer were imported and transformed into augmented reality images. ABI demonstrated stereoscopic depth perception, focal point convergence, 3D cursor use, and joystick fly-through. ABI can improve clinical outcomes by providing an enhanced view of the structures to work on. It should be further studied to determine its utility in clinical practice.
See-through 3D technology for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Lee, Seungjae; Li, Gang; Jang, Changwon; Hong, Jong-Young
2017-06-01
Augmented reality is attracting a lot of attention as one of the most spotlighted next-generation technologies. To move toward the realization of ideal augmented reality, we need to integrate 3D virtual information into the real world. This integration should not be noticeable to users, blurring the boundary between the virtual and real worlds. Thus, the ultimate device for augmented reality should reconstruct and superimpose 3D virtual information on the real world so that the two are not distinguishable, which is referred to as see-through 3D technology. Here, we introduce our previous research on combining see-through displays and 3D technologies using emerging optical combiners: holographic optical elements and index-matched optical elements. Holographic optical elements are volume gratings that have angular and wavelength selectivity. Index-matched optical elements are partially reflective elements that use a compensation element for index matching. Using these optical combiners, we implemented see-through 3D displays based on typical methodologies including integral imaging, digital holographic displays, multi-layer displays, and retinal projection. Some of these methods are expected to be optimized and customized for head-mounted or wearable displays. We conclude with demonstrations and analyses of fundamental research on head-mounted see-through 3D displays.
The Evolving Virtual Library: Visions and Case Studies.
ERIC Educational Resources Information Center
Saunders, Laverna M., Ed.
This book addresses many of the practical issues involved in developing the virtual library. Seven presentations from the Eighth Annual Computers in Libraries Conference are included in this book in augmented form. The papers are supplemented by "The Evolving Virtual Library: An Overview" (Laverna M. Saunders and Maurice Mitchell), a…
Human responses to augmented virtual scaffolding models.
Hsiao, Hongwei; Simeonov, Peter; Dotson, Brian; Ammons, Douglas; Kau, Tsui-Ying; Chiou, Sharon
2005-08-15
This study investigated the effect of adding real planks, in virtual scaffolding models of elevation, on human performance in a surround-screen virtual reality (SSVR) system. Twenty-four construction workers and 24 inexperienced controls performed walking tasks on real and virtual planks at three virtual heights (0, 6 m, 12 m) and two scaffolding-platform-width conditions (30, 60 cm). Gait patterns, walking instability measurements and cardiovascular reactivity were assessed. The results showed differences in human responses to real vs. virtual planks in walking patterns, instability score and heart-rate inter-beat intervals; it appeared that adding real planks to the SSVR virtual scaffolding model enhanced the quality of SSVR as a human-environment interface research tool. In addition, there were significant differences in performance between construction workers and the control group: the inexperienced participants were more unstable than the construction workers. Both groups increased their stride length with repetitions of the task, indicating a possibly confidence- or habit-related learning effect. The practical implications of this study are in the adoption of augmented virtual models of elevated construction environments for injury prevention research, and the development of programmes for balance-control training to reduce the risk of falls at elevation before workers enter a construction job.
Augmented Reality for Close Quarters Combat
None
2018-01-16
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.
Augmenting Your Own Reality: Student Authoring of Science-Based Augmented Reality Games
ERIC Educational Resources Information Center
Klopfer, Eric; Sheldon, Josh
2010-01-01
Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent…
Learning Anatomy via Mobile Augmented Reality: Effects on Achievement and Cognitive Load
ERIC Educational Resources Information Center
Küçük, Sevda; Kapakin, Samet; Göktas, Yüksel
2016-01-01
Augmented reality (AR), a new generation of technology, has attracted the attention of educators in recent years. In this study, a MagicBook was developed for a neuroanatomy topic by using mobile augmented reality (mAR) technology. This technology integrates virtual learning objects into the real world and allows users to interact with the…
On Location Learning: Authentic Applied Science with Networked Augmented Realities
NASA Astrophysics Data System (ADS)
Rosenbaum, Eric; Klopfer, Eric; Perry, Judy
2007-02-01
The learning of science can be made more like the practice of science through authentic simulated experiences. We have created a networked handheld Augmented Reality environment that combines the authentic role-playing of Augmented Realities and the underlying models of Participatory Simulations. This game, known as Outbreak @ The Institute, is played across a university campus where players take on the roles of doctors, medical technicians, and public health experts to contain a disease outbreak. Players can interact with virtual characters and employ virtual diagnostic tests and medicines. They are challenged to identify the source and prevent the spread of an infectious disease that can spread among real and/or virtual characters according to an underlying model. In this paper, we report on data from three high school classes who played the game. We investigate students' perception of the authenticity of the game in terms of their personal embodiment in the game, their experience playing different roles, and their understanding of the dynamic model underlying the game.
Plessas, Anastasios
2017-10-01
In preclinical dental education, the acquisition of clinical and technical skills, and the transfer of these skills to the clinic, are paramount. Phantom heads provide an efficient way to teach preclinical students dental procedures safely while considerably increasing their dexterity skills. Modern computerized phantom head training units incorporate features of virtual reality technology and the ability to offer concurrent augmented feedback. The aims of this review were to examine and evaluate the dental literature for evidence supporting their use and to discuss the role of augmented feedback versus the facilitator's instruction. Adjunctive training in these units seems to enhance students' learning and skill acquisition and reduce the required faculty supervision time. However, the virtual augmented feedback cannot be used as the sole method of feedback, and the facilitator's input remains critical. Well-powered longitudinal randomized trials exploring the impact of these units on students' clinical performance and issues of cost-effectiveness are warranted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez Anez, Francisco
This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training in maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. Its objective is to create a computer tool for elaborating maintenance training courses and delivering training based on 3D virtual reality models of complex components. The training delivery includes 3D recorded displays of maintenance procedures with all the complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention while trying to follow the procedure, and can be evaluated on the level of knowledge achieved. Instructors can check the evaluation records left during the training sessions. VIRMAN is simple software supported by a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the framework of 'Information Society Technologies'. A consortium of five companies and one research institute shares their expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area in Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene, generated by a computer, augmenting reality with additional information. The user interface consists of see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers.
The user has his hands free for performing the maintenance intervention and can navigate in the virtual world thanks to a voice recognition system and a virtual pointing device. The maintenance work is guided with audio instructions, and 2D and 3D information is displayed directly in the user's goggles. A position-tracking system allows 3D virtual models to be displayed in the positions of their real counterparts independently of the user's location. The user can create his own virtual environment, placing the required information wherever he wants. The STARMATE system is applicable to a large variety of real work situations. (author)
Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos
2014-06-05
Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling; specifically recruitment of both extremities and exercising at a high intensity. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off the shelf electronics to monitor heart rate and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. 
The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters for several riders.
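Bench calibration of force and angle sensors, as described for the VRACK handlebars and pedals, typically amounts to fitting a linear map from raw sensor readings to physical units against a known reference. A minimal sketch with hypothetical values (illustrative only, not VRACK data):

```python
# Least-squares calibration of a force sensor: raw ADC counts -> newtons.
import numpy as np

raw_counts    = np.array([102, 215, 330, 448, 560])       # ADC readings
applied_force = np.array([0.0, 25.0, 50.0, 75.0, 100.0])  # reference loads, N

# Linear fit: force ≈ gain * counts + offset
gain, offset = np.polyfit(raw_counts, applied_force, 1)

def counts_to_newtons(counts):
    """Convert a raw ADC reading to an estimated force in newtons."""
    return gain * counts + offset

print(f"gain={gain:.4f} N/count, offset={offset:.2f} N")
print(f"350 counts -> {counts_to_newtons(350):.1f} N")
```

The fit residuals give a quick measure of sensor linearity, which is what a bench-test validation of measurement accuracy checks before trusting the sensors in rider tests.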
Creating a Vision Channel for Observing Deep-Seated Anatomy in Medical Augmented Reality
NASA Astrophysics Data System (ADS)
Wimmer, Felix; Bichlmeier, Christoph; Heining, Sandro M.; Navab, Nassir
The intent of medical Augmented Reality (AR) is to augment the surgeon's real view of the patient with the patient's interior anatomy, rendered from a suitable visualization of medical imaging data. This paper presents a fast, user-defined clipping technique for medical AR that cuts away any parts of the virtual anatomy, and any images of the real part of the AR scene, that obstruct the surgeon's view of the deep-seated region of interest. Modeled on cut-away techniques from scientific illustration and computer graphics, the method creates a fixed vision channel to the inside of the patient. It enables a clear view of the focused virtual anatomy and, moreover, improves the perception of spatial depth.
Vision-based augmented reality system
NASA Astrophysics Data System (ADS)
Chen, Jing; Wang, Yongtian; Shi, Qi; Yan, Dayuan
2003-04-01
The most promising aspect of augmented reality lies in its ability to integrate the virtual world of the computer with the real world of the user, so that users can interact directly with real-world subjects and objects. This paper presents an experimental augmented reality system with a video see-through head-mounted device that displays virtual objects as if they were lying on the table together with real objects. To overlay virtual objects on the real world at the correct position and orientation, accurate calibration and registration are essential. A vision-based method is used to estimate the CCD camera's external parameters by tracking four known points of different colors. It achieves sufficient accuracy for non-critical applications such as gaming and annotation.
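The abstract leaves the estimation details out; as background, extrinsic (external-parameter) calibration inverts the standard pinhole projection that maps known 3D points to observed pixels. A minimal sketch of that forward model, with purely illustrative intrinsics and pose:

```python
def project_point(P, R, t, fx, fy, cx, cy):
    """Project a 3D world point P through a pinhole camera with
    rotation R (3x3, row-major lists), translation t, and intrinsics
    (fx, fy, cx, cy). Returns pixel coordinates (u, v)."""
    # Camera-frame coordinates: Pc = R * P + t
    Pc = [sum(R[i][j] * P[j] for j in range(3)) + t[i] for i in range(3)]
    x, y, z = Pc
    # Perspective division and intrinsic mapping to pixels
    return (fx * x / z + cx, fy * y / z + cy)

# Identity rotation, camera 2 m from the table plane (illustrative values).
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 2.0]
u, v = project_point([0.5, 0.0, 0.0], R, t, fx=800, fy=800, cx=320, cy=240)
```

Given four such point correspondences, the rotation R and translation t that best reproduce the observed pixels constitute the camera's external parameters.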
Social Virtual Worlds for Technology-Enhanced Learning on an Augmented Learning Platform
ERIC Educational Resources Information Center
Jin, Li; Wen, Zhigang; Gough, Norman
2010-01-01
Virtual worlds have been linked with e-learning applications to create virtual learning environments (VLEs) for the past decade. However, while they can support many educational activities that extend both traditional on-campus teaching and distance learning, they are used primarily for learning content generated and managed by instructors. With…
The Potential for Scientific Collaboration in Virtual Ecosystems
ERIC Educational Resources Information Center
Magerko, Brian
2010-01-01
This article explores the potential benefits of creating "virtual ecosystems" from real-world data. These ecosystems are intended to be realistic virtual representations of environments that may be costly or difficult to access in person. They can be constructed as 3D worlds rendered from stereo video data, augmented with scientific data, and then…
What is going on in augmented reality simulation in laparoscopic surgery?
Botden, Sanne M B I; Jakimowicz, Jack J
2009-08-01
To prevent unnecessary errors and adverse results of laparoscopic surgery, proper training is of paramount importance. A safe way to train surgeons in laparoscopic skills is simulation. For this purpose, traditional box trainers are often used; however, they lack objective assessment of performance. Virtual reality laparoscopic simulators assess performance, but lack realistic haptic feedback. Augmented reality (AR) combines a virtual reality (VR) setting with real physical materials, instruments, and feedback. This article presents the current developments in augmented reality laparoscopic simulation. PubMed searches were performed to identify articles regarding surgical simulation and augmented reality. Each identified company manufacturing an AR laparoscopic simulator received the same questionnaire about the features of its simulator. Seven simulators fitting the definition of augmented reality were identified during the literature search. Five of the approached manufacturers returned a completed questionnaire; one of their simulators appeared to be VR and was therefore not applicable for this review. Several augmented reality simulators have been developed over the past few years, and they are improving rapidly. We recommend the development of AR laparoscopic simulators for component tasks of procedural training. AR simulators should be implemented in current laparoscopic training curricula, in particular for laparoscopic suturing training.
Rothbaum, Barbara Olasov; Price, Matthew; Jovanovic, Tanja; Norrholm, Seth D; Gerardi, Maryrose; Dunlop, Boadie; Davis, Michael; Bradley, Bekh; Duncan, Erica J; Rizzo, Albert; Ressler, Kerry J
2014-06-01
The authors examined the effectiveness of virtual reality exposure augmented with D-cycloserine or alprazolam, compared with placebo, in reducing posttraumatic stress disorder (PTSD) due to military trauma. After an introductory session, five sessions of virtual reality exposure were augmented with D-cycloserine (50 mg) or alprazolam (0.25 mg) in a double-blind, placebo-controlled randomized clinical trial for 156 Iraq and Afghanistan war veterans with PTSD. PTSD symptoms significantly improved from pre- to posttreatment across all conditions and were maintained at 3, 6, and 12 months. There were no overall differences in symptoms between D-cycloserine and placebo at any time. Alprazolam and placebo differed significantly on the Clinician-Administered PTSD Scale score at posttreatment and PTSD diagnosis at 3 months posttreatment; the alprazolam group showed a higher rate of PTSD (82.8%) than the placebo group (47.8%). Between-session extinction learning was a treatment-specific enhancer of outcome for the D-cycloserine group only. At posttreatment, the D-cycloserine group had the lowest cortisol reactivity and smallest startle response during virtual reality scenes. A six-session virtual reality treatment was associated with reduction in PTSD diagnoses and symptoms in Iraq and Afghanistan veterans, although there was no control condition for the virtual reality exposure. There was no advantage of D-cycloserine for PTSD symptoms in primary analyses. In secondary analyses, alprazolam impaired recovery and D-cycloserine enhanced virtual reality outcome in patients who demonstrated within-session learning. D-cycloserine augmentation reduced cortisol and startle reactivity more than did alprazolam or placebo, findings that are consistent with those in the animal literature.
Augmented reality glass-free three-dimensional display with the stereo camera
NASA Astrophysics Data System (ADS)
Pang, Bo; Sang, Xinzhu; Chen, Duo; Xing, Shujun; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu
2017-10-01
An improved method for Augmented Reality (AR) glass-free three-dimensional (3D) display, based on a stereo camera presenting parallax content from different angles through a lenticular lens array, is proposed. Compared with previous AR implementations based on a two-dimensional (2D) panel display with only one viewpoint, the proposed method realizes glass-free 3D display of virtual objects and the real scene with 32 virtual viewpoints. Viewers can thus obtain rich 3D stereo information from different viewing angles based on binocular parallax. Experimental results show that this improved stereo-camera method realizes AR glass-free 3D display, and that both the virtual objects and the real scene exhibit realistic and pronounced stereo performance.
Kamel Boulos, Maged N; Lu, Zhihan; Guerrero, Paul; Jennett, Charlene; Steed, Anthony
2017-02-20
The latest generation of virtual and mixed reality hardware has rekindled interest in virtual reality GIS (VRGIS) and augmented reality GIS (ARGIS) applications in health, and opened up new and exciting opportunities and possibilities for using these technologies in the personal and public health arenas. From smart urban planning and emergency training to Pokémon Go, this article offers a snapshot of some of the most remarkable VRGIS and ARGIS solutions for tackling public and environmental health problems, and bringing about safer and healthier living options to individuals and communities. The article also covers the main technical foundations and issues underpinning these solutions.
Application of Virtual, Augmented, and Mixed Reality to Urology.
Hamacher, Alaric; Kim, Su Jin; Cho, Sung Tae; Pardeshi, Sunil; Lee, Seung Hyun; Eun, Sung-Jong; Whangbo, Taeg Keun
2016-09-01
Recent developments in virtual, augmented, and mixed reality have introduced a considerable number of new devices into the consumer market. This momentum is also affecting the medical and health care sector. Although many of the theoretical and practical foundations of virtual reality (VR) were already researched and experienced in the 1980s, the vastly improved features of displays, sensors, interactivity, and computing power currently available in devices offer a new field of applications to the medical sector and also to urology in particular. The purpose of this review article is to review the extent to which VR technology has already influenced certain aspects of medicine, the applications that are currently in use in urology, and the future development trends that could be expected.
Magic cards: a new augmented-reality approach.
Demuynck, Olivier; Menendez, José Manuel
2013-01-01
Augmented reality (AR) commonly uses markers for detection and tracking. Such multimedia applications associate each marker with a virtual 3D model stored in the memory of the camera-equipped device running the application. Users' interactions are limited because creating new content requires knowing how to design and program 3D objects, which generally prevents them from developing their own entertainment AR applications. The Magic Cards application solves this problem by offering an easy way to create and manage an unlimited number of virtual objects encoded on special markers.
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface … [it is] a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined, and examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
NASA Astrophysics Data System (ADS)
Gonizzi Barsanti, S.; Malatesta, S. G.; Lella, F.; Fanini, B.; Sala, F.; Dodero, E.; Petacco, L.
2018-05-01
Nowadays, one of the most effective ways to disseminate culture is the creation of virtual and augmented reality scenarios that supply museum visitors with a powerful, interactive tool for learning sometimes difficult concepts in an easy, entertaining way. 3D models derived from reality-based techniques are now used to preserve, document and restore historical artefacts. These digital contents are also powerful instruments for interactively communicating their significance to non-specialists, making it easier to understand concepts that are sometimes complicated or unclear. Virtual and augmented reality are valid tools for interacting with 3D models and a fundamental help in making culture more accessible to the wider public. These technologies can help museum curators adapt the cultural offering, and the information about the artefacts, to different categories of visitors. They allow visitors to travel through space and time, and they serve a strong educative function by permitting information and concepts that could prove complicated to be explained in an easy and attractive way. The aim of this paper is to create a virtual scenario and an augmented reality app that recreate specific spaces in the Capitoline Museum in Rome as they were during Winckelmann's time, placing specific statues in their original 18th-century positions.
Virtual imaging in sports broadcasting: an overview
NASA Astrophysics Data System (ADS)
Tan, Yi
2003-04-01
Virtual imaging technology is being used to augment television broadcasts -- virtual objects are seamlessly inserted into the video stream to appear as real entities to TV audiences. Virtual advertisements, the main application of this technology, are providing opportunities to improve the commercial value of television programming while enhancing the contents and the entertainment aspect of these programs. State-of-the-art technologies, such as image recognition, motion tracking and chroma keying, are central to a virtual imaging system. This paper reviews the general framework, the key techniques, and the sports broadcasting applications of virtual imaging technology.
ERIC Educational Resources Information Center
Montoya, Mauricio Hincapié; Díaz, Christian Andrés; Moreno, Gustavo Adolfo
2017-01-01
Nowadays, the use of technology to improve teaching and learning experiences in the classroom has been promoted. One of these technologies is augmented reality, which allows overlaying layers of virtual information on real scene with the aim of increasing the perception that user has of reality. Augmented reality has proved to offer several…
Nomura, Tsutomu; Mamada, Yasuhiro; Nakamura, Yoshiharu; Matsutani, Takeshi; Hagiwara, Nobutoshi; Fujita, Isturo; Mizuguchi, Yoshiaki; Fujikura, Terumichi; Miyashita, Masao; Uchida, Eiji
2015-11-01
Definitive assessment of laparoscopic skill improvement after virtual reality simulator training is best obtained during an actual operation. However, this is impossible in medical students. Therefore, we developed an alternative assessment technique using an augmented reality simulator. Nineteen medical students completed a 6-week training program using a virtual reality simulator (LapSim). The pretest and post-test were performed using an object-positioning module and cholecystectomy on an augmented reality simulator (ProMIS). The mean performance measures before and after LapSim training were compared with a paired t-test. In the object-positioning module, the execution time of the task (P < 0.001) and the left and right instrument path lengths (P = 0.001) were significantly shorter, and the left and right instrument economy of movement (P < 0.001) significantly better, after LapSim training than before. With respect to improvement in laparoscopic cholecystectomy using a gallbladder model, the execution time to identify, clip, and cut the cystic duct and cystic artery, as well as the execution time to dissect the gallbladder away from the liver bed, were both significantly shorter after than before the LapSim training (P = 0.01). Our training curriculum using a virtual reality simulator improved the operative skills of medical students, as objectively evaluated by assessment using an augmented reality simulator instead of an actual operation. We hope that these findings help to establish an effective training program for medical students. © 2015 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and Wiley Publishing Asia Pty Ltd.
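The paired pre/post comparison reported above can be sketched without a statistics library; the trainee data below are hypothetical, not the study's:

```python
import math

def paired_t(pre, post):
    """Paired t-statistic for pre/post measurements on the same subjects.
    Returns (t, degrees_of_freedom); t > 0 means the values decreased."""
    d = [a - b for a, b in zip(pre, post)]            # per-subject differences
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance of d
    t = mean / math.sqrt(var / n)
    return t, n - 1

# Hypothetical execution times (s) for 5 trainees, before and after training.
pre  = [120.0, 150.0, 135.0, 160.0, 140.0]
post = [100.0, 120.0, 118.0, 130.0, 121.0]
t, df = paired_t(pre, post)
```

Here t comes out around 8.2 with 4 degrees of freedom, a significant pre/post decrease of the kind the study reports for execution times.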
Sun, Guo-Chen; Wang, Fei; Chen, Xiao-Lei; Yu, Xin-Guang; Ma, Xiao-Dong; Zhou, Ding-Biao; Zhu, Ru-Yuan; Xu, Bai-Nan
2016-12-01
The utility of virtual and augmented reality based on functional neuronavigation and intraoperative magnetic resonance imaging (MRI) for glioma surgery has not been previously investigated. The study population consisted of 79 glioma patients and 55 control subjects. Preoperatively, the lesion and related eloquent structures were visualized by diffusion tensor tractography and blood oxygen level-dependent functional MRI. Intraoperatively, microscope-based functional neuronavigation was used to integrate the reconstructed eloquent structure and the real head and brain, which enabled safe resection of the lesion. Intraoperative MRI was used to verify brain shift during the surgical process and provided quality control during surgery. The control group underwent surgery guided by anatomic neuronavigation. Virtual and augmented reality protocols based on functional neuronavigation and intraoperative MRI provided useful information for performing tailored and optimized surgery. Complete resection was achieved in 55 of 79 (69.6%) glioma patients and 20 of 55 (36.4%) control subjects, with average resection rates of 95.2% ± 8.5% and 84.9% ± 15.7%, respectively. Both the complete resection rate and average extent of resection differed significantly between the 2 groups (P < 0.01). Postoperatively, the rate of preservation of neural functions (motor, visual field, and language) was lower in controls than in glioma patients at 2 weeks and 3 months (P < 0.01). Combining virtual and augmented reality based on functional neuronavigation and intraoperative MRI can facilitate resection of gliomas involving eloquent areas. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Dörner, Ralf; Lok, Benjamin; Broll, Wolfgang
Backed by a large consumer market, entertainment and education applications have spurred developments in the fields of real-time rendering and interactive computer graphics. Relying on computer graphics methodologies, Virtual Reality and Augmented Reality have benefited indirectly from this; however, there is no large-scale demand for VR and AR in gaming and learning. What are the shortcomings of current VR/AR technology that prevent widespread use in these application areas? What advances in VR/AR will be necessary? And what might future “VR-enhanced” gaming and learning look like? Which role can and will Virtual Humans play? Concerning these questions, this article analyzes the current situation and provides an outlook on future developments. The focus is on social gaming and learning.
White Paper for Virtual Control Room
NASA Technical Reports Server (NTRS)
Little, William; Tully-Hanson, Benjamin
2015-01-01
The Virtual Control Room (VCR) Proof of Concept (PoC) project is the result of an award given by the Fourth Annual NASA T&I Labs Challenge Project Call. This paper will outline the work done over the award period to build and enhance the capabilities of the Augmented/Virtual Reality (AVR) Lab at NASA's Kennedy Space Center (KSC) to create the VCR.
Fisher, J Brian; Porter, Susan M
2002-01-01
This paper describes an application of a display approach which uses chromakey techniques to composite real and computer-generated images allowing a user to see his hands and medical instruments collocated with the display of virtual objects during a medical training simulation. Haptic feedback is provided through the use of a PHANTOM force feedback device in addition to tactile augmentation, which allows the user to touch virtual objects by introducing corresponding real objects in the workspace. A simplified catheter introducer insertion simulation was developed to demonstrate the capabilities of this approach.
Augmented reality and photogrammetry: A synergy to visualize physical and virtual city environments
NASA Astrophysics Data System (ADS)
Portalés, Cristina; Lerma, José Luis; Navarro, Santiago
2010-01-01
Close-range photogrammetry is based on the acquisition of imagery to make accurate measurements and, eventually, three-dimensional (3D) photo-realistic models. These models are a photogrammetric product per se. They are usually integrated into virtual reality scenarios where additional data such as sound, text or video can be introduced, leading to multimedia virtual environments. These environments allow users both to navigate and interact on different platforms such as desktop PCs, laptops and small hand-held devices (mobile phones or PDAs). In recent years, a new technology derived from virtual reality has emerged: Augmented Reality (AR), which is based on mixing real and virtual environments to boost human interaction and real-life navigation. The synergy of AR and photogrammetry opens up new possibilities in the field of 3D data visualization, navigation and interaction, far beyond traditional static navigation and interaction in front of a computer screen. In this paper we introduce a low-cost outdoor mobile AR application to integrate buildings of different urban spaces. High-accuracy 3D photo-models derived from close-range photogrammetry are integrated into real (physical) urban worlds. The augmented environment presented herein requires a video see-through head-mounted display (HMD) for visualization, while the user's movement in the real world is tracked with the help of an inertial navigation sensor. After introducing the basics of AR technology, the paper deals with real-time orientation and tracking in combined physical and virtual city environments, merging close-range photogrammetry and AR. Some complex software issues remain, however, and these are discussed in the paper.
A brief review of augmented reality science learning
NASA Astrophysics Data System (ADS)
Gopalan, Valarmathie; Bakar, Juliana Aida Abu; Zulkifli, Abdul Nasir
2017-10-01
This paper reviews several works concerning theories and models that could be applied to science motivation for upper secondary school learners (16-17 years old), in order to make the learning experience more engaging and useful. Embedding AR in science learning could bring an awe-inspiring transformation in learners' viewpoints toward the respective subject matters. Augmented Reality can present real and virtual learning experiences together, adding multiple media without replacing the real environment. This unique feature of AR has attracted wide attention from researchers seeking to implement AR in science learning. The technology offers learners rich visualization and provides an astonishing, transparent learning experience by bringing to light otherwise unseen perspectives on the learning content. This paper should interest both researchers in the field and academicians in the related disciplines. It aims to propose related theoretical guidance that could be applied to science motivation to transform learning in an effective way.
2014-01-01
Background Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirement for efficacious cycling; specifically recruitment of both extremities and exercising at a high intensity. Methods In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially-available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition processes. The complete bicycle system includes: a) handle bars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off the shelf electronics to monitor heart rate and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. Results The modular mechatronic kit for exercise bicycles was tested in bench testing and human tests. Bench tests performed on the sensorized handle bars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider’s lower extremities. 
Conclusions The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders. PMID:24902780
Lin, Wei-Shao; Harris, Bryan T; Phasuk, Kamolphob; Llop, Daniel R; Morton, Dean
2018-02-01
This clinical report describes a digital workflow using the virtual smile design approach augmented with a static 3-dimensional (3D) virtual patient with photorealistic appearance to restore maxillary central incisors by using computer-aided design and computer-aided manufacturing (CAD-CAM) monolithic lithium disilicate ceramic veneers. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Performance Of The IEEE 802.15.4 Protocol As The Marker Of Augmented Reality In Museum
NASA Astrophysics Data System (ADS)
Kurniawan Saputro, Adi; Sumpeno, Surya; Hariadi, Mochamad
2018-04-01
A museum is a place that keeps historic objects and serves as a historical education center to introduce the nation's culture. Utilizing technology in a museum, on the way to becoming a smart city, is a challenge. The Internet of Things (IoT) is a technological advance in information and communication technology (ICT) that can be applied in the museum. Current ICT development concerns not only transmission media; Augmented Reality technology is also being developed. Augmented Reality technology typically places virtual objects into the real world using markers or images. In this study, the researchers used radio signals to make virtual objects appear in the real world, with the IEEE 802.15.4 protocol replacing the Augmented Reality marker. RSSI and triangulation are used to provide a microlocation for the AR objects in place of a marker. The results show that the performance of the wireless sensor network is adequate for data transmission in the museum: line-of-sight (LOS) tests at a distance of 15 meters with a 1000 ms delay found a 1.4% error rate, and non-line-of-sight (NLOS) tests a 2.3% error rate. It can therefore be concluded that IoT technology using wireless-sensor-network signals as a replacement for the Augmented Reality marker can be used in museums.
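The abstract does not give the positioning math; a common way to realize RSSI-based microlocation is a log-distance path-loss model feeding 2D trilateration. A minimal sketch, in which the transmit power, path-loss exponent, and anchor layout are illustrative assumptions:

```python
import math

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: estimate distance (m) from RSSI (dBm).
    tx_power is the RSSI measured at 1 m; n is the path-loss exponent.
    Both values are deployment-specific assumptions."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(p1, d1, p2, d2, p3, d3):
    """2D position from three anchors and estimated distances, obtained by
    subtracting the circle equations pairwise to get a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three sensor nodes at known positions; true tag position (2, 3).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(a, (2.0, 3.0)) for a in anchors]
x, y = trilaterate(anchors[0], dists[0], anchors[1], dists[1],
                   anchors[2], dists[2])
```

With noise-free distances the solver recovers the tag position exactly; real RSSI readings would be filtered first, since the error rates quoted above reflect signal variability.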
Frames of Reference in Mobile Augmented Reality Displays
ERIC Educational Resources Information Center
Mou, Weimin; Biocca, Frank; Owen, Charles B.; Tang, Arthur; Xiao, Fan; Lim, Lynette
2004-01-01
In 3 experiments, the authors investigated spatial updating in augmented reality environments. Participants learned locations of virtual objects on the physical floor. They were turned to appropriate facing directions while blindfolded before making pointing judgments (e.g., "Imagine you are facing X. Point to Y"). Experiments manipulated the…
ERIC Educational Resources Information Center
Chen, Yu-Hsuan; Wang, Chang-Hwa
2018-01-01
Although research has indicated that augmented reality (AR)-facilitated instruction improves learning performance, further investigation of the usefulness of AR from a psychological perspective has been recommended. Researchers consider presence a major psychological effect when users are immersed in virtual reality environments. However, most…
Berryman, Donna R
2012-01-01
Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital, computer-created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries.
Virtual Reconstruction of Lost Architectures: from the Tls Survey to AR Visualization
NASA Astrophysics Data System (ADS)
Quattrini, R.; Pierdicca, R.; Frontoni, E.; Barcaglioni, R.
2016-06-01
The exploitation of high-quality 3D models for the dissemination of archaeological heritage is a currently investigated topic, although mobile augmented reality platforms for historical architecture that would allow low-cost pipelines for effective content are not yet available. The paper presents a virtual anastylosis, starting from historical sources and from a 3D model based on a TLS survey. Several efforts and outputs in augmented or immersive environments exploiting this reconstruction are discussed. The work demonstrates the feasibility of a 3D reconstruction approach for complex architectural shapes starting from point clouds, and its AR/VR exploitation, allowing superimposition with the archaeological evidence. The major contributions are the presentation and discussion of a pipeline from the virtual model to its simplification, showing several outcomes and comparing the supported data qualities and the advantages and disadvantages arising from MAR and VR limitations.
NASA Astrophysics Data System (ADS)
Soler, Luc; Marescaux, Jacques
2006-04-01
Technological innovations of the 20th century provided medicine and surgery with new tools, among which virtual reality and robotics are the most revolutionary. Our work aims at setting up new techniques for the detection, 3D delineation and 4D time follow-up of small abdominal lesions from standard medical images (CT scan, MRI). It also aims at developing innovative systems that make tumor resection or treatment easier through the use of augmented reality and robotized systems, increasing gesture precision. It also permits a real-time, long-distance connection between practitioners so they can share the same 3D reconstructed patient and interact on the same patient, virtually before the intervention and in reality during the surgical procedure thanks to a telesurgical robot. In preclinical studies, our first results obtained from a micro-CT scanner show that these technologies provide efficient and precise 3D modeling of anatomical and pathological structures of rats and mice. In clinical studies, our first results show the possibility of improving the therapeutic choice thanks to better detection and representation of the patient before performing the surgical gesture. They also show the efficiency of augmented reality, which provides virtual transparency of the patient in real time during the operative procedure. In the near future, through the exploitation of these systems, surgeons will program and check on the virtual patient clone an optimal, error-free procedure, which will then be replayed on the real patient by the robot under surgeon control. This medical dream is today about to become reality.
Using visuo-kinetic virtual reality to induce illusory spinal movement: the MoOVi Illusion
Harvie, Daniel S.; Smith, Ross T.; Hunter, Estin V.; Davis, Miles G.; Sterling, Michele; Moseley, G. Lorimer
2017-01-01
Background Illusions that alter perception of the body provide novel opportunities to target brain-based contributions to problems such as persistent pain. One example of this, mirror therapy, uses vision to augment perceived movement of a painful limb to treat pain. Since mirrors can’t be used to induce augmented neck or other spinal movement, we aimed to test whether such an illusion could be achieved using virtual reality, in advance of testing its potential therapeutic benefit. We hypothesised that perceived head rotation would depend on visually suggested movement. Method In a within-subjects repeated measures experiment, 24 healthy volunteers performed neck movements to 50° of rotation, while a virtual reality system delivered corresponding visual feedback that was offset by a factor of 50%–200%—the Motor Offset Visual Illusion (MoOVi)—thus simulating more or less movement than that actually occurring. At 50° of real-world head rotation, participants pointed in the direction that they perceived they were facing. The discrepancy between actual and perceived direction was measured and compared between conditions. The impact of including multisensory (auditory and visual) feedback, the presence of a virtual body reference, and the use of 360° immersive virtual reality with and without three-dimensional properties was also investigated. Results Perception of head movement was dependent on visual-kinaesthetic feedback (p = 0.001, partial eta squared = 0.17). That is, altered visual feedback caused a kinaesthetic drift in the direction of the visually suggested movement. The magnitude of the drift was not moderated by secondary variables such as the addition of illusory auditory feedback, the presence of a virtual body reference, or three-dimensionality of the scene. Discussion Virtual reality can be used to augment perceived movement and body position, such that one can perform a small movement, yet perceive a large one.
The MoOVi technique tested here has clear potential for assessment and therapy of people with spinal pain. PMID:28243537
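The gain manipulation at the heart of the MoOVi illusion amounts to scaling the visually displayed rotation relative to the real one. A minimal sketch of that relationship (the function name and structure are our own illustration, not the authors' software):

```python
def displayed_rotation(actual_deg, gain):
    """Head rotation shown in the headset when visual feedback
    is offset by `gain` (1.0 = veridical feedback)."""
    return actual_deg * gain

# Gains of 0.5-2.0 correspond to the 50%-200% offsets in the study:
# at 50 degrees of real rotation the scene suggests 25-100 degrees.
for gain in (0.5, 1.0, 2.0):
    print(gain, displayed_rotation(50, gain))
```

A gain below 1 under-reports movement and a gain above 1 exaggerates it, which is what drives the kinaesthetic drift the study measured.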
Khor, Wee Sim; Baker, Benjamin; Amin, Kavit; Chan, Adrian; Patel, Ketan; Wong, Jason
2016-01-01
The continuing enhancement of the surgical environment in the digital age has led to a number of innovations being highlighted as potential disruptive technologies in the surgical workplace. Augmented reality (AR) and virtual reality (VR) are rapidly becoming more available, accessible and, importantly, affordable; hence their application in healthcare to enhance the medical use of data is certain. Whether it relates to anatomy, intraoperative surgery, or post-operative rehabilitation, applications are already being investigated for their role in the surgeon's armamentarium. Here we provide an introduction to the technology and the potential areas of development in the surgical arena. PMID:28090510
Virtual Reality and Augmented Reality in Plastic Surgery: A Review.
Kim, Youngjun; Kim, Hannah; Kim, Yong Oock
2017-05-01
Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.
A telescope with augmented reality functions
NASA Astrophysics Data System (ADS)
Hou, Qichao; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian
2016-10-01
This study introduces a telescope with virtual reality (VR) and augmented reality (AR) functions. In this telescope, information on the micro-display screen is integrated into the reticle of the telescope through a beam splitter and is then received by the observer. The design and analysis of the telescope optical system with AR and VR capability is accomplished, and the opto-mechanical structure is designed. Finally, a proof-of-concept prototype is fabricated and demonstrated. The telescope has an exit pupil diameter of 6 mm at an eye relief of 19 mm, a 6° field of view, a visual magnification of 5 to 8 times, and a 30° field of view of the virtual image.
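As a rough consistency check on the quoted figures, the apparent (virtual-image) field of a simple telescope can be estimated from the real field and the visual magnification. A paraxial-style sketch, our own illustration rather than the authors' design calculation:

```python
import math

def apparent_fov_deg(real_fov_deg, magnification):
    """Approximate apparent field of view of a telescope, from the
    real field and the visual magnification."""
    half = math.radians(real_fov_deg / 2)
    return 2 * math.degrees(math.atan(magnification * math.tan(half)))

# A 6-degree real field at 5x magnification gives roughly the
# 30-degree virtual-image field quoted for the prototype.
print(round(apparent_fov_deg(6, 5), 1))  # ~29.4
```

At the upper 8x magnification the same relation predicts a wider apparent field, so the 30° figure is consistent with the low end of the stated magnification range.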
NASA Astrophysics Data System (ADS)
Hua, Hong
2017-02-01
Head-mounted light field displays render a true 3D scene by sampling either the projections of the 3D scene at different depths or the directions of the light rays apparently emitted by the 3D scene and viewed from different eye positions. They are capable of rendering correct or nearly correct focus cues and addressing the very well-known vergence-accommodation mismatch problem in conventional virtual and augmented reality displays. In this talk, I will focus on reviewing recent advancements of head-mounted light field displays for VR and AR applications. I will demonstrate examples of HMD systems developed in my group.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayes, Birchard P; Michel, Kelly D; Few, Douglas A
From stereophonic, positional sound to high-definition imagery that is crisp and clean, high-fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments, creating a mixed reality where relevant data and information augment the real or actual experience in real time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers a new and innovative design-information verification and inspection capability, improved evaluation accuracy, and an information-gathering capability for nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.
Compact tracking of surgical instruments through structured markers.
Alberto Borghese, N; Frosio, I
2013-07-01
Virtual and augmented reality surgery calls for reliable and efficient tracking of the surgical instruments in the virtual or real operating theatre. The most widespread approach uses three or more non-aligned markers attached to each instrument and surveyed by a set of cameras. However, the structure required to carry the markers modifies the instrument's mass distribution and can interfere with the surgeon's movements. To overcome these problems, we propose here a new methodology, based on structured markers, to compute the six degrees of freedom of a surgical instrument. Two markers are attached on the instrument axis, and one of them has a stripe painted over its surface. We also introduce a procedure to compute the markers' centers on the camera images with high accuracy, even when they are partially occluded by the instrument's axis or by other structures. Experimental results demonstrate the reliability and accuracy of the proposed approach. The introduction of structured passive markers can open new possibilities for accurate tracking, combining marker detection with real-time image processing.
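With two markers on the instrument axis, the axis direction (and hence five of the six degrees of freedom) follows directly from the two reconstructed marker centres; the painted stripe resolves the remaining roll angle. A minimal sketch of the axis geometry, in which the function names and the fixed tip offset are illustrative assumptions rather than the paper's algorithm:

```python
import math

def instrument_axis(m1, m2):
    """Unit vector along the instrument axis from two 3D marker centres."""
    d = [b - a for a, b in zip(m1, m2)]
    n = math.sqrt(sum(c * c for c in d))
    return [c / n for c in d]

def tip_position(m1, axis, tip_offset):
    """Tip location, assuming a known offset from marker 1 along the axis."""
    return [p + tip_offset * c for p, c in zip(m1, axis)]

axis = instrument_axis((0.0, 0.0, 0.0), (0.0, 0.0, 2.0))
print(axis, tip_position((0.0, 0.0, 0.0), axis, 5.0))
```

The remaining rotation about this axis is what the stripe's observed phase on the marker surface would supply in the full method.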
Augmented reality (AR) and virtual reality (VR) applied in dentistry.
Huang, Ta-Ko; Yang, Chi-Hsun; Hsieh, Yu-Hsin; Wang, Jen-Chyan; Hung, Chun-Cheng
2018-04-01
The OSCE is a reliable method for evaluating the preclinical examination of dental students, and an augmented reality simulator is an ideal tool for such assessment. This literature review traces recent developments in virtual reality (VR) and augmented reality (AR), from the history of dentistry to the progress of dental skills. Where technology has been lacking, surgery has had to rely on other devices to increase success rates and decrease risk, and the development of tracking units has changed both surgical practice and education. Clinical surgery is built on mature education, and VR and AR simultaneously affect skill-training lessons and navigation systems. More broadly, VR and AR are applied not only in dental training and surgery but are also improving many fields of everyday life. Copyright © 2018. Published by Elsevier Taiwan.
a Framework for Low-Cost Multi-Platform VR and AR Site Experiences
NASA Astrophysics Data System (ADS)
Wallgrün, J. O.; Huang, J.; Zhao, J.; Masrur, A.; Oprean, D.; Klippel, A.
2017-11-01
Low-cost consumer-level immersive solutions have the potential to revolutionize education and research in many fields by providing virtual experiences of sites that are either inaccessible, too dangerous, or too expensive to visit, or by augmenting in-situ experiences using augmented and mixed reality methods. We present our approach for creating low-cost multi-platform virtual and augmented reality site experiences of real world places for education and research purposes, making extensive use of Structure-from-Motion methods as well as 360° photography and videography. We discuss several example projects, for the Mayan City of Cahal Pech, Iceland's Thrihnukar volcano, the Santa Marta informal settlement in Rio, and for the Penn State Campus, and we propose a framework for creating and maintaining such applications by combining declarative content specification methods with a central linked-data based spatio-temporal information system.
Enhancing a Multi-body Mechanism with Learning-Aided Cues in an Augmented Reality Environment
NASA Astrophysics Data System (ADS)
Singh Sidhu, Manjit
2013-06-01
Augmented Reality (AR) is a potential area of research for education, covering issues such as tracking and calibration and realistic rendering of virtual objects. The ability to augment the real world with virtual information has opened the possibility of using AR technology in areas such as education and training. In the domain of Computer Aided Learning (CAL), researchers have long been looking into enhancing the effectiveness of the teaching and learning process by providing cues that can assist learners to better comprehend the materials presented. Although a number of works have examined the effectiveness of learning-aided cues, none has really addressed this issue for AR-based learning solutions. This paper discusses the design and model of AR-based software that uses visual cues to enhance the learning process, and reports perception results on the outcomes of those cues.
An Augmented Virtuality Display for Improving UAV Usability
2005-01-01
cockpit. For a more universally-understood metaphor, we have turned to virtual environments of the type represented in video games. Many of the… people who have the need to fly UAVs (such as military personnel) have experience with playing video games. They are skilled in navigating virtual… Another aspect of tailoring the interface to those with video game experience is to use familiar controls. Microsoft has developed a popular and
Advanced Visual and Instruction Systems for Maintenance Support (AVIS-MS)
2006-12-01
Hayashi, "Augmentable Reality: Situated Communication through Physical and Digital Spaces," Proc. 2nd Int’l Symp. Wearable Computers, IEEE CS Press… H. Ohno, "An Optical See-through Display for Mutual Occlusion of Real and Virtual Environments," Proc. Int’l Symp. Augmented Reality 2000 (ISAR00
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Sandia National Laboratories has developed a state-of-the-art augmented reality training system for close-quarters combat (CQB). This system uses a wearable augmented reality system to place the user in a real environment while engaging enemy combatants in virtual space (Boston Dynamics DI-Guy). The Umbra modeling and simulation environment is used to integrate and control the AR system.
The Design of Immersive English Learning Environment Using Augmented Reality
ERIC Educational Resources Information Center
Li, Kuo-Chen; Chen, Cheng-Ting; Cheng, Shein-Yung; Tsai, Chung-Wei
2016-01-01
The study uses augmented reality (AR) technology to integrate virtual objects into the real learning environment for language learning. The English AR classroom is constructed using the system prototyping method and evaluated by semi-structured in-depth interviews. According to the flow theory of Csikszentmihalyi in 1975 along with the immersive…
Enhancing and Transforming Global Learning Communities with Augmented Reality
ERIC Educational Resources Information Center
Frydenberg, Mark; Andone, Diana
2018-01-01
Augmented and virtual reality applications bring new insights to real world objects and scenarios. This paper shares research results of the TalkTech project, an ongoing study investigating the impact of learning about new technologies as members of global communities. This study shares results of a collaborative learning project about augmented…
A Collaborative Augmented Campus Based on Location-Aware Mobile Technology
ERIC Educational Resources Information Center
De Lucia, A.; Francese, R.; Passero, I.; Tortora, G.
2012-01-01
Mobile devices are changing the way people work and communicate. Most of the innovative devices offer the opportunity to integrate augmented reality in mobile applications, permitting the combination of the real world with virtual information. This feature can be particularly useful to enhance informal and formal didactic actions based on student…
Understanding the Conics through Augmented Reality
ERIC Educational Resources Information Center
Salinas, Patricia; Pulido, Ricardo
2017-01-01
This paper discusses the production of a digital environment to foster the learning of conics through augmented reality. The name conic refers to curves obtained by the intersection of a plane with a right circular conical surface. The environment gives students the opportunity to interact with the cone and the plane as virtual objects in real…
Archavlis, Eleftherios; Serrano, Lucas; Schwandt, Eike; Nimer, Amr; Molina-Fuentes, Moisés Felipe; Rahim, Tamim; Ackermann, Maximilian; Gutenberg, Angelika; Kantelhardt, Sven Rainer; Giese, Alf
2017-02-01
OBJECTIVE The goal of this study was to demonstrate the clinical and technical nuances of a minimally invasive, dorsolateral, tubular approach for partial odontoidectomy, autologous bone augmentation, and temporary C1-2 fixation to treat dens pseudarthrosis. METHODS A cadaveric feasibility study, a 3D virtual reality reconstruction study, and the subsequent application of this approach in 2 clinical cases are reported. Eight procedures were completed in 4 human cadavers. A minimally invasive, dorsolateral, tubular approach for odontoidectomy was performed with the aid of a tubular retraction system, using a posterolateral incision and an oblique approach angle. Fluoroscopy and postprocedural CT, using 3D volumetric averaging software, were used to evaluate the degree of bone removal of C1-2 lateral masses and the C-2 pars interarticularis. Two clinical cases were treated using the approach: a 23-year-old patient with an odontoid fracture and pseudarthrosis, and a 35-year-old patient with a history of failed conservative treatment for odontoid fracture. RESULTS At 8 cadaveric levels, the mean volumetric bone removal of the C1-2 lateral masses on 1 side was 3% ± 1%, and the mean resection of the pars interarticularis on 1 side was 2% ± 1%. The median angulation of the trajectory was 50°, and the median distance from the midline of the incision entry point on the skin surface was 67 mm. The authors measured the diameter of the working channel in relation to head positioning and assessed a greater working corridor of 12 ± 4 mm in 20° inclination, 15° contralateral rotation, and 5° lateral flexion to the contralateral side. There were no violations of the dura. The reliability of C-2 pedicle screws and C-1 lateral mass screws was 94% (15 of 16 screws) with a single lateral breach. The patients treated experienced excellent clinical outcomes. 
CONCLUSIONS A minimally invasive, dorsolateral, tubular odontoidectomy and autologous bone augmentation combined with C1-2 instrumentation has the ability to provide excellent 1-stage management of an odontoid pseudarthrosis. The procedure can be completed safely and successfully with minimal blood loss and little associated morbidity. This approach has the potential to provide not only a less invasive approach but also a function-preserving option to treat complex C1-2 anterior disease.
3D augmented reality with integral imaging display
NASA Astrophysics Data System (ADS)
Shen, Xin; Hua, Hong; Javidi, Bahram
2016-06-01
In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be converted into an identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for the augmented reality display is generated. The newly generated elemental images contain both the virtual objects and the real-world scene, with the desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.
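The merging step can be pictured as per-pixel compositing of the virtual-object elemental images over the real-scene ones, controlled by the transparency parameter. A toy grayscale sketch of that idea; the simple alpha-blend formulation here is our assumption, not the paper's exact algorithm:

```python
def merge_elemental(real, virtual, alpha):
    """Blend two same-sized grayscale elemental images.

    `alpha` is the opacity of the virtual content (0 = real scene only,
    1 = virtual object only).
    """
    return [
        [(1 - alpha) * r + alpha * v for r, v in zip(rrow, vrow)]
        for rrow, vrow in zip(real, virtual)
    ]

real = [[100, 100], [100, 100]]
virtual = [[200, 0], [0, 200]]
print(merge_elemental(real, virtual, 0.5))  # [[150.0, 50.0], [50.0, 150.0]]
```

In the actual display, this blending would be applied per elemental image after the pseudoscopic-to-orthoscopic conversion brings both arrays into the same format.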
Magnetosensitive e-skins with directional perception for augmented reality
Cañón Bermúdez, Gilbert Santiago; Karnaushenko, Dmitriy D.; Karnaushenko, Daniil; Lebanov, Ana; Bischoff, Lothar; Kaltenbrunner, Martin; Fassbender, Jürgen; Schmidt, Oliver G.; Makarov, Denys
2018-01-01
Electronic skins equipped with artificial receptors are able to extend our perception beyond the modalities that have naturally evolved. These synthetic receptors offer complementary information on our surroundings and endow us with novel means of manipulating physical or even virtual objects. We realize highly compliant magnetosensitive skins with directional perception that enable magnetic cognition, body position tracking, and touchless object manipulation. Transfer printing of eight high-performance spin valve sensors arranged into two Wheatstone bridges onto 1.7-μm-thick polyimide foils ensures mechanical imperceptibility. This enables a new class of interactive devices that extract information from the surroundings through magnetic tags. We demonstrate this concept in augmented reality systems with virtual knob-turning functions and the operation of virtual dialing pads, based on the interaction with magnetic fields. This technology will enable a cornucopia of applications from navigation, motion tracking in robotics, regenerative medicine, and sports and gaming to interaction in supplemented reality. PMID:29376121
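Each group of spin-valve sensors wired as a Wheatstone bridge converts a field-dependent resistance change into a differential output voltage. A standard bridge-output sketch; the resistance values below are illustrative, not the device's actual parameters:

```python
def bridge_output(v_in, r1, r2, r3, r4):
    """Differential output of a Wheatstone bridge with legs r1-r2 and r3-r4."""
    return v_in * (r2 / (r1 + r2) - r4 / (r3 + r4))

# A balanced bridge gives zero output; a 1% field-induced resistance
# change in one spin valve unbalances it and produces a signal.
print(bridge_output(1.0, 1000, 1000, 1000, 1000))  # 0.0
print(bridge_output(1.0, 1000, 1010, 1000, 1000))  # ~0.0025
```

The bridge arrangement is what makes the skin's output robust to common-mode drifts such as temperature, since those shift both legs equally.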
An Interactive Augmented Reality Implementation of Hijaiyah Alphabet for Children Education
NASA Astrophysics Data System (ADS)
Rahmat, R. F.; Akbar, F.; Syahputra, M. F.; Budiman, M. A.; Hizriadi, A.
2018-03-01
The Hijaiyah alphabet comprises the letters used in the Qur’an. An attractive and engaging learning process for the Hijaiyah alphabet is necessary for children, and one alternative is to develop it into a mobile application using augmented reality technology. Augmented reality is a technology that combines two-dimensional or three-dimensional virtual objects with an actual three-dimensional environment and projects them in real time. The application aims to foster children's interest in learning the Hijaiyah alphabet. It uses a smartphone and a marker as its medium and was built using Unity and an augmented reality library, namely Vuforia, with Blender as the 3D object modeling software. The output of this research is a learning application for Hijaiyah letters using augmented reality. It is used as follows: first, place a marker that has been registered and printed; second, let the smartphone camera track the marker. If the marker is invalid, the user should repeat the tracking process. If the marker is valid and identified, the objects of the Hijaiyah alphabet are projected onto it in three-dimensional form. Lastly, the user can learn and understand the shape and pronunciation of the Hijaiyah alphabet by touching the virtual buttons on the marker.
Augmented reality in dentistry: a current perspective.
Kwon, Ho-Beom; Park, Young-Seok; Han, Jung-Suk
2018-02-21
Augmented reality technology offers virtual information in addition to that of the real environment and thus opens new possibilities in various fields. The medical applications of augmented reality are generally concentrated in surgery, including neurosurgery, laparoscopic surgery, and plastic surgery. Augmented reality technology is also widely used in medical education and training. In dentistry, oral and maxillofacial surgery is the primary area of use, where dental implant placement and orthognathic surgery are the most frequent applications. Recent technological advancements are enabling new applications in restorative dentistry, orthodontics, and endodontics. This review briefly summarizes the history, definitions, features, and components of augmented reality technology and discusses its applications and future perspectives in dentistry.
Yoo, Ji Won; Lee, Dong Ryul; Cha, Young Joo; You, Sung Hyun
2017-01-01
OBJECTIVE: To compare the therapeutic effects of electromyography (EMG) biofeedback augmented by virtual reality (VR) and EMG biofeedback alone on triceps and biceps (T:B) muscle activity imbalance and elbow joint movement coordination during a reaching motor task in normal children and children with spastic cerebral palsy (CP). Eighteen children with spastic CP (2 females; mean ± standard deviation = 9.5 ± 1.96 years) and 8 normal children (3 females; mean ± standard deviation = 9.75 ± 2.55 years) were recruited from a local community center. All children with CP first underwent one intensive session of EMG feedback (30 minutes), followed by one session of EMG-VR feedback (30 minutes) after a 1-week washout period. Clinical tests included elbow extension range of motion (ROM), biceps muscle strength, and the box and block test. T:B muscle activity imbalance and reaching movement acceleration coordination were concurrently determined by EMG and 3-axis accelerometer measurements, respectively. Independent t-tests and one-way repeated-measures analysis of variance (ANOVA) were performed at p < 0.05. The one-way repeated-measures ANOVA revealed significant effects on elbow extension ROM (p = 0.01), biceps muscle strength (p = 0.01), and the box and block test (p = 0.03), as well as on peak triceps muscle activity (p = 0.01). However, it showed no statistical significance for the composite 3-dimensional movement acceleration coordination data (p = 0.12).
The present study is the first clinical trial to demonstrate the superior benefits of EMG biofeedback augmented by virtual reality exercise games in children with spastic CP. The augmented EMG and VR feedback produced better neuromuscular balance control in the elbow joint than EMG biofeedback alone.
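The within-subjects comparisons reported here follow the textbook paired design, whose core statistic can be sketched in a few lines. This is a generic paired t computation for illustration, not the authors' analysis code:

```python
import math

def paired_t(before, after):
    """Paired t statistic for two within-subject measurement lists."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Toy example: a consistent improvement across five subjects.
print(round(paired_t([10, 12, 9, 11, 10], [13, 14, 12, 15, 12]), 2))  # 7.48
```

The same paired structure underlies the washout-separated EMG versus EMG-VR sessions: each child serves as their own control, so between-subject variability drops out of the comparison.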
Learning Application of Astronomy Based Augmented Reality using Android Platform
NASA Astrophysics Data System (ADS)
Maleke, B.; Paseru, D.; Padang, R.
2018-02-01
Astronomy is a branch of science involving observations of celestial bodies such as stars, planets, comets, nebulae, star clusters, and galaxies, as well as natural phenomena occurring outside the Earth’s atmosphere. Ways of learning astronomy are quite varied, such as using a book or observing directly with a telescope, but both have shortcomings: books present the material only in the form of 2D drawings, while learning with a telescope requires fairly expensive equipment. This study presents a more engaging way of learning, namely an Augmented Reality (AR) application on the Android platform. Augmented reality is a computer-generated combination of the virtual world and the real world. Virtual objects can be text, animations, 3D models, or videos that are combined with the actual environment so that the user perceives the virtual objects as part of that environment. By targeting the Android platform, the application makes the learning method more engaging, because it runs on a wide range of Android smartphones, so learning can be done anytime and anywhere. The methodology used in building the application is the Multimedia Lifecycle, with the C# language for the AR programming and flowcharts as the modelling tool. Tests with several users showed that the application runs well and can serve as an alternative, more engaging way of learning astronomy.
ERIC Educational Resources Information Center
Menorath, Darren; Antonczak, Laurent
2017-01-01
This paper examines the state of the art of mobile Augmented Reality (AR) and mobile Virtual Reality (VR) in relation to collaboration and professional practices in a creative digital environment and higher education. To support their discussion, the authors use a recent design-based research project named "Juxtapose," which explores…
ERIC Educational Resources Information Center
Harley, Jason M.; Poitras, Eric G.; Jarrell, Amanda; Duffy, Melissa C.; Lajoie, Susanne P.
2016-01-01
Research on the effectiveness of augmented reality (AR) on learning exists, but there is a paucity of empirical work that explores the role that positive emotions play in supporting learning in such settings. To address this gap, this study compared undergraduate students' emotions and learning outcomes during a guided historical tour using mobile…
Visual Environment for Designing Interactive Learning Scenarios with Augmented Reality
ERIC Educational Resources Information Center
Mota, José Miguel; Ruiz-Rube, Iván; Dodero, Juan Manuel; Figueiredo, Mauro
2016-01-01
Augmented Reality (AR) technology allows the inclusion of virtual elements on a vision of actual physical environment for the creation of a mixed reality in real time. This kind of technology can be used in educational settings. However, the current AR authoring tools present several drawbacks, such as, the lack of a mechanism for tracking the…
Integrating Augmented Reality in Higher Education: A Multidisciplinary Study of Student Perceptions
ERIC Educational Resources Information Center
Delello, Julie A.; McWhorter, Rochell R.; Camp, Kerri M.
2015-01-01
Augmented reality (AR) is an emerging technology that blends physical objects with virtual reality. Through the integration of digital and print media, the gap between the "on and offline" worlds is bridged, radically shifting student-computer interaction in the classroom. This research examined the results of a multiple case study on the…
Pursuit of X-ray Vision for Augmented Reality
2012-01-01
ERIC Educational Resources Information Center
Dunleavy, Matt; Dede, Chris; Mitchell, Rebecca
2009-01-01
The purpose of this study was to document how teachers and students describe and comprehend the ways in which participating in an augmented reality (AR) simulation aids or hinders teaching and learning. Like the multi-user virtual environment (MUVE) interface that underlies Internet games, AR is a good medium for immersive collaborative…
Virtually Nursing: Emerging Technologies in Nursing Education.
Foronda, Cynthia L; Alfes, Celeste M; Dev, Parvati; Kleinheksel, A J; Nelson, Douglas A; O'Donnell, John M; Samosky, Joseph T
Augmented reality and virtual simulation technologies in nursing education are burgeoning. Preliminary evidence suggests that these innovative pedagogical approaches are effective. The aim of this article is to present 6 newly emerged products and systems that may improve nursing education. Technologies may present opportunities to improve teaching efforts, better engage students, and transform nursing education.
ERIC Educational Resources Information Center
Chao, Jie; Chiu, Jennifer L.; DeJaegher, Crystal J.; Pan, Edward A.
2016-01-01
Deep learning of science involves integration of existing knowledge and normative science concepts. Past research demonstrates that combining physical and virtual labs sequentially or side by side can take advantage of the unique affordances each provides for helping students learn science concepts. However, providing simultaneously connected…
Embodied information behavior, mixed reality and big data
NASA Astrophysics Data System (ADS)
West, Ruth; Parola, Max J.; Jaycen, Amelia R.; Lueg, Christopher P.
2015-03-01
A renaissance in the development of virtual (VR), augmented (AR), and mixed reality (MR) technologies with a focus on consumer and industrial applications is underway. As data becomes ubiquitous in our lives, a need arises to revisit the role of our bodies, explicitly in relation to data or information. Our observation is that VR/AR/MR technology development is a vision of the future framed in terms of promissory narratives. These narratives develop alongside the underlying enabling technologies and create new use contexts for virtual experiences. It is a vision rooted in the combination of responsive, interactive, dynamic, sharable data streams, and augmentation of the physical senses for capabilities beyond those normally humanly possible. In parallel to the varied definitions of information and approaches to elucidating information behavior, a myriad of definitions and methods of measuring and understanding presence in virtual experiences exist. These and other ideas will be tested by designers, developers and technology adopters as the broader ecology of head-worn devices for virtual experiences evolves in order to reap the full potential and benefits of these emerging technologies.
Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens
NASA Astrophysics Data System (ADS)
Strzys, M. P.; Kapp, S.; Thees, M.; Kuhn, J.; Lukowicz, P.; Knierim, P.; Schmidt, A.
2017-09-01
In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made great progress in recent years and have also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by the use of immersive technologies such as head-mounted displays, which allow virtual objects to be embedded in the real surroundings, leading to a Mixed Reality (MR) experience. In such an environment, digital and real objects do not merely coexist, but are also able to interact with each other in real time. These concepts can be used to merge human perception of reality with digitally visualized sensor data, thereby making the invisible visible. As a first example, in this paper we introduce, alongside the basic idea of this column, an MR experiment in thermodynamics for a laboratory course for freshman students in physics or other science and engineering subjects that uses physical data from mobile devices for analyzing and displaying physical phenomena to students.
Working Group Reports and Presentations: Virtual Worlds and Virtual Exploration
NASA Technical Reports Server (NTRS)
LAmoreaux, Claudia
2006-01-01
Scientists and engineers are continually developing innovative methods to capitalize on recent developments in computational power. Virtual worlds and virtual exploration present a new toolset for project design, implementation, and resolution. Replication of the physical world in the virtual domain provides stimulating displays to augment current data analysis techniques and to encourage public participation. In addition, the virtual domain provides stakeholders with a low cost, low risk design and test environment. The following document defines a virtual world and virtual exploration, categorizes the chief motivations for virtual exploration, elaborates upon specific objectives, identifies roadblocks and enablers for realizing the benefits, and highlights the more immediate areas of implementation (i.e. the action items). While the document attempts a comprehensive evaluation of virtual worlds and virtual exploration, the innovative nature of the opportunities presented precludes completeness. The authors strongly encourage readers to derive additional means of utilizing the virtual exploration toolset.
Optical methods for enabling focus cues in head-mounted displays for virtual and augmented reality
NASA Astrophysics Data System (ADS)
Hua, Hong
2017-05-01
Developing head-mounted displays (HMDs) that offer uncompromised optical pathways to both the digital and physical worlds without encumbrance and discomfort confronts many grand challenges, from both technological and human-factors perspectives. Among these, minimizing visual discomfort is one of the key obstacles. One of the key contributing factors to visual discomfort is the inability to render proper focus cues in HMDs to stimulate natural eye accommodation responses, which leads to the well-known accommodation-convergence cue discrepancy problem. In this paper, I will provide a summary of the various optical approaches toward enabling focus cues in HMDs for both virtual reality (VR) and augmented reality (AR).
The Reality of Virtual Reality Product Development
NASA Astrophysics Data System (ADS)
Dever, Clark
Virtual Reality and Augmented Reality are emerging areas of research and product development in enterprise companies. This talk will discuss industry standard tools and current areas of application in the commercial market. Attendees will gain insights into how to research, design, and (most importantly) ship, world class products. The presentation will recount the lessons learned to date developing a Virtual Reality tool to solve physics problems resulting from trying to perform aircraft maintenance on ships at sea.
Virtual Rehabilitation with Children: Challenges for Clinical Adoption [From the Field].
Glegg, Stephanie
2017-01-01
Virtual, augmented, and mixed reality environments are increasingly being developed and used to address functional rehabilitation goals related to physical, cognitive, social, and psychological impairments. For example, a child with an acquired brain injury may participate in virtual rehabilitation to address impairments in balance, attention, turn taking, and engagement in therapy. The trend toward virtual rehabilitation first gained momentum with the adoption of commercial off-the-shelf active video gaming consoles (e.g., Nintendo Wii and Xbox). Now, we are seeing the rapid emergence of customized rehabilitation-specific systems that integrate technological advances in virtual reality, visual effects, motion tracking, physiological monitoring, and robotics.
Augmenting breath regulation using a mobile driven virtual reality therapy framework.
Abushakra, Ahmad; Faezipour, Miad
2014-05-01
This paper presents a conceptual framework of a virtual reality therapy to assist individuals, especially lung cancer patients or those with breathing disorders, in regulating their breath through real-time analysis of respiration movements using a smartphone. Virtual reality technology is an attractive means for medical simulation and treatment, particularly for patients with cancer. The theories, methodologies, approaches, and real-world dynamic contents for all components of this virtual reality therapy (VRT) conceptual framework on the smartphone will be discussed. The architecture and technical aspects of the offshore platform of the virtual environment will also be presented.
ERIC Educational Resources Information Center
Civelek, Turhan; Ucar, Erdem; Ustunel, Hakan; Aydin, Mehmet Kemal
2014-01-01
The current research aims to explore the effects of a haptic augmented simulation on students' achievement and their attitudes towards Physics in an immersive virtual reality environment (VRE). A quasi-experimental post-test design was employed utilizing experiment and control groups. The participants were 215 students from a K-12 school in…
ERIC Educational Resources Information Center
Ong, Alex
2010-01-01
The use of augmented reality (AR) tools, where virtual objects such as tables and graphs can be displayed and be interacted with in real scenes created from imaging devices, in mainstream school curriculum is uncommon, as they are potentially costly and sometimes bulky. Thus, such learning tools are mainly applied in tertiary institutions, such as…
Quantitative Predictions of Binding Free Energy Changes in Drug-Resistant Influenza Neuraminidase
2012-08-30
Mutations of influenza neuraminidase (NA) confer drug resistance to two antiviral drugs, zanamivir and oseltamivir, while adopting conformations that are virtually identical to WT [10]. We augmented molecular dynamics (MD) with Hamiltonian Replica Exchange in molecular simulations that rigorously model the microscopic structure and thermodynamics.
NASA Astrophysics Data System (ADS)
Kobayashi, Hayato; Osaki, Tsugutoyo; Okuyama, Tetsuro; Gramm, Joshua; Ishino, Akira; Shinohara, Ayumi
This paper describes an interactive experimental environment for autonomous soccer robots, which is a soccer field augmented by utilizing camera input and projector output. This environment, in a sense, plays an intermediate role between simulated environments and real environments. We can simulate some parts of real environments, e.g., real objects such as robots or a ball, and reflect simulated data into the real environments, e.g., to visualize the positions on the field, so as to create a situation that allows easy debugging of robot programs. The significant point compared with analogous work is that virtual objects are touchable in this system owing to projectors. We also show the portable version of our system that does not require ceiling cameras. As an application in the augmented environment, we address the learning of goalie strategies on real quadruped robots in penalty kicks. We make our robots utilize virtual balls in order to perform only quadruped locomotion in real environments, which is quite difficult to simulate accurately. Our robots autonomously learn and acquire more beneficial strategies without human intervention in our augmented environment than those in a fully simulated environment.
NASA Astrophysics Data System (ADS)
Tan, Kian Lam; Lim, Chen Kim
2017-10-01
In the last decade, cultural heritage, including historical sites, has been reconstructed as digital heritage. According to UNESCO, digital heritage is defined as "cultural, educational, scientific and administrative resources, as well as technical, legal, medical and other kinds of information created digitally, or converted into digital form from existing analogue resources". In addition, digital heritage is doubling in size every two years and is expected to grow tenfold between 2013 and 2020. To attract and stir the interest of younger generations in digital heritage, gamification has been widely promoted. In this research, a virtual walkthrough combined with gamification is proposed for learning about and exploring historical places in Malaysia using a mobile device. In conjunction with the Visit Perak 2017 Campaign, the virtual walkthrough is proposed for Kellie's Castle in Perak. The objectives of this research are twofold: 1) to model and design an innovative mobile game for a virtual walkthrough application, and 2) to attract tourists to explore and learn about historical places through sophisticated Augmented Reality graphics. The efficiency and effectiveness of the mobile virtual walkthrough will be assessed by international and local tourists. In conclusion, this research is expected to pervasively improve the cultural and historical knowledge of learners.
Augmenting your own reality: student authoring of science-based augmented reality games.
Klopfer, Eric; Sheldon, Josh
2010-01-01
Augmented Reality (AR) simulations superimpose a virtual overlay of data and interactions onto a real-world context. The simulation engine at the heart of this technology is built to afford elements of game play that support explorations and learning in students' natural context--their own community and surroundings. In one of the more recent games, TimeLab 2100, players role-play citizens of the early 22nd century when global climate change is out of control. Through AR, they see their community as it might be nearly one hundred years in the future. TimeLab and other similar AR games balance location specificity and portability--they are games that are tied to a location and games that are movable from place to place. Focusing students on developing their own AR games provides the best of both virtual and physical worlds: a more portable solution that deeply connects young people to their own surroundings. A series of initiatives has focused on technical and pedagogical solutions to supporting students authoring their own games.
Augmented Virtual Reality Laboratory
NASA Technical Reports Server (NTRS)
Tully-Hanson, Benjamin
2015-01-01
Real-time motion tracking hardware has, for the most part, been cost-prohibitive for research to take place regularly until recently. With the release of the Microsoft Kinect in November 2010, researchers now have access to a device that, for a few hundred dollars, is capable of providing red-green-blue (RGB), depth, and skeleton data. It is also capable of tracking multiple people in real time. For its originally intended purpose, i.e. gaming with the Xbox 360 and eventually the Xbox One, it performs quite well. However, researchers soon found that although the sensor is versatile, it has limitations in real-world applications. I was brought aboard this summer by William Little in the Augmented Virtual Reality (AVR) Lab at Kennedy Space Center to find solutions to these limitations.
Habanapp: Havana's Architectural Heritage a Click Away
NASA Astrophysics Data System (ADS)
Morganti, C.; Bartolomei, C.
2018-05-01
The research addresses the application of augmented and virtual reality technologies to the architectural and historical context of the city of Havana, Cuba, on the basis of historical studies and range-imaging techniques applied to the buildings bordering the old city's five main squares. The specific aim is to transfer all of the data gathered through the most recent mobile apps for Augmented Reality (AR) and Virtual Reality (VR), in order to create an innovative app never before seen in Cuba. The "Oficina del Historiador de la ciudad de La Habana", the institution supervising architectural and cultural assets in Cuba, is widely interested in the topic as a means to develop a new educational, cultural and artistic tool to be used both online and offline.
A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.
Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis
2018-03-01
Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
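The Sun-position step described in this abstract (deriving solar elevation and azimuth from the user's location and the time of day) can be illustrated with a minimal sketch. This is a low-accuracy textbook approximation for illustration only; the function name and formulas are assumptions, not the authors' implementation.

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) for a UTC datetime.

    Uses simple declination/hour-angle formulas (roughly 1-degree accuracy);
    illustrative only.
    """
    day = when_utc.timetuple().tm_yday
    hour = when_utc.hour + when_utc.minute / 60.0

    # Solar declination (Cooper's approximation)
    decl = math.radians(23.44) * math.sin(math.radians(360.0 / 365.0 * (day - 81)))

    # Hour angle: 0 at local solar noon, 15 degrees per hour of solar time
    solar_time = hour + lon_deg / 15.0          # crude longitude correction
    hour_angle = math.radians(15.0 * (solar_time - 12.0))

    lat = math.radians(lat_deg)
    # Elevation from the standard spherical-astronomy relation
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elev = math.asin(max(-1.0, min(1.0, sin_elev)))

    # Azimuth measured clockwise from north
    cos_az = ((math.sin(decl) - math.sin(lat) * sin_elev)
              / (math.cos(lat) * math.cos(elev)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:                          # afternoon: Sun west of the meridian
        az = 2 * math.pi - az
    return math.degrees(elev), math.degrees(az)
```

For a northern-hemisphere location at 40° N, 0° E on the June solstice at 12:00 UTC, this yields an elevation near 73° and an azimuth near due south; a shadow-rendering pipeline would feed such angles to a directional light source.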
Augmented Reality to Preserve Hidden Vestiges in Historical Cities: A Case Study
NASA Astrophysics Data System (ADS)
Martínez, J. L.; Álvarez, S.; Finat, J.; Delgado, F. J.; Finat, J.
2015-02-01
Mobile devices provide increasingly sophisticated support for enhanced experiences and for understanding the remote past in an interactive way. The use of augmented reality technologies allows the development of mobile applications for indoor exploration of virtually reconstructed archaeological places. In our work we have built a virtual reconstruction of a Roman villa using data from an urgent partial excavation that was performed in order to build a car park in the historical city of Valladolid (Spain). In its current state, the archaeological site is covered by an urban garden. Localization and tracking are performed using a combination of GPS and the inertial sensors of the mobile device. In this work we show how to perform an interactive navigation around the 3D virtual model, presenting an interpretation of the way it was. The user experience is enhanced by answering simple questions and performing minor tasks and puzzles, which are presented with multimedia contents linked to key features of the archaeological site.
CSI, optimal control, and accelerometers: Trials and tribulations
NASA Technical Reports Server (NTRS)
Benjamin, Brian J.; Sesak, John R.
1994-01-01
New results concerning optimal design with accelerometers are presented. These results show that the designer must be concerned with the stability properties of two Linear Quadratic Gaussian (LQG) compensators, one of which does not explicitly appear in the closed-loop system dynamics. The new concepts of virtual and implemented compensators are introduced to cope with these subtleties. The virtual compensator appears in the closed-loop system dynamics and the implemented compensator appears in control electronics. The stability of one compensator does not guarantee the stability of the other. For strongly stable (robust) systems, both compensators should be stable. The presence of controlled and uncontrolled modes in the system results in two additional forms of the compensator with corresponding terms that are of like form, but opposite sign, making simultaneous stabilization of both the virtual and implemented compensator difficult. A new design algorithm termed sensor augmentation is developed that aids stabilization of these compensator forms by incorporating a static augmentation term associated with the uncontrolled modes in the design process.
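The LQG design ingredient underlying the compensator discussion above can be illustrated with a minimal scalar LQR sketch: solving a Riccati equation to obtain a stabilizing state-feedback gain. The scalar discrete-time plant, the fixed-point iteration, and the function name are illustrative assumptions, not the authors' sensor-augmentation algorithm.

```python
def dlqr_scalar(a, b, q, r, iters=200):
    """Solve the scalar discrete-time algebraic Riccati equation by
    fixed-point iteration and return the LQR gain k (control law u = -k*x).

    Plant: x[k+1] = a*x[k] + b*u[k]; cost weights q (state), r (control).
    """
    p = q
    for _ in range(iters):
        # Scalar form of P = A'PA - A'PB (R + B'PB)^-1 B'PA + Q
        p = a * a * p - (a * b * p) ** 2 / (r + b * b * p) + q
    return (b * p * a) / (r + b * b * p)

# Unstable scalar plant x[k+1] = 1.2 x[k] + u[k]; LQR stabilizes it.
k = dlqr_scalar(1.2, 1.0, 1.0, 1.0)
closed_loop = 1.2 - 1.0 * k   # closed-loop pole magnitude must be < 1
```

In the full LQG setting discussed in the abstract, this state-feedback gain is paired with an estimator, and it is the resulting compensator (virtual or implemented) whose own poles must be checked for stability.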
Augmented reality visualization of deformable tubular structures for surgical simulation.
Ferrari, Vincenzo; Viglialoro, Rosanna Maria; Nicoli, Paola; Cutolo, Fabrizio; Condino, Sara; Carbone, Marina; Siesto, Mentore; Ferrari, Mauro
2016-06-01
Surgical simulation based on augmented reality (AR), mixing the benefits of physical and virtual simulation, represents a step forward in surgical training. However, available systems are unable to update the virtual anatomy following deformations impressed on actual anatomy. A proof-of-concept solution is described providing AR visualization of hidden deformable tubular structures using nitinol tubes sensorized with electromagnetic sensors. This system was tested in vitro on a setup comprised of sensorized cystic, left and right hepatic, and proper hepatic arteries. In the trial session, the surgeon deformed the tubular structures with surgical forceps in 10 positions. The mean, standard deviation, and maximum misalignment between virtual and real arteries were 0.35, 0.22, and 0.99 mm, respectively. The alignment accuracy obtained demonstrates the feasibility of the approach, which can be adopted in advanced AR simulations, in particular as an aid to the identification and isolation of tubular structures. Copyright © 2015 John Wiley & Sons, Ltd.
Realistic Reflections for Marine Environments in Augmented Reality Training Systems
2009-09-01
Like virtual simulations, augmented reality trainers can be configured to meet specific training needs and can be restarted and reused to train. For wave distortion, blurring and shadow, many of the same methods outlined in the Full Reflection shader were reused for the Physics shader.
Using Augmented Reality and Virtual Environments in Historic Places to Scaffold Historical Empathy
ERIC Educational Resources Information Center
Sweeney, Sara K.; Newbill, Phyllis; Ogle, Todd; Terry, Krista
2018-01-01
The authors explore how 3D visualizations of historical sites can be used as pedagogical tools to support historical empathy. They provide three visualizations created by a team at Virginia Tech as examples. They discuss virtual environments and how the digital restoration process is applied. They also define historical empathy, explain why it is…
Kilby, Melissa C; Slobounov, Semyon M; Newell, Karl M
2016-06-01
The experiment manipulated real-time kinematic feedback of the motion of the whole-body center of mass (COM) and center of pressure (COP) in the anterior-posterior (AP) and medial-lateral (ML) directions to investigate the variables actively controlled in quiet standing by young adults. The feedback reflected the current 2D postural positions within the 2D functional stability boundary, which was scaled to 75%, 30% and 12% of its original size. The findings showed that the distance of both COP and COM from the respective stability boundary was greater during the feedback trials than in a no-feedback condition. However, the temporal safety margin of the COP, that is, the virtual time-to-contact (VTC), was higher without feedback. The COP-COM coupling showed stable in-phase synchronization across all feedback conditions for frequencies below 1 Hz. For higher frequencies (up to 5 Hz), there was progressive reduction of COP-COM synchronization and local adaptation in the presence of augmented feedback. The findings show that augmented feedback of COM and COP motion differentially and adaptively influences spatial and temporal properties of postural motion relative to the stability boundary while preserving the organization of the COM-COP coupling in postural control. Copyright © 2016. Published by Elsevier B.V.
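The virtual time-to-contact (VTC) quantity mentioned above, i.e. the time the COP would need to reach the stability boundary if its instantaneous motion persisted, can be sketched in one dimension as follows. The ballistic (position, velocity, acceleration) extrapolation and the function name are illustrative assumptions, not the authors' exact computation.

```python
import math

def time_to_contact_1d(pos, vel, acc, boundary):
    """Smallest positive time at which pos + vel*t + 0.5*acc*t**2 reaches
    the boundary, assuming the current velocity and acceleration persist.

    Returns math.inf if the extrapolated trajectory never contacts it.
    """
    # Solve 0.5*acc*t^2 + vel*t + (pos - boundary) = 0 for t > 0
    c = pos - boundary
    if abs(acc) < 1e-12:                      # purely linear extrapolation
        if abs(vel) < 1e-12:
            return math.inf
        t = -c / vel
        return t if t > 0 else math.inf
    disc = vel * vel - 2.0 * acc * c
    if disc < 0:
        return math.inf
    sqrt_disc = math.sqrt(disc)
    roots = [(-vel - sqrt_disc) / acc, (-vel + sqrt_disc) / acc]
    positive = [t for t in roots if t > 0]
    return min(positive) if positive else math.inf
```

For example, a COP 5 cm inside the boundary moving toward it at 2 cm/s with zero acceleration gives a VTC of 2.5 s; a larger VTC corresponds to a larger temporal safety margin.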
Nifakos, Sokratis; Tomson, Tanja; Zary, Nabil
2014-01-01
Introduction. Antimicrobial resistance is a global health issue. Studies have shown that improved antibiotic prescription education among healthcare professionals reduces mistakes during the antibiotic prescription process. The aim of this study was to investigate novel educational approaches that through the use of Augmented Reality technology could make use of the real physical context and thereby enrich the educational process of antibiotics prescription. The objective is to investigate which type of information related to antibiotics could be used in an augmented reality application for antibiotics education. Methods. This study followed the Design-Based Research Methodology composed of the following main steps: problem analysis, investigation of information that should be visualized for the training session, and finally the involvement of the end users the development and evaluation processes of the prototype. Results. Two of the most important aspects in the antibiotic prescription process, to represent in an augmented reality application, are the antibiotic guidelines and the side effects. Moreover, this study showed how this information could be visualized from a mobile device using an Augmented Reality scanner and antibiotic drug boxes as markers. Discussion. In this study we investigated the usage of objects from a real physical context such as drug boxes and how they could be used as educational resources. The logical next steps are to examine how this approach of combining physical and virtual contexts through Augmented Reality applications could contribute to the improvement of competencies among healthcare professionals and its impact on the decrease of antibiotics resistance. PMID:25548733
Ketelhut, Diane Jass; Niemi, Steven M
2007-01-01
This article examines several new and exciting communication technologies. Many of the technologies were developed by the entertainment industry; however, other industries are adopting and modifying them for their own needs. These new technologies allow people to collaborate across distance and time and to learn in simulated work contexts. The article explores the potential utility of these technologies for advancing laboratory animal care and use through better education and training. Descriptions include emerging technologies such as augmented reality and multi-user virtual environments, which offer new approaches with different capabilities. Augmented reality interfaces, characterized by the use of handheld computers to infuse the virtual world into the real one, result in deeply immersive simulations. In these simulations, users can access virtual resources and communicate with real and virtual participants. Multi-user virtual environments enable multiple participants to simultaneously access computer-based three-dimensional virtual spaces, called "worlds," and to interact with digital tools. They allow for authentic experiences that promote collaboration, mentoring, and communication. Because individuals may learn or train differently, it is advantageous to combine the capabilities of these technologies and applications with more traditional methods to increase the number of students who are served by using current methods alone. The use of these technologies in animal care and use programs can create detailed training and education environments that allow students to learn the procedures more effectively, teachers to assess their progress more objectively, and researchers to gain insights into animal care.
Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.
Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico
2017-01-01
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" and the Google Translate real-time sign interpretation app. Even though AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all the considered tasks. Moreover, subjects significantly preferred conditions providing wearable haptic feedback.
Applying Augmented Reality in practical classes for engineering students
NASA Astrophysics Data System (ADS)
Bazarov, S. E.; Kholodilin, I. Yu; Nesterov, A. S.; Sokhina, A. V.
2017-10-01
In this article, an Augmented Reality application for teaching engineering students of electrical and technological specialties is introduced. In order to increase students' motivation for learning and their independence, new practical guidelines on Augmented Reality were developed for practical classes. During the application development, the authors used software such as Unity 3D and Vuforia. The Augmented Reality content consists of 3D models, images and animations, which are superimposed on real objects, helping students to study specific tasks. A user with a smartphone, a tablet PC, or Augmented Reality glasses can visualize on-screen virtual objects added to a real environment. To assess the current situation in higher education (learners' interest in studying, their satisfaction with the educational process, and the impact of the Augmented Reality application on students), a questionnaire was developed and offered to students; the study involved 24 learners.
Augmented reality on poster presentations, in the field and in the classroom
NASA Astrophysics Data System (ADS)
Hawemann, Friedrich; Kolawole, Folarin
2017-04-01
Augmented reality (AR) is the direct addition of virtual information to a real-world environment through an interface. In practice, through a mobile device such as a tablet or smartphone, information can be projected onto a target, for example, an image on a poster. Mobile devices are so widely distributed today that augmented reality is easily accessible to almost everyone. Numerous studies have shown that multi-dimensional visualization is essential for efficient perception of the spatial, temporal, and geometrical configuration of geological structures and processes. Print media, such as posters and handouts, lack the ability to display content in the third and fourth dimensions, which might be in the space domain as seen in three-dimensional (3-D) objects, or in the time domain (four-dimensional, 4-D) expressible in the form of videos. Here, we show that augmented reality content can be complementary to geoscience poster presentations, hands-on material, and fieldwork. In the latter example, location-based data are loaded and, for example, a virtual geological profile can be draped over a real-world landscape. In object-based AR, the application is trained to recognize an image or object through the camera of the user's mobile device, such that specific content is automatically downloaded, displayed on the screen of the device, and positioned relative to the trained image or object. We used ZapWorks, a commercially available software application, to create and present examples of poster-based content in which important supplementary information is presented as interactive virtual images, videos, and 3-D models. We suggest that the flexibility and real-time interactivity offered by AR make it an invaluable tool for effective geoscience poster presentation as well as classroom and field geoscience learning.
The Architectonic Experience of Body and Space in Augmented Interiors
Pasqualini, Isabella; Blefari, Maria Laura; Tadi, Tej; Serino, Andrea; Blanke, Olaf
2018-01-01
The environment shapes our experience of space in constant interaction with the body. Architectonic interiors amplify the perception of space through the bodily senses; an effect also known as embodiment. The interaction of the bodily senses with the space surrounding the body can be tested experimentally through the manipulation of multisensory stimulation and measured via a range of behaviors related to bodily self-consciousness. Many studies have used Virtual Reality to show that visuotactile conflicts mediated via a virtual body or avatar can disrupt the unified subjective experience of the body and self. In the full-body illusion paradigm, participants feel as if the avatar was their body (ownership, self-identification) and they shift their center of awareness toward the position of the avatar (self-location). However, the influence of non-bodily spatial cues around the body on embodiment remains unclear, and data about the impact of architectonic space on human perception and self-conscious states are sparse. We placed participants into a Virtual Reality arena, where large and narrow virtual interiors were displayed with and without an avatar. We then applied synchronous or asynchronous visuotactile strokes to the back of the participants and avatar, or, to the front wall of the void interiors. During conditions of illusory self-identification with the avatar, participants reported sensations of containment, drift, and touch with the architectonic environment. The absence of the avatar suppressed such feelings, yet, in the large space, we found an effect of continuity between the physical and the virtual interior depending on the full-body illusion. We discuss subjective feelings evoked by architecture and compare the full-body illusion in augmented interiors to architectonic embodiment. A relevant outcome of this study is the potential to dissociate the egocentric, first-person view from the physical point of view through augmented architectonic space. 
PMID:29755378
Twitter-Augmented Journal Club: Educational Engagement and Experience So Far.
Udani, Ankeet D; Moyse, Daniel; Peery, Charles Andrew; Taekman, Jeffrey M
2016-04-15
Social media is a nascent medical educational technology. The benefits of Twitter include (1) easy adoption; (2) access to experts, peers, and patients across the globe; (3) 24/7 connectivity; (4) creation of virtual, education-based communities using hashtags; and (5) crowdsourcing information using retweets. We report on a novel Twitter-augmented journal club for anesthesia residents: its design, implementation, and impact. Our inaugural anesthesia Twitter-augmented journal club succeeded in engaging the anesthesia community and increasing residents' professional use of Twitter. Notably, our experience suggests that anesthesia residents are willing to use social media for their education.
Augmented reality for anatomical education.
Thomas, Rhys Gethin; John, Nigel William; Delieu, John Michael
2010-03-01
The use of Virtual Environments has been widely reported as a method of teaching anatomy. Generally, such environments only convey the shape of the anatomy to the student. We present the Bangor Augmented Reality Education Tool for Anatomy (BARETA), a system that combines Augmented Reality (AR) technology with models produced using Rapid Prototyping (RP) technology to provide the student with stimulation for touch as well as sight. The principal aims of this work were to provide an interface more intuitive than a mouse and keyboard, and to evaluate such a system as a viable supplement to traditional cadaver-based education.
Advanced helmet mounted display (AHMD)
NASA Astrophysics Data System (ADS)
Sisodia, Ashok; Bayer, Michael; Townley-Smith, Paul; Nash, Brian; Little, Jay; Cassarly, William; Gupta, Anurag
2007-04-01
Due to significantly increased U.S. military involvement in deterrent, observer, security, peacekeeping, and combat roles around the world, the military expects significant future growth in demand for deployable virtual reality trainers with networked simulation capability for the battle-space visualization process. The use of HMD technology in simulated virtual environments has been driven by the demand for more effective training tools. The AHMD overlays computer-generated data (symbology, synthetic imagery, enhanced imagery) onto the actual and simulated visible environment. The AHMD can be used to support deployable, reconfigurable training solutions as well as traditional simulation requirements, UAV augmented reality, air traffic control, and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications. This paper describes the design improvements implemented for production of the AHMD system.
NASA Astrophysics Data System (ADS)
Lin, Chien-Liang; Su, Yu-Zheng; Hung, Min-Wei; Huang, Kuo-Cheng
2010-08-01
In recent years, Augmented Reality (AR) [1][2][3] has become very popular in universities and research organizations. AR technology has been widely used in Virtual Reality (VR) fields, such as sophisticated weapons, flight vehicle development, data model visualization, virtual training, entertainment, and the arts. AR can enhance the display of a real environment with specific user-interactive functions or specific object recognition. It can be used in medical treatment, anatomy training, precision instrument casting, warplane guidance, engineering, and remote robot control. AR has many advantages over VR. The system developed here combines sensors, software, and imaging algorithms to make the experience feel real, actual, and present to users. The imaging algorithms include a gray-level method, an image binarization method, and a white-balance method, in order to achieve accurate image recognition and overcome the effects of lighting.
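As a rough illustration of the three preprocessing steps named above (gray-level conversion, binarization, and white balance), here is a minimal NumPy sketch; this is not the authors' implementation, and the function names, the fixed threshold, and the gray-world white-balance variant are our own assumptions:

```python
import numpy as np

def to_gray(rgb):
    # Luminance-weighted gray-level conversion (ITU-R BT.601 weights)
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(gray, threshold=128):
    # Fixed-threshold image binarization: 1 for foreground, 0 for background
    return (gray >= threshold).astype(np.uint8)

def gray_world_white_balance(rgb):
    # Gray-world assumption: scale each channel so its mean matches the global mean,
    # compensating for the color cast of the scene illumination
    means = rgb.reshape(-1, 3).mean(axis=0)
    scale = means.mean() / means
    return np.clip(rgb * scale, 0, 255)
```

In a marker-recognition pipeline of this kind, white balance would typically run first to normalize lighting, followed by gray-level conversion and binarization before the marker is matched.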
Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds.
Wright, W Geoffrey
2014-01-01
Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini-review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether by speeding recovery, reducing training time, or enhancing immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed.
Mirelman, Anat; Rochester, Lynn; Reelick, Miriam; Nieuwhof, Freek; Pelosin, Elisa; Abbruzzese, Giovanni; Dockx, Kim; Nieuwboer, Alice; Hausdorff, Jeffrey M
2013-02-06
Recent work has demonstrated that fall risk can be attributed to cognitive as well as motor deficits. Indeed, everyday walking in complex environments utilizes executive function, dual tasking, planning and scanning, all while walking forward. Pilot studies suggest that a multi-modal intervention that combines treadmill training to target motor function and a virtual reality obstacle course to address the cognitive components of fall risk may be used to successfully address the motor-cognitive interactions that are fundamental for fall risk reduction. The proposed randomized controlled trial will evaluate the effects of treadmill training augmented with virtual reality on fall risk. Three hundred older adults with a history of falls will be recruited to participate in this study. This will include older adults (n=100), patients with mild cognitive impairment (n=100), and patients with Parkinson's disease (n=100). These three sub-groups will be recruited in order to evaluate the effects of the intervention in people with a range of motor and cognitive deficits. Subjects will be randomly assigned to the intervention group (treadmill training with virtual reality) or to the active-control group (treadmill training without virtual reality). Each person will participate in a training program set in an outpatient setting 3 times per week for 6 weeks. Assessments will take place before, after, and 1 month and 6 months after the completion of the training. A falls calendar will be kept by each participant for 6 months after completing the training to assess fall incidence (i.e., the number of falls, multiple falls and falls rate). In addition, we will measure gait under usual and dual task conditions, balance, community mobility, health related quality of life, user satisfaction and cognitive function. 
This randomized controlled trial will demonstrate the extent to which an intervention that combines treadmill training augmented by virtual reality reduces fall risk, improves mobility and enhances cognitive function in a diverse group of older adults. In addition, the comparison to an active control group that undergoes treadmill training without virtual reality will provide evidence as to the added value of addressing motor cognitive interactions as an integrated unit. (NIH)-NCT01732653.
PMID:23388087
Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality
NASA Astrophysics Data System (ADS)
Wolfe, Christopher; Smith, J. David; Phillips, W. Greg; Graham, T. C. Nicholas
Augmented reality systems often involve collaboration among groups of people. While there are numerous toolkits that aid the development of such augmented reality groupware systems (e.g., ARToolkit and Groupkit), there remains an enormous gap between the specification of an AR groupware application and its implementation. In this chapter, we present Fiia, a toolkit which simplifies the development of collaborative AR applications. Developers specify the structure of their applications using the Fiia modeling language, which abstracts details of networking and provides high-level support for specifying adapters between the physical and virtual world. The Fiia.Net runtime system then maps this conceptual model to a runtime implementation. We illustrate Fiia via Raptor, an augmented reality application used to help small groups collaboratively prototype video games.
An augmented reality tool for learning spatial anatomy on mobile devices.
Jain, Nishant; Youngblood, Patricia; Hasel, Matthew; Srivastava, Sakti
2017-09-01
Augmented Reality (AR) offers a novel method of blending virtual and real anatomy for intuitive spatial learning. Our first aim in this study was to create a prototype AR tool for mobile devices. Our second aim was to complete a technical evaluation of the prototype AR tool, focused on measuring the system's ability to accurately render digital content in the real world. We imported Computed Tomography (CT)-derived virtual surface models into a 3D Unity engine environment and implemented an AR algorithm to display these on mobile devices. We investigated the accuracy of the virtual renderings by comparing a physical cube with an identical virtual cube for dimensional accuracy. Our comparative study confirms that the AR tool renders 3D virtual objects with a high level of accuracy, as evidenced by the degree of similarity between measurements of the dimensions of a virtual object (a cube) and the corresponding physical object. We developed an inexpensive and user-friendly prototype AR tool for mobile devices that creates highly accurate renderings. This prototype demonstrates an intuitive, portable, and integrated interface for spatial interaction with virtual anatomical specimens. Integrating this AR tool with a library of CT-derived surface models provides a platform for spatial learning in the anatomy curriculum. The segmentation methodology implemented to optimize human CT data for mobile viewing can be extended to include anatomical variations and pathologies. The ability of this inexpensive educational platform to deliver a library of interactive 3D models to students worldwide demonstrates its utility as a supplemental teaching tool that could greatly benefit anatomical instruction. Clin. Anat. 30:736-741, 2017. © 2017 Wiley Periodicals, Inc.
A virtual tour of geological heritage: Valourising geodiversity using Google Earth and QR code
NASA Astrophysics Data System (ADS)
Martínez-Graña, A. M.; Goy, J. L.; Cimarra, C. A.
2013-12-01
When making land-use plans, it is necessary to inventory and catalogue the geological heritage and geodiversity of a site to establish a conservation and protection plan that meets the educational and social needs of society. New technologies make it possible to create virtual databases using virtual globes (e.g., Google Earth) and other personal geomatics applications (smartphones, tablets, PDAs) for accessing geological heritage information in "real time" for scientific, educational, and cultural purposes via a virtual geological itinerary. Seventeen mapped and georeferenced geosites have been encoded in Keyhole Markup Language as map layers marking the stops of the geological itinerary for different applications. A virtual tour has been developed for Las Quilamas Natural Park, located in the Spanish Central System, using geological layers and topographic and digital terrain models that can be overlaid in a 3D model. The Google Earth application was used to import the geosite placemarks. For each geosite, a tab has been developed that describes the geology with photographs and diagrams and evaluates its scientific, educational, and tourism quality. Augmented reality allows users to access these georeferenced thematic layers and overlay data, images, and graphics in real time on their mobile devices. These virtual tours can be incorporated into subject guides designed for the public. Seven educational and interpretive panels describing some of the geosites were designed and tagged with a QR code that can be printed at each stop or in the printed itinerary. These QR codes can be scanned with the camera found on most mobile devices, and video virtual tours can then be viewed on the devices. The virtual tour of the geological heritage can be used to show tourists the geological history of Las Quilamas Natural Park using new geomatics technologies (virtual globes, augmented reality, and QR codes).
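To illustrate what a Keyhole Markup Language geosite record looks like, here is a minimal sketch that builds a KML Placemark with Python's standard library; the geosite name, coordinates, and description are invented for the example and do not come from the paper:

```python
import xml.etree.ElementTree as ET

def geosite_placemark(name, lon, lat, description):
    # Build a minimal KML Placemark for one georeferenced geosite.
    # KML coordinates are ordered longitude,latitude,altitude.
    pm = ET.Element("Placemark")
    ET.SubElement(pm, "name").text = name
    ET.SubElement(pm, "description").text = description
    point = ET.SubElement(pm, "Point")
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(pm, encoding="unicode")

# Hypothetical geosite near the Spanish Central System
kml_fragment = geosite_placemark("Geosite 1", -6.0, 40.5, "Fold outcrop")
```

A set of such Placemarks would be wrapped in a <kml>/<Document> envelope and loaded into Google Earth as a map layer for the itinerary stops.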
Hoermann, Simon; Ferreira Dos Santos, Luara; Morkisch, Nadine; Jettkowski, Katrin; Sillis, Moran; Devan, Hemakumar; Kanagasabai, Parimala S; Schmidt, Henning; Krüger, Jörg; Dohle, Christian; Regenbrecht, Holger; Hale, Leigh; Cutfield, Nicholas J
2017-07-01
New rehabilitation strategies for post-stroke upper limb rehabilitation employing visual stimulation show promising results; however, cost-efficient and clinically feasible ways to provide these interventions are still lacking. An integral step is to translate recent technological advances, such as in virtual and augmented reality, into therapeutic practice to improve outcomes for patients. This requires research on the adaptation of the technology for clinical use as well as on appropriate guidelines and protocols for sustainable integration into therapeutic routines. Here, we present and evaluate a novel and affordable augmented reality system (Augmented Reflection Technology, ART) in combination with a validated mirror therapy protocol for upper limb rehabilitation after stroke. We evaluated components of the therapeutic intervention, from the patients' and the therapists' points of view, in a clinical feasibility study at a rehabilitation centre. We also assessed the integration of ART as an adjunct therapy for the clinical rehabilitation of subacute patients at two different hospitals. The results showed that the combination and application of the Berlin Protocol for Mirror Therapy together with ART was feasible for clinical use. This combination was integrated into the therapeutic plan of subacute stroke patients at the two clinical locations where the second part of this research was conducted. Our findings pave the way for using technology to provide mirror therapy in clinical settings and show potential for more effective use of inpatient time and enhanced recoveries for patients. Implications for Rehabilitation: Computerised Mirror Therapy is feasible for clinical use. Augmented Reflection Technology can be integrated as an adjunctive therapeutic intervention for subacute stroke patients in an inpatient setting. Virtual Rehabilitation devices such as Augmented Reflection Technology have considerable potential to enhance stroke rehabilitation.
The development, assessment and validation of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Marshall, Karen Benn
1996-01-01
This research project seeks to meet the objective of science training by developing, assessing, validating, and utilizing VR as a human anatomy training medium. Current anatomy instruction is primarily in the form of lectures and textbooks. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment traditional methods of instruction. At many institutions, a lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the two-dimensional depictions found in textbooks and on conventional computer screens. Virtual reality allows one to step through the computer screen into a 3-D artificial world. The primary objective of this project is to produce a virtual reality application of the abdominopelvic region of a human cadaver that can be taken back to the classroom. The hypothesis is that an immersive learning environment affords quicker anatomic recognition and orientation and a greater level of retention in human anatomy instruction. The goal is to augment, not replace, traditional modes of instruction.
An augmented reality system validation for the treatment of cockroach phobia.
Bretón-López, Juani; Quero, Soledad; Botella, Cristina; García-Palacios, Azucena; Baños, Rosa Maria; Alcañiz, Mariano
2010-12-01
Augmented reality (AR) is a new technology in which various virtual elements are incorporated into the user's perception of the real world. The most significant aspect of AR is that the virtual elements add relevant and helpful information to the real scene. AR shares some important characteristics with virtual reality as applied in clinical psychology. However, AR offers additional features that might be crucial for treating certain problems. An AR system designed to treat insect phobia has been used for treating phobia of small animals, and positive preliminary data about the global efficacy of the system have been obtained. However, it is necessary to determine the capacity of similar AR systems and their elements that are designed to evoke anxiety in participants; this is achieved by testing the correspondence between the inclusion of feared stimuli and the induction of anxiety. The objective of the present work is to validate whether the stimuli included in the AR-Insect Phobia system are capable of inducing anxiety in six participants diagnosed with cockroach phobia. Results support the adequacy of each element of the system in inducing anxiety in all participants.
NASA Astrophysics Data System (ADS)
McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.
2017-12-01
Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head-Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with visual data in a virtual environment, whereas in a purely AR application a virtual object is projected into the real world with which researchers can interact. There are several limitations to purely VR or AR applications within the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g., rocks, terrain, or other features) must be created offline from a multitude of images using image processing techniques to generate the 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames lack 3D visual information, i.e., depth information. In this paper, we present a technique that uses a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment overlaid with the real-world stereoscopic video presented in real time. Note the preservation of the objects' shapes, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.
Immersive Collaboration Simulations: Multi-User Virtual Environments and Augmented Realities
NASA Technical Reports Server (NTRS)
Dede, Chris
2008-01-01
Emerging information technologies are driving shifts in the knowledge and skills society values, the development of new methods of teaching and learning, and changes in the characteristics of learning.
The Virtual Clinical Practicum: an innovative telehealth model for clinical nursing education.
Grady, Janet L
2011-01-01
The Virtual Clinical Practicum (VCP) involves a clinical nursing education delivery strategy that uses video teleconferencing technology to address time, distance, and resource barriers. Technology-delivered education can augment the existing curriculum by increasing student access to clinical experts in specialty areas, thus supporting efficient use of faculty resources. This article describes the implementation of the VCP process and student perceptions of its effectiveness and usefulness. The VCP was shown to be a successful method of clinical nursing education, offering students exposure to clinical situations not available by other means. Opportunities for dialogue, critical reflection, and synthesis allowed students to experience the benefits of a traditional experience, enhanced through technology and tailored to the specific needs of the students. Respondents overwhelmingly recommended further use of the VCP to augment existing clinical nursing education methods.
Khademi, Maryam; Hondori, Hossein Mousavi; Dodakian, Lucy; Cramer, Steve; Lopes, Cristina V
2013-01-01
Introducing computer games to the rehabilitation market has led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefits to patients and caregivers, it has inherent limitations, some of which might be solved by replacing it with Augmented Reality (AR). The task of pick-and-place, which is part of many activities of daily living (ADLs), is one of the major affected functions that stroke patients expect to recover. We developed an exercise consisting of moving an object between various points, following a flashing light that indicates the next target. The results show superior performance of subjects in the spatial AR setting versus the non-immersive VR setting. This could be due to the extraneous hand-eye coordination required in VR, which is eliminated in spatial AR.
An augmented-reality edge enhancement application for Google Glass.
Hwang, Alex D; Peli, Eli
2014-08-01
Google Glass provides a platform that can easily be extended to include a vision enhancement tool. We have implemented an augmented vision system on Glass, which overlays enhanced edge information over the wearer's real-world view to provide contrast-improved central vision to Glass wearers. The enhanced central vision can be naturally integrated with visual scanning. Google Glass's camera lens distortions were corrected using image warping. Because the camera and virtual display are horizontally separated by 16 mm, and the camera aiming and virtual display projection angles are off by 10°, the warped camera image had to go through a series of three-dimensional transformations to minimize parallax errors before the final projection to the Glass's see-through virtual display. All image processing was implemented to achieve near real-time performance. The impact of the contrast enhancements was measured for three normal-vision subjects, with and without a diffuser film to simulate vision loss. For all three subjects, significantly improved contrast sensitivity was achieved when they used the edge enhancements with a diffuser film; that is, improvements were measured with simulated visual impairments. The performance boost is limited by the Glass camera's performance, which the authors assume accounts for why improvements were observed only in the diffuser-film condition (simulating low vision). With the benefit of see-through augmented reality edge enhancement, a natural visual scanning process is possible, suggesting that the device may provide better visual function in a cosmetically and ergonomically attractive format for patients with macular degeneration.
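The general edge-enhancement idea can be sketched as a Sobel gradient magnitude added back onto the input image to boost local contrast. This is only an illustration of the technique in general, not the Glass implementation described by the authors (which also handles distortion correction and parallax):

```python
import numpy as np

def sobel_edges(gray):
    # Gradient magnitude via manual 2D convolution with Sobel kernels
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = gray.shape
    out = np.zeros_like(gray, dtype=float)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = gray[i - 1:i + 2, j - 1:j + 2]
            out[i, j] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return out

def overlay_edges(gray, edges, gain=1.0):
    # Add the scaled edge magnitude back onto the image (the "enhanced edge
    # information" overlay), clipped to the 8-bit display range
    return np.clip(gray + gain * edges, 0, 255)
```

On the actual device the overlay is rendered into the see-through display rather than summed into the camera image, so the wearer sees enhanced edges registered over the real scene.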
Shen, Xin; Javidi, Bahram
2018-03-01
We have developed a three-dimensional (3D) dynamic integral-imaging (InIm)-system-based optical see-through augmented reality display with enhanced depth range of a 3D augmented image. A focus-tunable lens is adopted in the 3D display unit to relay the elemental images with various positions to the micro lens array. Based on resolution priority integral imaging, multiple lenslet image planes are generated to enhance the depth range of the 3D image. The depth range is further increased by utilizing both the real and virtual 3D imaging fields. The 3D reconstructed image and the real-world scene are overlaid using an optical see-through display for augmented reality. The proposed system can significantly enhance the depth range of a 3D reconstructed image with high image quality in the micro InIm unit. This approach provides enhanced functionality for augmented information and mitigates the vergence-accommodation conflict of a traditional augmented reality display.
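The role of the focus-tunable relay lens can be illustrated with the thin-lens equation: sweeping the focal length moves the relayed elemental-image plane, which is what allows multiple lenslet image planes and hence an extended depth range. The object distance and focal lengths below are illustrative assumptions, not values from the paper:

```python
# Minimal thin-lens sketch of the tunable relay idea: changing the focal
# length f shifts the relayed elemental-image plane, which in turn shifts
# the central depth plane of the integral-imaging reconstruction.
# The 50 mm object distance and the focal lengths are invented examples.

def image_distance_mm(f_mm, object_mm=50.0):
    """Thin-lens equation 1/f = 1/do + 1/di, solved for di."""
    inv = 1.0 / f_mm - 1.0 / object_mm
    if abs(inv) < 1e-12:
        return float("inf")      # object at the focal plane -> image at infinity
    return 1.0 / inv             # a negative value would mean a virtual image

for f in (30.0, 40.0, 45.0):
    print(f"f = {f:4.1f} mm -> relayed image plane at {image_distance_mm(f):7.1f} mm")
```

Even small focal-length changes move the image plane substantially once the object sits near the focal length, which is consistent with a single tunable lens time-multiplexing several image planes.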
Virtual rehabilitation--benefits and challenges.
Burdea, G C
2003-01-01
To discuss the advantages and disadvantages of rehabilitation applications of virtual reality. VR can be used as an enhancement to conventional therapy for patients with conditions ranging from musculoskeletal problems, to stroke-induced paralysis, to cognitive deficits. This approach is called "VR-augmented rehabilitation." Alternately, VR can replace conventional interventions altogether, in which case the rehabilitation is "VR-based." If the intervention is done at a distance, then it is called "telerehabilitation." Simulation exercises for post-stroke patients have been developed using a "teacher object" approach or a video game approach. Simulations for musculo-skeletal patients use virtual replicas of rehabilitation devices (such as rubber ball, power putty, peg board). Phobia-inducing virtual environments are prescribed for patients with cognitive deficits. VR-augmented rehabilitation has been shown effective for stroke patients in the chronic phase of the disease. VR-based rehabilitation has been improving patients with fear of flying, Vietnam syndrome, fear of heights, and chronic stroke patients. Telerehabilitation interventions using VR have improved musculo-skeletal and post-stroke patients, however less data is available at this time. Virtual reality presents significant advantages when applied to rehabilitation of patients with varied conditions. These advantages include patient motivation, adaptability and variability based on patient baseline, transparent data storage, online remote data access, economy of scale, reduced medical costs. Challenges in VR use for rehabilitation relate to lack of computer skills on the part of therapists, lack of support infrastructure, expensive equipment (initially), inadequate communication infrastructure (for telerehabilitation in rural areas), and patient safety concerns.
Telemedicine with mobile devices and augmented reality for early postoperative care.
Ponce, Brent A; Brabston, Eugene W; Shin Zu; Watson, Shawna L; Baker, Dustin; Winn, Dennis; Guthrie, Barton L; Shenai, Mahesh B
2016-08-01
Advanced features are being added to telemedicine paradigms to enhance usability and usefulness. Virtual Interactive Presence (VIP) is a technology that allows a surgeon and patient to interact in a "merged reality" space, facilitating verbal, visual, and manual interaction. In this clinical study, a mobile VIP iOS application was introduced into routine post-operative orthopedic and neurosurgical care. Survey responses endorse the usefulness of this tool: the virtual interaction provides needed follow-up in instances where in-person follow-up may be limited, and enhances the subjective patient experience.
Feasibility of virtual reality augmented cycling for health promotion of people poststroke.
Deutsch, Judith E; Myslinski, Mary Jane; Kafri, Michal; Ranky, Richard; Sivak, Mark; Mavroidis, Constantinos; Lewis, Jeffrey A
2013-09-01
A virtual reality (VR) augmented cycling kit (VRACK) was developed to address motor control and fitness deficits of individuals with chronic stroke. In this article, we report on the safety, feasibility, and efficacy of using the VR augmented cycling kit to improve cardiorespiratory (CR) fitness of individuals in the chronic phase poststroke. Four individuals with chronic stroke (47-65 years old and ≥3 years poststroke), with residual lower extremity impairments (Fugl-Meyer 24-26/34), who were limited community ambulators (gait speed range 0.56-1.1 m/s) participated in this study. Safety was defined as the absence of adverse events. Feasibility was measured using attendance, total exercise time, and "involvement" measured with the presence questionnaire (PQ). Efficacy of CR fitness was evaluated using a submaximal bicycle ergometer test before and after an 8-week training program. The intervention was safe and feasible, with no adverse events, 100% adherence, participants achieving between 90 and 125 minutes of cycling each week, and a mean PQ score of 39 (SD 3.3). There was a statistically significant (13%; P = 0.035) improvement in peak VO(2), with a range of 6% to 24.5%. For these individuals, poststroke, VR augmented cycling, using their heart rate to set their avatar's speed, fostered training of sufficient duration and intensity to promote CR fitness. In addition, there was a transfer of training from the bicycle to walking endurance. VR augmented cycling may be an addition to the therapist's tools for concurrent training of mobility and health promotion of individuals poststroke.
Wagner, A; Ploder, O; Enislidis, G; Truppe, M; Ewers, R
1996-04-01
Interventional video tomography (IVT), a new imaging modality, achieves virtual visualization of anatomic structures in three dimensions for intraoperative stereotactic navigation. Partial immersion into a virtual data space, which is orthotopically coregistered to the surgical field, enhances, by means of a see-through head-mounted display (HMD), the surgeon's visual perception and technique by providing visual access to nonvisual data of anatomy, physiology, and function. The presented cases document the potential of augmented reality environments in maxillofacial surgery.
Mixed Reality Technology at NASA JPL
2016-05-16
NASA's JPL is a center of innovation in virtual and augmented reality, producing groundbreaking applications of these technologies to support a variety of missions. This video is a collection of unedited scenes released to the media.
A standardized set of 3-D objects for virtual reality research and applications.
Peeters, David
2018-06-01
The use of immersive virtual reality as a research tool is rapidly increasing in numerous scientific disciplines. By combining ecological validity with strict experimental control, immersive virtual reality provides the potential to develop and test scientific theories in rich environments that closely resemble everyday settings. This article introduces the first standardized database of colored three-dimensional (3-D) objects that can be used in virtual reality and augmented reality research and applications. The 147 objects have been normed for name agreement, image agreement, familiarity, visual complexity, and corresponding lexical characteristics of the modal object names. The availability of standardized 3-D objects for virtual reality research is important, because reaching valid theoretical conclusions hinges critically on the use of well-controlled experimental stimuli. Sharing standardized 3-D objects across different virtual reality labs will allow for science to move forward more quickly.
Nanotechnology: toxicologic pathology.
Hubbs, Ann F; Sargent, Linda M; Porter, Dale W; Sager, Tina M; Chen, Bean T; Frazer, David G; Castranova, Vincent; Sriram, Krishnan; Nurkiewicz, Timothy R; Reynolds, Steven H; Battelli, Lori A; Schwegler-Berry, Diane; McKinney, Walter; Fluharty, Kara L; Mercer, Robert R
2013-02-01
Nanotechnology involves technology, science, and engineering in dimensions less than 100 nm. A virtually infinite number of potential nanoscale products can be produced from many different molecules and their combinations. The exponentially increasing number of nanoscale products will solve critical needs in engineering, science, and medicine. However, the virtually infinite number of potential nanotechnology products is a challenge for toxicologic pathologists. Because of their size, nanoparticulates can have therapeutic and toxic effects distinct from micron-sized particulates of the same composition. In the nanoscale, distinct intercellular and intracellular translocation pathways may provide a different distribution than that obtained by micron-sized particulates. Nanoparticulates interact with subcellular structures including microtubules, actin filaments, centrosomes, and chromatin; interactions that may be facilitated in the nanoscale. Features that distinguish nanoparticulates from fine particulates include increased surface area per unit mass and quantum effects. In addition, some nanotechnology products, including the fullerenes, have a novel and reactive surface. Augmented microscopic procedures including enhanced dark-field imaging, immunofluorescence, field-emission scanning electron microscopy, transmission electron microscopy, and confocal microscopy are useful when evaluating nanoparticulate toxicologic pathology. Thus, the pathology assessment is facilitated by understanding the unique features at the nanoscale and the tools that can assist in evaluating nanotoxicology studies.
Using virtual reality to augment perception, enhance sensorimotor adaptation, and change our minds
Wright, W. Geoffrey
2014-01-01
Technological advances that involve human sensorimotor processes can have both intended and unintended effects on the central nervous system (CNS). This mini review focuses on the use of virtual environments (VE) to augment brain functions by enhancing perception, eliciting automatic motor behavior, and inducing sensorimotor adaptation. VE technology is becoming increasingly prevalent in medical rehabilitation, training simulators, gaming, and entertainment. Although these VE applications have often been shown to optimize outcomes, whether it be to speed recovery, reduce training time, or enhance immersion and enjoyment, there are inherent drawbacks to environments that can potentially change sensorimotor calibration. Across numerous VE studies over the years, we have investigated the effects of combining visual and physical motion on perception, motor control, and adaptation. Recent results from our research involving exposure to dynamic passive motion within a visually depicted VE reveal that short-term exposure to augmented sensorimotor discordance can result in systematic aftereffects that last beyond the exposure period. Whether these adaptations are advantageous or not remains to be seen. Benefits as well as risks of using VE-driven sensorimotor stimulation to enhance brain processes will be discussed. PMID:24782724
Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin
2017-03-27
Augmented reality (AR), which uses computer-aided projected information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges regarding the optical system in an AR system: registration, vision correction, and readability under strong ambient light. Here, we solve all three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated into an optical see-through AR system. One of the LC lenses is used to electrically adjust the position of the projected virtual image, the so-called registration. The other LC lens, with a larger aperture and polarization-independent characteristics, is in charge of vision correction, such as for myopia and presbyopia. The linearity of the lens powers of the two LC lenses is also discussed. The readability of virtual images under strong ambient light is addressed by the electrically switchable transmittance of the LC attenuator, which originates from light scattering and light absorption. The concept demonstrated in this paper could be further extended to other electro-optical devices as long as the devices exhibit the capability of phase and amplitude modulation.
Optical augmented reality assisted navigation system for neurosurgery teaching and planning
NASA Astrophysics Data System (ADS)
Wu, Hui-Qun; Geng, Xing-Yun; Wang, Li; Zhang, Yuan-Peng; Jiang, Kui; Tang, Le-Min; Zhou, Guo-Min; Dong, Jian-Cheng
2013-07-01
This paper proposed a convenient navigation system for neurosurgeons' pre-operative planning and teaching using an augmented reality (AR) technique, which maps three-dimensional reconstructed virtual anatomical structures onto a skull model. The system comprises two parts: a virtual reality system and a skull model scene. In our experiment, a 73-year-old right-handed man initially diagnosed with astrocytoma was selected as an example to verify our system. His imaging data from different modalities were registered, and the skull soft tissue, brain, internal vessels, and tumor were reconstructed. The reconstructed models were then overlaid on the real scene. Our findings showed that the reconstructed tissues were augmented into the real scene and the registration results were in good alignment. The reconstructed brain tissue was well distributed in the skull cavity. A probe was used by a neurosurgeon to explore a surgical pathway that could reach the tumor directly without injuring important vessels. In this way, the learning cost for students and the cost of educating patients about surgical risks are reduced. Therefore, this system could be a selective protocol for image-guided surgery (IGS), and it is promising for neurosurgeons' pre-operative planning and teaching.
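Overlaying reconstructed models on a physical skull model implies a rigid registration step. The abstract does not specify the algorithm; as a hedged sketch, paired-fiducial Kabsch/Procrustes alignment is a common choice in image-guided surgery, and the fiducial coordinates below are synthetic:

```python
import numpy as np

# Sketch of rigid, point-based registration: given paired fiducials on the
# skull model (p) and in image space (q), recover the rotation R and
# translation t that minimize ||R @ p_i + t - q_i|| (Kabsch method).

def rigid_register(p, q):
    """Least-squares rigid transform mapping points p (Nx3) onto q (Nx3)."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    h = (p - pc).T @ (q - qc)                  # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, qc - r @ pc

# Synthetic check: recover a known 20 deg rotation about z plus a translation.
rng = np.random.default_rng(0)
p = rng.uniform(-50, 50, size=(6, 3))               # model-space fiducials (mm)
theta = np.deg2rad(20)
r_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
q = p @ r_true.T + np.array([5.0, -3.0, 12.0])      # image-space fiducials
r_est, t_est = rigid_register(p, q)
fre = np.linalg.norm(p @ r_est.T + t_est - q, axis=1).mean()
print(f"mean fiducial registration error: {fre:.2e} mm")
```

With noise-free fiducials the fiducial registration error is essentially zero; the "good alignment" reported above corresponds to keeping this error small with real, noisy landmarks.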
Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.
Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson
2016-04-04
We proposed a switchable beam steering device with cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts the incident circularly polarized light to the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading 2-3 CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we proposed two approaches to obtain the same diffraction angle for red, green, and blue LED-based full-color displays. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for emerging VR/AR displays.
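The cascading idea can be sketched numerically: if each CDW stage deflects by +theta_i or -theta_i depending on the circular-polarization handedness set by its switch, N binary stages yield up to 2^N steering states. Treating the deflections as additive is a small-angle simplification, and the per-stage angles below are invented, not values from the paper:

```python
from itertools import product

# Each CDW stage contributes +a or -a (sign chosen by its polarization
# switch); enumerating all sign combinations gives the reachable angles.
# Angles are treated as additive, a small-angle approximation of the
# grating equation.

def steering_angles(stage_deg):
    """All combined deflection angles for a cascade of CDW stages."""
    return sorted({sum(s * a for s, a in zip(signs, stage_deg))
                   for signs in product((+1, -1), repeat=len(stage_deg))})

print(steering_angles([1.0]))            # 2 states
print(steering_angles([1.0, 2.0]))       # 4 states
print(steering_angles([1.0, 2.0, 4.0]))  # 8 states, -7 to +7 in steps of 2
```

Choosing stage angles in a 1:2:4 ratio makes the 2^N states evenly spaced, which is one plausible reason to cascade two or three CDWs rather than fabricate one per angle.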
Linte, Cristian A; White, James; Eagleson, Roy; Guiraudon, Gérard M; Peters, Terry M
2010-01-01
Virtual and augmented reality environments have been adopted in medicine as a means to enhance the clinician's view of the anatomy and facilitate the performance of minimally invasive procedures. Their value is truly appreciated during interventions where the surgeon cannot directly visualize the targets to be treated, such as during cardiac procedures performed on the beating heart. These environments must accurately represent the real surgical field and require seamless integration of pre- and intra-operative imaging, surgical tracking, and visualization technology in a common framework centered around the patient. This review begins with an overview of minimally invasive cardiac interventions, describes the architecture of a typical surgical guidance platform including imaging, tracking, registration and visualization, highlights both clinical and engineering accuracy limitations in cardiac image guidance, and discusses the translation of the work from the laboratory into the operating room together with typically encountered challenges.
NASA Technical Reports Server (NTRS)
Schulte, Erin
2017-01-01
As augmented and virtual reality grow in popularity, and more researchers focus on their development, other fields of technology have grown in the hopes of integrating with the up-and-coming hardware currently on the market. Namely, there has been a focus on how to make an intuitive, hands-free human-computer interaction (HCI) system utilizing AR and VR that allows users to control their technology with little to no physical interaction with hardware. Computer vision, which is utilized in devices such as the Microsoft Kinect, webcams, and other similar hardware, has shown potential in assisting with the development of an HCI system that requires next to no human interaction with computing hardware and software. Object and facial recognition are two subsets of computer vision, both of which can be applied to HCI systems in the fields of medicine, security, industrial development, and other similar areas.
de Bruin, E D; Schoene, D; Pichierri, G; Smith, S T
2010-08-01
Virtual augmented exercise, an emerging technology that can help to promote physical activity and combine the strengths of indoor and outdoor exercise, has recently been proposed as having the potential to increase exercise behavior in older adults. By creating a strong presence in a virtual, interactive environment, distraction can be taken to greater levels while maintaining the benefits of indoor exercises, which may result in a shift from negative to positive thoughts about exercise. Recent findings on young participants show that virtual reality training enhances mood, thus increasing enjoyment and energy. For older adults, virtual, interactive environments can influence postural control and fall events by stimulating the sensory cues that are responsible for maintaining balance and orientation. However, the potential of virtual reality training has yet to be explored for older adults. This manuscript describes the potential of dance pad training protocols in the elderly and reports on the theoretical rationale of combining physical game-like exercises with sensory and cognitive challenges in a virtual environment.
Visual Stability of Objects and Environments Viewed through Head-Mounted Displays
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Adelstein, Bernard D.
2015-01-01
Virtual Environments (aka Virtual Reality) are again catching the public imagination, and a number of startups (e.g. Oculus) and even not-so-startup companies (e.g. Microsoft) are trying to develop display systems to capitalize on this renewed interest. All acknowledge that this time they will get it right by providing the required dynamic fidelity, visual quality, and interesting content for the concept of VR to take off and change the world in ways it failed to do in past incarnations. Some of the surprisingly long historical background of the direct-simulation technology that underlies virtual environment and augmented reality displays will be briefly reviewed. An example of a mid-1990s augmented reality display system with good dynamic performance from our lab will be used to illustrate some of the underlying phenomena and technology concerning the visual stability of virtual environments and objects during movement. In conclusion, some idealized performance characteristics for a reference system will be proposed. Interestingly, many systems more or less on the market now may actually meet many of these proposed technical requirements. This observation leads to the conclusion that the current success of the IT firms trying to commercialize the technology will depend on the hidden costs of using the systems as well as the development of interesting and compelling content.
Gallizzi, Michael A.; Kuhns, Craig A.; Jenkins, Tyler J.; Pfeiffer, Ferris M.
2014-01-01
Study Design Biomechanical analysis of lateral mass screw pullout strength. Objective We compare the pullout strength of our bone cement–revised lateral mass screw with the standard lateral mass screw. Methods In cadaveric cervical spines, we simulated lateral mass screw “cutouts” unilaterally from C3 to C7. We salvaged fixation in the cutout side with polymethyl methacrylate (PMMA) or Cortoss cement (Orthovita, Malvern, Pennsylvania, United States), allowed the cement to harden, and then drilled and placed lateral mass screws back into the cement-augmented lateral masses. On the contralateral side, we placed standard lateral mass screws into the native, or normal lateral, masses and then compared pullout strength of the cement-augmented side to the standard lateral mass screw. For pullout testing, each augmentation group was fixed to a servohydraulic load frame and a specially designed pullout fixture was attached to each lateral mass screw head. Results Quick-mix PMMA-salvaged lateral mass screws required greater force to fail when compared with native lateral mass screws. Cortoss cement and PMMA standard-mix cement-augmented screws demonstrated less strength of fixation when compared with control-side lateral mass screws. Attempts at a second round of cement salvage of the same lateral masses led to more variations in load to failure, but quick-mix PMMA again demonstrated greater load to failure when compared with the nonaugmented control lateral mass screws. Conclusion Quick-mix PMMA cement revision equips the spinal surgeon with a much needed salvage option for a failed lateral mass screw in the subaxial cervical spine. PMID:25649421
Gibby, Jacob T; Swenson, Samuel A; Cvetko, Steve; Rao, Raj; Javan, Ramin
2018-06-22
Augmented reality has potential to enhance surgical navigation and visualization. We determined whether head-mounted display augmented reality (HMD-AR) with superimposed computed tomography (CT) data could allow the wearer to percutaneously guide pedicle screw placement in an opaque lumbar model with no real-time fluoroscopic guidance. CT imaging was obtained of a phantom composed of L1-L3 Sawbones vertebrae in opaque silicone. Preprocedural planning was performed by creating virtual trajectories of appropriate angle and depth for ideal approach into the pedicle, and these data were integrated into the Microsoft HoloLens using the Novarad OpenSight application allowing the user to view the virtual trajectory guides and CT images superimposed on the phantom in two and three dimensions. Spinal needles were inserted following the virtual trajectories to the point of contact with bone. Repeat CT revealed actual needle trajectory, allowing comparison with the ideal preprocedural paths. Registration of AR to phantom showed a roughly circular deviation with maximum average radius of 2.5 mm. Users took an average of 200 s to place a needle. Extrapolation of needle trajectory into the pedicle showed that of 36 needles placed, 35 (97%) would have remained within the pedicles. Needles placed approximated a mean distance of 4.69 mm in the mediolateral direction and 4.48 mm in the craniocaudal direction from pedicle bone edge. To our knowledge, this is the first peer-reviewed report and evaluation of HMD-AR with superimposed 3D guidance utilizing CT for spinal pedicle guide placement for the purpose of cannulation without the use of fluoroscopy.
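The trajectory-extrapolation step (extending each needle's entry point along its measured direction to pedicle depth, then measuring mediolateral and craniocaudal offsets from the planned path) can be sketched as follows. All coordinates, directions, and depths are invented for illustration; they are not the study's data:

```python
import numpy as np

# Extend a needle from its entry point along its (unit) direction, then
# report the mediolateral (x) and craniocaudal (z) offsets from the
# planned trajectory's endpoint, mirroring the deviations reported above.

def extrapolate(entry, direction, depth_mm):
    """Point reached after advancing depth_mm along a direction vector."""
    d = np.asarray(direction, dtype=float)
    return np.asarray(entry, dtype=float) + depth_mm * d / np.linalg.norm(d)

def deviation_mm(actual_tip, planned_tip):
    """Mediolateral (x) and craniocaudal (z) offsets from the plan."""
    dx, _, dz = np.abs(np.asarray(actual_tip) - np.asarray(planned_tip))
    return dx, dz

planned = extrapolate([0.0, 0.0, 0.0], [0.2, 1.0, 0.1], 40.0)   # ideal path
actual = extrapolate([0.5, 0.0, 0.3], [0.25, 1.0, 0.12], 40.0)  # placed needle
ml, cc = deviation_mm(actual, planned)
print(f"mediolateral {ml:.1f} mm, craniocaudal {cc:.1f} mm")
```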
Villiger, Michael; Bohli, Dominik; Kiper, Daniel; Pyk, Pawel; Spillmann, Jeremy; Meilick, Bruno; Curt, Armin; Hepp-Reymond, Marie-Claude; Hotz-Boendermaker, Sabina; Eng, Kynan
2013-10-01
Neurorehabilitation interventions to improve lower limb function and neuropathic pain have had limited success in people with chronic, incomplete spinal cord injury (iSCI). We hypothesized that intense virtual reality (VR)-augmented training of observed and executed leg movements would improve limb function and neuropathic pain. Patients used a VR system with a first-person view of virtual lower limbs, controlled via movement sensors fitted to the patient's own shoes. Four tasks were used to deliver intensive training of individual muscles (tibialis anterior, quadriceps, leg ad-/abductors). The tasks engaged motivation through feedback of task success. Fourteen chronic iSCI patients were treated over 4 weeks in 16 to 20 sessions of 45 minutes. Outcome measures were 10 Meter Walking Test, Berg Balance Scale, Lower Extremity Motor Score, Spinal Cord Independence Measure, Locomotion and Neuropathic Pain Scale (NPS), obtained at the start and at 4 to 6 weeks before intervention. In addition to positive changes reported by the patients (Patients' Global Impression of Change), measures of walking capacity, balance, and strength revealed improvements in lower limb function. Intensity and unpleasantness of neuropathic pain in half of the affected participants were reduced on the NPS test. Overall findings remained stable 12 to 16 weeks after termination of the training. In a pretest/posttest, uncontrolled design, VR-augmented training was associated with improvements in motor function and neuropathic pain in persons with chronic iSCI, several of which reached the level of a minimal clinically important change. A controlled trial is needed to compare this intervention to active training alone or in combination.
von Segesser, Ludwig Karl; Berdajs, Denis; Abdel-Sayed, Saad; Tozzi, Piergiorgio; Ferrari, Enrico; Maisano, Francesco
2016-01-01
Inadequate venous drainage during minimally invasive cardiac surgery becomes most evident when the blood trapped in the pulmonary circulation floods the surgical field. The present study was designed to assess the in vivo performance of new, thinner, virtually wall-less, venous cannulas designed for augmented venous drainage in comparison to traditional thin-wall cannulas. Remote cannulation was realized in 5 bovine experiments (74.0 ± 2.4 kg) with percutaneous venous access over the wire, serial dilation up to 18 F and insertion of either traditional 19 F thin wall, wire-wound cannulas, or through the same access channel, new, thinner, virtually wall-less, braided cannulas designed for augmented venous drainage. A standard minimal extracorporeal circuit set with a centrifugal pump and a hollow fiber membrane oxygenator, but no in-line reservoir was used. One hundred fifty pairs of pump-flow and required pump inlet pressure values were recorded with calibrated pressure transducers and a flowmeter calibrated by a volumetric tank and timer at increasing pump speed from 1500 RPM to 3500 RPM (500-RPM increments). Pump flow accounted for 1.73 ± 0.85 l/min for wall-less versus 1.17 ± 0.45 l/min for thin wall at 1500 RPM, 3.91 ± 0.86 versus 3.23 ± 0.66 at 2500 RPM, 5.82 ± 1.05 versus 4.96 ± 0.81 at 3500 RPM. Pump inlet pressure accounted for 9.6 ± 9.7 mm Hg versus 4.2 ± 18.8 mm Hg for 1500 RPM, -42.4 ± 26.7 versus -123 ± 51.1 at 2500 RPM, and -126.7 ± 55.3 versus -313 ± 116.7 for 3500 RPM. At the well-accepted pump inlet pressure of -80 mm Hg, the new, thinner, virtually wall-less, braided cannulas provide unmatched venous drainage in vivo. Early clinical analyses have confirmed these findings.
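The reported pressure and flow pairs can be linearly interpolated to compare the two cannulas at the quoted -80 mm Hg operating point. The linear interpolation between the three measured RPM settings is our simplification for illustration, not the authors' analysis:

```python
import numpy as np

# Pump inlet pressure (mm Hg) and pump flow (l/min) at 1500 / 2500 / 3500
# RPM, as reported in the abstract, for each cannula type.
wall_less = {"p": [9.6, -42.4, -126.7], "q": [1.73, 3.91, 5.82]}
thin_wall = {"p": [4.2, -123.0, -313.0], "q": [1.17, 3.23, 4.96]}

def flow_at(pressure_mmhg, data):
    """Flow at a given inlet pressure by linear interpolation
    (np.interp needs ascending x, so reverse the arrays)."""
    return float(np.interp(pressure_mmhg, data["p"][::-1], data["q"][::-1]))

for name, data in (("wall-less", wall_less), ("thin-wall", thin_wall)):
    print(f"{name}: {flow_at(-80.0, data):.2f} l/min at -80 mm Hg")
```

At -80 mm Hg the interpolated wall-less flow is roughly double the thin-wall flow, which is consistent with the abstract's conclusion of "unmatched venous drainage" at that pressure.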
Yudkowsky, Rachel; Luciano, Cristian; Banerjee, Pat; Schwartz, Alan; Alaraj, Ali; Lemole, G Michael; Charbel, Fady; Smith, Kelly; Rizzi, Silvio; Byrne, Richard; Bendok, Bernard; Frim, David
2013-02-01
Ventriculostomy is a neurosurgical procedure for providing therapeutic cerebrospinal fluid drainage. Complications may arise during repeated attempts at placing the catheter in the ventricle. We studied the impact of simulation-based practice with a library of virtual brains on neurosurgery residents' performance in simulated and live surgical ventriculostomies. Using computed tomographic scans of actual patients, we developed a library of 15 virtual brains for the ImmersiveTouch system, a head- and hand-tracked augmented reality and haptic simulator. The virtual brains represent a range of anatomies including normal, shifted, and compressed ventricles. Neurosurgery residents participated in individual simulator practice on the library of brains including visualizing the 3-dimensional location of the catheter within the brain immediately after each insertion. Performance of participants on novel brains in the simulator and during actual surgery before and after intervention was analyzed using generalized linear mixed models. Simulator cannulation success rates increased after intervention, and live procedure outcomes showed improvement in the rate of successful cannulation on the first pass. However, the incidence of deeper, contralateral (simulator) and third-ventricle (live) placements increased after intervention. Residents reported that simulations were realistic and helpful in improving procedural skills such as aiming the probe, sensing the pressure change when entering the ventricle, and estimating how far the catheter should be advanced within the ventricle. Simulator practice with a library of virtual brains representing a range of anatomies and difficulty levels may improve performance, potentially decreasing complications due to inexpert technique.
The Virtual Tablet: Virtual Reality as a Control System
NASA Technical Reports Server (NTRS)
Chronister, Andrew
2016-01-01
In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. 
My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
Real-time 3D image reconstruction guidance in liver resection surgery.
Soler, Luc; Nicolau, Stephane; Pessaux, Patrick; Mutter, Didier; Marescaux, Jacques
2014-04-01
Minimally invasive surgery represents one of the main evolutions of surgical techniques. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. From a patient's medical image [US, computed tomography (CT) or MRI], we have developed an Augmented Reality (AR) system that increases the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We have then developed several 3D visualization and surgical planning software tools to combine direct volume rendering and surface rendering. Finally, we have developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were performed, demonstrating the accuracy of 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures were performed, illustrating the potential clinical benefit of such assistance in gaining safety, but also the current limits that automatic augmented reality will overcome. Virtual patient modeling should be mandatory for certain interventions that have now to be defined, such as liver surgery. Augmented reality is clearly the next step in new surgical instrumentation but currently remains limited due to the complexity of organ deformations during surgery. Intraoperative medical imaging used in a new generation of automated augmented reality should solve this issue thanks to the development of the hybrid OR.
Augmented Reality Imaging System: 3D Viewing of a Breast Cancer.
Douglas, David B; Boone, John M; Petricoin, Emanuel; Liotta, Lance; Wilson, Eugene
2016-01-01
To display images of breast cancer from a dedicated breast CT using Depth 3-Dimensional (D3D) augmented reality. A case of breast cancer imaged using contrast-enhanced breast CT (Computed Tomography) was viewed with the augmented reality imaging system, which uses a head display unit (HDU) and a joystick control interface. The augmented reality system demonstrated 3D viewing of the breast mass with head position tracking, stereoscopic depth perception, and focal point convergence, and a 3D cursor and joystick enabled a fly-through with visualization of the spiculations extending from the breast cancer. The augmented reality system provided 3D visualization of the breast cancer with depth perception and visualization of the mass's spiculations. The augmented reality system should be further researched to determine its utility in clinical practice.
NASA Astrophysics Data System (ADS)
Potter, Michael; Bensch, Alexander; Dawson-Elli, Alexander; Linte, Cristian A.
2015-03-01
In minimally invasive surgical interventions direct visualization of the target area is often not available. Instead, clinicians rely on images from various sources, along with surgical navigation systems for guidance. These spatial localization and tracking systems function much like the Global Positioning Systems (GPS) that we are all well familiar with. In this work we demonstrate how the video feed from a typical camera, which could mimic a laparoscopic or endoscopic camera used during an interventional procedure, can be used to identify the pose of the camera with respect to the viewed scene and augment the video feed with computer-generated information, such as rendering of internal anatomy not visible beyond the imaged surface, resulting in a simple augmented reality environment. This paper describes the software and hardware environment and methodology for augmenting the real world with virtual models extracted from medical images to provide enhanced visualization beyond the surface view achieved using traditional imaging. Following intrinsic and extrinsic camera calibration, the technique was implemented and demonstrated using a LEGO structure phantom, as well as a 3D-printed patient-specific left atrial phantom. We assessed the quality of the overlay according to fiducial localization, fiducial registration, and target registration errors, as well as the overlay offset error. Using the software extensions we developed in conjunction with common webcams it is possible to achieve tracking accuracy comparable to that seen with significantly more expensive hardware, leading to target registration errors on the order of 2 mm.
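The fiducial and target registration errors mentioned above are, in essence, root-mean-square distances between corresponding points after applying the tracked transform. A minimal sketch of that computation, using a hypothetical identity transform and point sets offset by 2 mm (not the paper's data):

```python
import math

def apply_rigid(T, p):
    # T = (R, t): 3x3 rotation (row-major tuples) and translation vector
    R, t = T
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

def rms_error(T, moving, fixed):
    # root-mean-square distance between transformed points and their targets
    sq = 0.0
    for p, q in zip(moving, fixed):
        p2 = apply_rigid(T, p)
        sq += sum((a - b) ** 2 for a, b in zip(p2, q))
    return math.sqrt(sq / len(moving))

# identity transform; each "moving" fiducial sits 2 mm off its target
I3 = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
T = (I3, (0.0, 0.0, 0.0))
fixed  = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
moving = [(2, 0, 0), (12, 0, 0), (2, 10, 0)]
print(rms_error(T, moving, fixed))  # 2.0
```

Evaluated on fiducials this residual is the fiducial registration error; evaluated on independent targets it is the target registration error, the ~2 mm figure the paper reports.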
Vogt, Tobias; Herpers, Rainer; Askew, Christopher D.; Scherfgen, David; Strüder, Heiko K.; Schneider, Stefan
2015-01-01
Virtual reality environments are increasingly being used to encourage individuals to exercise more regularly, including as part of treatment those with mental health or neurological disorders. The success of virtual environments likely depends on whether a sense of presence can be established, where participants become fully immersed in the virtual environment. Exposure to virtual environments is associated with physiological responses, including cortical activation changes. Whether the addition of a real exercise within a virtual environment alters sense of presence perception, or the accompanying physiological changes, is not known. In a randomized and controlled study design, moderate-intensity Exercise (i.e., self-paced cycling) and No-Exercise (i.e., automatic propulsion) trials were performed within three levels of virtual environment exposure. Each trial was 5 minutes in duration and was followed by posttrial assessments of heart rate, perceived sense of presence, EEG, and mental state. Changes in psychological strain and physical state were generally mirrored by neural activation patterns. Furthermore, these changes indicated that exercise augments the demands of virtual environment exposures and this likely contributed to an enhanced sense of presence. PMID:26366305
A Survey of Mobile and Wireless Technologies for Augmented Reality Systems (Preprint)
2008-02-01
Windows XP. A number of researchers have started employing them in AR simulations such as Wagner et al [25], Newman et al [46] and specifically the Sony ...different music clubs and styles of music according to the selection and tastes of the listeners. In the intro sequence the user can select an animated...3-D character (avatar) as his or her virtual persona and visit the different music rooms in the virtual disco. Users can download or stream music in
Virtual reality and planetary exploration
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1992-01-01
NASA-Ames is intensively developing virtual-reality (VR) capabilities that can extend and augment computer-generated and remote spatial environments. VR is envisioned not only as a basis for improving human/machine interactions involved in planetary exploration, but also as a medium for the more widespread sharing of the experience of exploration, thereby broadening the support base for lunar and planetary-exploration endeavors. Imagery representative of Mars is being gathered for VR presentation at such terrestrial sites as Antarctica and Death Valley.
Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders
Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe
2015-01-01
Augmented Reality is a new technological system that allows the introduction of virtual content into the real world so that virtual and real elements coexist in the same representation, in real time, enhancing the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to the physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that delineate Augmented Reality as a new technique useful for psychology. PMID:26339283
Rohmer, Kai; Jendersie, Johannes; Grosch, Thorsten
2017-11-01
Augmented Reality offers many applications today, especially on mobile devices. Due to the lack of mobile hardware for illumination measurements, photorealistic rendering with consistent appearance of virtual objects is still an area of active research. In this paper, we present a full two-stage pipeline for environment acquisition and augmentation of live camera images using a mobile device with a depth sensor. We show how to directly work on a recorded 3D point cloud of the real environment containing high dynamic range color values. For unknown and automatically changing camera settings, a color compensation method is introduced. Based on this, we show photorealistic augmentations using variants of differential light simulation techniques. The presented methods are tailored for mobile devices and run at interactive frame rates. However, our methods are scalable to trade performance for quality and can produce quality renderings on desktop hardware.
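The differential light simulation the authors build on can be summarized per pixel as: output = camera image + (rendered scene with the virtual object - rendered scene without it). A minimal single-channel sketch of that compositing step, with hypothetical function and variable names:

```python
def differential_composite(camera, with_virtual, without_virtual):
    # Differential rendering: add to the camera image only the *change*
    # in simulated radiance caused by inserting the virtual object, so
    # shadows and reflections it casts appear on the real background.
    out = []
    for c, lw, lo in zip(camera, with_virtual, without_virtual):
        v = c + (lw - lo)
        out.append(min(max(v, 0.0), 1.0))  # clamp to displayable range
    return out

# a virtual object casting a shadow darkens the camera pixel: 0.5 -> 0.4
print(round(differential_composite([0.5], [0.3], [0.4])[0], 2))  # 0.4
```

Where the virtual object is directly visible, the rendered term dominates; elsewhere only its lighting effects are transferred onto the live camera feed.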
Ponce, Brent A; Menendez, Mariano E; Oladeji, Lasun O; Fryberger, Charles T; Dantuluri, Phani K
2014-11-01
The authors describe the first surgical case adopting the combination of real-time augmented reality and wearable computing devices such as Google Glass (Google Inc, Mountain View, California). A 66-year-old man presented to their institution for a total shoulder replacement after 5 years of progressive right shoulder pain and decreased range of motion. Throughout the surgical procedure, Google Glass was integrated with the Virtual Interactive Presence and Augmented Reality system (University of Alabama at Birmingham, Birmingham, Alabama), enabling the local surgeon to interact with the remote surgeon within the local surgical field. Surgery was well tolerated by the patient and early surgical results were encouraging, with an improvement of shoulder pain and greater range of motion. The combination of real-time augmented reality and wearable computing devices such as Google Glass holds much promise in the field of surgery. Copyright 2014, SLACK Incorporated.
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
NASA Astrophysics Data System (ADS)
Clay, Alexis; Delord, Elric; Couture, Nadine; Domenger, Gaël
We describe the joint research that we conduct in gesture-based emotion recognition and virtual augmentation of a stage, bridging together the fields of computer science and dance. After establishing a common ground for dialogue, we could conduct a research process that equally benefits both fields. For us as computer scientists, dance is a perfect application case, and dancers' artistic creativity orients our research choices. For us as dancers, computer science provides new tools for creativity and, more importantly, a new point of view that forces us to reconsider dance from its fundamentals. In this paper we therefore describe our scientific work and its implications for dance. We provide an overview of our system to augment a ballet stage, taking a dancer's emotion into account. To illustrate our work in both fields, we describe three events that mixed dance, emotion recognition, and augmented reality.
Suitability of virtual prototypes to support human factors/ergonomics evaluation during the design.
Aromaa, Susanna; Väänänen, Kaisa
2016-09-01
In recent years, the use of virtual prototyping has increased in product development processes, especially in the assessment of complex systems targeted at end-users. The purpose of this study was to evaluate the suitability of virtual prototyping to support human factors/ergonomics (HFE) evaluation during the design phase. Two different virtual prototypes were used: augmented reality (AR) and virtual environment (VE) prototypes of a maintenance platform of a rock crushing machine. Nineteen designers and other stakeholders were asked to assess the suitability of the prototype for HFE evaluation. Results indicate that the system model characteristics and user interface affect the experienced suitability. The VE system was rated as more suitable for supporting the assessment of visibility, reach, and the use of tools than the AR system. The findings of this study can be used as guidance for implementing virtual prototypes in the product development process. Copyright © 2016 Elsevier Ltd. All rights reserved.
Comparative evaluation of monocular augmented-reality display for surgical microscopes.
Rodriguez Palma, Santiago; Becker, Brian C; Lobes, Louis A; Riviere, Cameron N
2012-01-01
Medical augmented reality has undergone much development recently. However, there is a lack of studies quantitatively comparing the different display options available. This paper compares the effects of different graphical overlay systems in a simple micromanipulation task with "soft" visual servoing. We compared positioning accuracy in a real-time visually-guided task using Micron, an active handheld tremor-canceling microsurgical instrument, using three different displays: 2D screen, 3D screen, and microscope with monocular image injection. Tested with novices and an experienced vitreoretinal surgeon, display of virtual cues in the microscope via an augmented reality injection system significantly decreased 3D error (p < 0.05) compared to the 2D and 3D monitors when confounding factors such as magnification level were normalized.
Bosc, R; Fitoussi, A; Pigneur, F; Tacher, V; Hersant, B; Meningaud, J-P
2017-08-01
Augmented reality on smart glasses allows the surgeon to visualize three-dimensional virtual objects during surgery, superimposed in real time on the anatomy of the patient. This makes it possible to preserve the view of the surgical field and to have added computerized information at hand without the need for a physical surgical guide or a separate screen. The three-dimensional objects that we used and visualized in augmented reality came from reconstructions made from the CT scans of the patients. These objects were transferred through a dedicated application to stereoscopic smart glasses. The positioning and stabilization of the virtual layers on the anatomy of the patients were obtained thanks to the recognition, by the glasses, of a tracker placed on the skin. We used this technology, in addition to the usual locating methods, for preoperative planning and the selection of perforating vessels in 12 patients who underwent breast reconstruction with a deep inferior epigastric artery perforator flap. The "hands-free" smart glasses with two stereoscopic screens make it possible to provide the reconstructive surgeon with binocular visualization, in the operative field, of the vessels identified on the CT scan. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Augmented reality-guided artery-first pancreatico-duodenectomy.
Marzano, Ettore; Piardi, Tullio; Soler, Luc; Diana, Michele; Mutter, Didier; Marescaux, Jacques; Pessaux, Patrick
2013-11-01
Augmented Reality (AR) in surgery consists in the fusion of synthetic computer-generated images (3D virtual model) obtained from medical imaging preoperative work-up and real-time patient images with the aim to visualize unapparent anatomical details. The potential of AR navigation as a tool to improve safety of the surgical dissection is presented in a case of pancreatico-duodenectomy (PD). A 77-year-old male patient underwent an AR-assisted PD. The 3D virtual anatomical model was obtained from a thoraco-abdominal CT scan using custom software (VR-RENDER®, IRCAD). The virtual model was superimposed on the operative field using an Exoscope (VITOM®, Karl Storz, Tuttlingen, Germany) as well as different visible landmarks (inferior vena cava, left renal vein, aorta, superior mesenteric vein, inferior margin of the pancreas). A computer scientist manually registered virtual and real images in real time using a video mixer (MX 70; Panasonic, Secaucus, NJ). Dissection of the superior mesenteric artery and the hanging maneuver were performed under AR guidance along the hanging plane. AR allowed for precise and safe recognition of all the important vascular structures. Operative time was 360 min. AR display and fine registration were performed within 6 min. The postoperative course was uneventful. The pathology was positive for ampullary adenocarcinoma; the final stage was pT1N0 (0/43 retrieved lymph nodes) with clear surgical margins. AR is a valuable navigation tool that can enhance the ability to achieve a safe surgical resection during PD.
Full State Feedback Control for Virtual Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Tillay
This report presents an object-oriented implementation of full state feedback control for virtual power plants (VPP). The components of the VPP full state feedback control are (1) object-oriented high-fidelity modeling for all devices in the VPP; (2) Distribution System Distributed Quasi-Dynamic State Estimation (DS-DQSE), which enables full observability of the VPP by augmenting actual measurements with virtual, derived, and pseudo measurements and performing the Quasi-Dynamic State Estimation (QSE) in a distributed manner; and (3) automated formulation of the Optimal Power Flow (OPF) in real time using the output of the DS-DQSE, and solving the distributed OPF to provide the optimal control commands to the DERs of the VPP.
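The control loop described above ultimately reduces, at each step, to computing commands from the estimated full state. A toy discrete-time sketch of full state feedback (u = -Kx) on a hypothetical two-state plant; the matrices and gain below are illustrative and not taken from the report:

```python
def step(A, B, K, x):
    # full state feedback: u = -K x, then x_next = A x + B u
    u = [-sum(K[i][j] * x[j] for j in range(len(x))) for i in range(len(K))]
    return [sum(A[i][j] * x[j] for j in range(len(x)))
            + sum(B[i][j] * u[j] for j in range(len(u)))
            for i in range(len(x))]

# hypothetical discretized two-state plant and stabilizing gain
A = [[1.0, 0.1], [0.0, 1.0]]
B = [[0.0], [0.1]]
K = [[10.0, 6.0]]          # places closed-loop poles at 0.7 +/- 0.1i
x = [1.0, 0.0]             # initial state deviation
for _ in range(200):
    x = step(A, B, K, x)
print(all(abs(v) < 1e-3 for v in x))  # True: deviation regulated to zero
```

In the VPP setting, x would come from the DS-DQSE state estimate rather than direct measurement; the feedback law itself is the same shape.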
Fluet, Gerard G; Deutsch, Judith E
2013-03-01
Developments over the past 2 years in virtual reality (VR) augmented sensorimotor rehabilitation of upper limb use and gait post-stroke were reviewed. Studies were included if they evaluated comparative efficacy between VR and standard of care and/or differences in VR delivery methods, and were CEBM (Centre for Evidence-Based Medicine) level 2 or higher. Eight upper limb and two gait studies were included and described using the following categories: hardware (input and output), software (virtual task, feedback, and presentation), intervention (progression and dose), and outcomes. Trends in the field were commented on, gaps in knowledge identified, and areas of future research and translation of VR to practice were suggested.
Characterization of active hair-bundle motility by a mechanical-load clamp
NASA Astrophysics Data System (ADS)
Salvi, Joshua D.; Maoiléidigh, Dáibhid Ó.; Fabella, Brian A.; Tobin, Mélanie; Hudspeth, A. J.
2015-12-01
Active hair-bundle motility endows hair cells with several traits that augment auditory stimuli. The activity of a hair bundle might be controlled by adjusting its mechanical properties. Indeed, the mechanical properties of bundles vary between different organisms and along the tonotopic axis of a single auditory organ. Motivated by these biological differences and a dynamical model of hair-bundle motility, we explore how adjusting the mass, drag, stiffness, and offset force applied to a bundle controls its dynamics and response to external perturbations. Utilizing a mechanical-load clamp, we systematically mapped the two-dimensional state diagram of a hair bundle. The clamp system used a real-time processor to tightly control each of the virtual mechanical elements. Increasing the stiffness of a hair bundle advances its operating point from a spontaneously oscillating regime into a quiescent regime. As predicted by a dynamical model of hair-bundle mechanics, this boundary constitutes a Hopf bifurcation.
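The Hopf bifurcation identified above can be illustrated with its normal form, in which a single control parameter mu measures the distance from the bifurcation (increasing stiffness corresponds to decreasing mu). A minimal Euler-integration sketch with illustrative parameter values, not the paper's fitted model:

```python
import math

def settled_amplitude(mu, omega=1.0, dt=0.01, steps=20000):
    # Euler integration of the Hopf normal form:
    #   dx/dt = mu*x - omega*y - (x^2 + y^2)*x
    #   dy/dt = omega*x + mu*y - (x^2 + y^2)*y
    # For mu > 0 the system settles on a limit cycle of radius sqrt(mu);
    # for mu < 0 it decays to the quiescent fixed point at the origin.
    x, y = 0.1, 0.0
    for _ in range(steps):
        r2 = x * x + y * y
        dx = mu * x - omega * y - r2 * x
        dy = omega * x + mu * y - r2 * y
        x, y = x + dt * dx, y + dt * dy
    return math.hypot(x, y)

print(abs(settled_amplitude(0.25) - 0.5) < 0.05)  # oscillating: radius ~ sqrt(0.25)
print(settled_amplitude(-0.25) < 1e-3)            # quiescent (e.g. stiffer bundle)
```

Crossing mu = 0 reproduces the oscillating-to-quiescent boundary that the mechanical-load clamp maps experimentally.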
Virtual and augmented reality in the treatment of phantom limb pain: A literature review.
Dunn, Justin; Yeo, Elizabeth; Moghaddampour, Parisah; Chau, Brian; Humbert, Sarah
2017-01-01
Phantom limb pain (PLP), the perception of discomfort in a limb no longer present, commonly occurs following amputation. A variety of interventions have been employed for PLP, including mirror therapy. Virtual Reality (VR) and augmented reality (AR) mirror therapy treatments have also been utilized and have the potential to provide an even greater immersive experience for the amputee. However, there is not currently a consensus on the efficacy of VR and AR therapy. The aim of this review is to evaluate and summarize the current research on the effect of immersive VR and AR in the treatment of PLP. A comprehensive literature search was conducted utilizing PubMed and Google Scholar in order to collect all available studies concerning the use of VR and/or AR in the treatment of PLP using the search terms "virtual reality," "augmented reality," and "phantom limb pain." Eight studies in total were evaluated, with six of those reporting quantitative data and the other two reporting qualitative findings. All studies located were of low-level evidence. Each noted improved pain with VR and AR treatment for phantom limb pain, through quantitative or qualitative reporting. Additionally, adverse effects were limited only to simulator sickness occurring in one trial for one patient. Despite the positive findings, all of the studies were confined purely to case studies and case report series. No studies of higher evidence have been conducted, thus considerably limiting the strength of the findings. As such, the current use of VR and AR for PLP management, while attractive due to the increasing levels of immersion, customizable environments, and decreasing cost, is yet to be fully proven and continues to need further research with higher quality studies to fully explore its benefits.
The effectiveness of virtual and augmented reality in health sciences and medical anatomy.
Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan
2017-11-01
Although cadavers constitute the gold standard for teaching anatomy to medical and health science students, there are substantial financial, ethical, and supervisory constraints on their use. In addition, although anatomy remains one of the fundamental areas of medical education, universities have decreased the hours allocated to teaching gross anatomy in favor of applied clinical work. The release of virtual reality (VR) and augmented reality (AR) devices allows learning to occur through hands-on immersive experiences. The aim of this research was to assess whether learning structural anatomy utilizing VR or AR is as effective as tablet-based (TB) applications, and whether these modes allow enhanced student learning, engagement, and performance. Participants (n = 59) were randomly allocated to one of the three learning modes: VR, AR, or TB, and completed a lesson on skull anatomy, after which they completed an anatomical knowledge assessment. Student perceptions of each learning mode and any adverse effects experienced were recorded. No significant differences were found between mean assessment scores in VR, AR, or TB. During the lessons, however, VR participants were more likely to exhibit adverse effects such as headaches (25% in VR, P < 0.05), dizziness (40% in VR, P < 0.001), or blurred vision (35% in VR, P < 0.01). Both VR and AR are as valuable for teaching anatomy as tablet devices, but also promote intrinsic benefits such as increased learner immersion and engagement. These outcomes show great promise for the effective use of virtual and augmented reality as a means to supplement lesson content in anatomical education. Anat Sci Educ 10: 549-559. © 2017 American Association of Anatomists.
Evaluating the use of augmented reality to support undergraduate student learning in geomorphology
NASA Astrophysics Data System (ADS)
Ockelford, A.; Bullard, J. E.; Burton, E.; Hackney, C. R.
2016-12-01
Augmented Reality (AR) supports the understanding of complex phenomena by providing unique visual and interactive experiences that combine real and virtual information and help communicate abstract problems to learners. With AR, designers can superimpose virtual graphics over real objects, allowing users to interact with digital content through physical manipulation. One of the most significant pedagogic features of AR is that it provides an essentially student-centred and flexible space in which students can learn. By actively engaging participants using a design-thinking approach, this technology has the potential to provide a more productive and engaging learning environment than real or virtual learning environments alone. AR is increasingly being used in support of undergraduate learning and public engagement activities across engineering, medical, and humanities disciplines, but it is not widely used across the geoscience disciplines despite the obvious applicability. This paper presents preliminary results from a multi-institutional project which seeks to evaluate the benefits and challenges of using an augmented reality sandbox to support undergraduate learning in geomorphology. The sandbox enables users to create and visualise topography. As the sand is sculpted, contours are projected onto the miniature landscape. By hovering a hand over the box, users can make it 'rain' over the landscape, and the water 'flows' down into rivers and valleys. At undergraduate level, the sandbox is an ideal focus for problem-solving exercises, for example exploring how geomorphology controls hydrological processes, how such processes can be altered, and the subsequent impacts of the changes for environmental risk. It is particularly valuable for students who favour a visual or kinesthetic learning style. Results presented in this paper discuss how the sandbox provides a complex interactive environment that encourages communication, collaboration, and co-design.
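The sandbox's 'rain flows downhill' behaviour amounts to routing water over the scanned height map. A minimal steepest-descent (D8-style) routing sketch on a hypothetical toy grid, not the sandbox's actual software:

```python
def steepest_descent(heights):
    # For each cell, find the neighbouring cell the 'rain' would flow to:
    # the lowest of its 8 neighbours, or None if the cell is a local minimum
    # (i.e. water ponds there).
    rows, cols = len(heights), len(heights[0])
    flow = {}
    for r in range(rows):
        for c in range(cols):
            best, target = heights[r][c], None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                        if heights[nr][nc] < best:
                            best, target = heights[nr][nc], (nr, nc)
            flow[(r, c)] = target
    return flow

# toy 'sand' height map: a valley draining toward the lowest cell
dem = [[3, 1, 3],
       [3, 0, 3]]
f = steepest_descent(dem)
print(f[(0, 0)], f[(1, 1)])  # (1, 1) None
```

Chaining these per-cell directions downslope is what concentrates the projected water into rivers and valleys as the sand is reshaped.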
Towards cybernetic surgery: robotic and augmented reality-assisted liver segmentectomy.
Pessaux, Patrick; Diana, Michele; Soler, Luc; Piardi, Tullio; Mutter, Didier; Marescaux, Jacques
2015-04-01
Augmented reality (AR) in surgery consists of fusing synthetic computer-generated images (a 3D virtual model) obtained from the preoperative medical imaging workup with real-time patient images in order to visualize unapparent anatomical details. The 3D model can also be used for preoperative planning of the procedure. The potential of AR navigation as a tool to improve the safety of surgical dissection is outlined for robotic hepatectomy. Three patients underwent a fully robotic and AR-assisted hepatic segmentectomy. The 3D virtual anatomical model was obtained from a thoracoabdominal CT scan using custom software (VR-RENDER®, IRCAD). The model was then processed using a VR-RENDER® plug-in application, the Virtual Surgical Planning (VSP®, IRCAD), to delineate surgical resection planes, including the elective ligature of vascular structures. Deformations associated with pneumoperitoneum were also simulated. The virtual model was superimposed onto the operative field. A computer scientist manually registered virtual and real images in real time using a video mixer (MX 70; Panasonic, Secaucus, NJ). Two fully robotic AR-assisted segment V segmentectomies and one segment VI segmentectomy were performed. AR allowed precise and safe recognition of all major vascular structures during the procedure. The total time required to obtain AR was 8 min (range 6-10 min). Each registration (alignment of the vascular anatomy) required a few seconds. Hepatic pedicle clamping was never performed. At the end of the procedure, the remnant liver was correctly vascularized. Resection margins were negative in all cases. The postoperative period was uneventful, without perioperative transfusion. AR is a valuable navigation tool which may enhance the ability to achieve safe surgical resection during robotic hepatectomy.
Unsteady Ejector Performance: an Experimental Investigation Using a Pulsejet Driver
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Wilson, Jack; Dougherty, Kevin T.
2002-01-01
An experimental investigation is described in which thrust augmentation and mass entrainment were measured for a variety of simple cylindrical ejectors driven by a gasoline-fueled pulsejet. The ejectors were of varying length, diameter, and inlet radius. Measurements were also taken to determine the effect on performance of the distance between pulsejet exit and ejector inlet. Limited tests were also conducted to determine the effect of driver cross-sectional shape. Optimal values were found for all three ejector parameters with respect to thrust augmentation. This was not the case with mass entrainment, which increased monotonically with ejector diameter. Thus, it was found that thrust augmentation is not necessarily directly related to mass entrainment, as is often supposed for ejectors. Peak thrust augmentation values of 1.8 were obtained. Peak mass entrainment values of 30 times the driver mass flow were also observed. Details of the experimental setup and results are presented. Preliminary analysis of the results indicates that the enhanced performance obtained with an unsteady jet (primary source) over comparably sized ejectors driven with steady jets is due primarily to the structure of the starting vortex-type flow associated with the former.
Immersive Technologies and Language Learning
ERIC Educational Resources Information Center
Blyth, Carl
2018-01-01
This article briefly traces the historical conceptualization of linguistic and cultural immersion through technological applications, from the early days of locally networked computers to the cutting-edge technologies known as virtual reality and augmented reality. Next, the article explores the challenges of immersive technologies for the field…
Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J.; Latorre, José M.; Rodriguez-Jimenez, Roberto
2017-01-01
This perspective paper faces the future of alternative treatments that take advantage of a social and cognitive approach with regards to pharmacological therapy of auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH are the perception of voices in the absence of auditory stimulation and represents a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain computer interfaces (BCI) are technologies that are growing more and more in different medical and psychological applications. Our position is that their combined use in computer-based therapies offers still unforeseen possibilities for the treatment of physical and mental disabilities. This is why, the paper expects that researchers and clinicians undergo a pathway toward human-avatar symbiosis for AVH by taking full advantage of new technologies. This outlook supposes to address challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis. PMID:29209193
[Registration technology for mandibular angle osteotomy based on augmented reality].
Zhu, Ming; Chai, Gang; Zhang, Yan; Ma, Xiao-Fei; Yu, Zhe-Yuan; Zhu, Yi-Jia
2010-12-01
To establish an effective path to register the operative plan to the real model of the mandible made by rapid prototyping (RP) technology. Computed tomography (CT) was performed on 20 patients to create 3D images, and computer-aided operation planning information was merged with the 3D images. A dental cast was then used to fix the marker recognized by the software. The dental cast was transformed into 3D data with a laser scanner and a program named Rapidform running on a personal computer, which matched the dental cast and the mandible image to generate the virtual image. Registration was then achieved by a video monitoring system. Using this technology, the virtual image of the mandible and the cutting planes can both overlay the real model of the mandible made by RP. This study found an effective way to perform registration using a dental cast, and this approach may be a powerful option for the registration step of augmented reality. Supported by the Program for Innovation Research Team of Shanghai Municipal Education Commission.
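The registration step described above amounts to rigidly aligning corresponding 3D points on the scanned dental cast with the mandible model. The paper's Rapidform-based workflow is not reproduced here; as a minimal sketch, the standard Kabsch (SVD-based) algorithm estimates such a rigid transform from point correspondences (function and variable names below are illustrative, not the authors' software):

```python
import numpy as np

def kabsch_register(source, target):
    """Estimate the rigid transform (R, t) with target ~ R @ source + t,
    via the Kabsch algorithm: SVD of the centred cross-covariance matrix."""
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    src_c = src - src.mean(axis=0)          # centre both point sets
    tgt_c = tgt - tgt.mean(axis=0)
    H = src_c.T @ tgt_c                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With three or more well-spread fiducial points (such as markers on a dental cast), the recovered (R, t) maps the planning coordinate frame onto the physical model's frame.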
On the use of virtual and augmented reality for upper limb prostheses training and simulation.
Lamounier, Edgard; Lopes, Kenedy; Cardoso, Alexandre; Andrade, Adriano; Soares, Alcimar
2010-01-01
Accidents happen, and unfortunately people may lose parts of their bodies. Studies have shown that in this case most individuals suffer physically and psychologically. For this reason, actions to restore the patient's freedom and mobility are imperative. Traditional solutions require ways to adapt the individual to prosthetic devices. This also applies to patients who have congenital limitations. However, one of the major difficulties faced by those who are fitted with these devices is the great mental effort needed during the first stages of training. As a result, a meaningful number of patients give up the use of these devices very soon. Thus, this article reports on a solution designed by the authors to help patients during the learning phases without actually having to wear the prosthesis. This solution uses Virtual Reality (VR) and Augmented Reality (AR) techniques to mimic the natural counterparts of the prosthesis. It is thus expected that problems such as weight, heat and pain should not contribute to an already hard task.
Augmented Reality versus Virtual Reality for 3D Object Manipulation.
Krichenbauer, Max; Yamamoto, Goshiro; Taketomi, Takafumi; Sandor, Christian; Kato, Hirokazu
2018-02-01
Virtual Reality (VR) Head-Mounted Displays (HMDs) are on the verge of becoming commodity hardware available to the average user and feasible to use as a tool for 3D work. Some HMDs include front-facing cameras, enabling Augmented Reality (AR) functionality. Apart from avoiding collisions with the environment, interaction with virtual objects may also be affected by seeing the real environment. However, whether these effects are positive or negative has not yet been studied extensively. For most tasks it is unknown whether AR has any advantage over VR. In this work we present the results of a user study in which we compared user performance, measured in task completion time, on a 9-degrees-of-freedom object selection and transformation task performed either in AR or VR, both with a 3D input device and a mouse. Our results show faster task completion time in AR over VR. When using a 3D input device, a purely VR environment increased task completion time by 22.5 percent on average compared to AR. Surprisingly, a similar effect occurred when using a mouse: users were about 17.3 percent slower in VR than in AR. Mouse and 3D input device produced similar task completion times in each condition (AR or VR). We further found no differences in reported comfort.
Rohmer, Kai; Buschel, Wolfgang; Dachselt, Raimund; Grosch, Thorsten
2015-12-01
At present, photorealistic augmentation is not yet possible since the computational power of mobile devices is insufficient. Even streaming solutions from stationary PCs cause a latency that affects user interactions considerably. Therefore, we introduce a differential rendering method that allows for a consistent illumination of the inserted virtual objects on mobile devices, avoiding delays. The computation effort is shared between a stationary PC and the mobile devices to make use of the capacities available on both sides. The method is designed such that only a minimum amount of data has to be transferred asynchronously between the participants. This allows for an interactive illumination of virtual objects with a consistent appearance under both temporally and spatially varying real illumination conditions. To describe the complex near-field illumination in an indoor scenario, HDR video cameras are used to capture the illumination from multiple directions. In this way, sources of illumination can be considered that are not directly visible to the mobile device because of occlusions and the limited field of view. While our method focuses on Lambertian materials, we also provide some initial approaches to approximate non-diffuse virtual objects and thereby allow for a wider field of application at nearly the same cost.
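The differential rendering idea at the core of the method above can be sketched compactly: the real camera image is corrected by the difference between a rendering of the scene with and without the virtual objects, so that shadows and colour bleeding caused by the objects carry over onto the real background. This is an illustrative sketch under stated assumptions (linear colour values in [0, 1]; names are ours, not the paper's API):

```python
import numpy as np

def differential_composite(camera, with_virtual, without_virtual, mask):
    """Differential rendering composite.

    camera          : real camera image, H x W x 3, values in [0, 1]
    with_virtual    : synthetic rendering including the virtual objects
    without_virtual : synthetic rendering of the same scene without them
    mask            : H x W boolean, True where a virtual object covers the pixel
    """
    camera = np.asarray(camera, dtype=float)
    with_v = np.asarray(with_virtual, dtype=float)
    delta = with_v - np.asarray(without_virtual, dtype=float)
    # Inside the mask: show the rendered virtual object.
    # Outside: add the illumination change (e.g. cast shadows) to the real image.
    out = np.where(mask[..., None], with_v, camera + delta)
    return np.clip(out, 0.0, 1.0)
```

Because only the difference image depends on the virtual content, the two synthetic renderings can be computed at low resolution or on a remote machine, which fits the paper's split between a stationary PC and the mobile device.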
Realistic Real-Time Outdoor Rendering in Augmented Reality
Kolivand, Hoshang; Sunar, Mohd Shahrizal
2014-01-01
Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, considering the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which are limited to non-real-time rendering. However, the problem remains, especially for outdoor rendering. This paper proposes a technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step is the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows, through their effects on virtual objects in the AR system, is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing a key problem of realistic AR systems. PMID:25268480
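The paper's full sky and shadow model is not reproduced here, but its first phase, generating a sky colour with respect to the sun's position, can be illustrated with a toy model: solar elevation follows from the standard spherical formula, and the zenith colour is blended between twilight and daytime tones. The colour values and the 30-degree saturation threshold below are illustrative assumptions, not the authors' model:

```python
import math

def sun_elevation(latitude_deg, declination_deg, hour_angle_deg):
    """Solar elevation angle in degrees, from the standard formula
    sin(h) = sin(lat) sin(decl) + cos(lat) cos(decl) cos(H)."""
    lat = math.radians(latitude_deg)
    dec = math.radians(declination_deg)
    ha = math.radians(hour_angle_deg)
    sin_h = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
    return math.degrees(math.asin(sin_h))

def sky_colour(elevation_deg,
               day=(0.35, 0.55, 0.95),       # assumed daytime zenith RGB
               twilight=(0.95, 0.55, 0.35)): # assumed twilight RGB
    """Blend twilight and daytime zenith colours by solar elevation,
    saturating once the sun is more than 30 degrees above the horizon."""
    t = max(0.0, min(1.0, elevation_deg / 30.0))
    return tuple(tw + t * (d - tw) for tw, d in zip(twilight, day))
```

Given location, date and time, the declination and hour angle can be derived, so the same lookup drives both the sky dome tint and the sun direction used for shadow casting.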
Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion
Fang, Wei; Zheng, Lianyu; Deng, Huanjun; Zhang, Hongbo
2017-01-01
In mobile augmented/virtual reality (AR/VR), real-time 6-Degree of Freedom (DoF) motion tracking is essential for the registration between virtual scenes and the real world. However, due to the limited computational capacity of mobile terminals today, the latency between consecutive arriving poses would damage the user experience in mobile AR/VR. Thus, a visual-inertial based real-time motion tracking for mobile AR/VR is proposed in this paper. By means of high frequency and passive outputs from the inertial sensor, the real-time performance of arriving poses for mobile AR/VR is achieved. In addition, to alleviate the jitter phenomenon during the visual-inertial fusion, an adaptive filter framework is established to cope with different motion situations automatically, enabling the real-time 6-DoF motion tracking by balancing the jitter and latency. Besides, the robustness of the traditional visual-only based motion tracking is enhanced, giving rise to a better mobile AR/VR performance when motion blur is encountered. Finally, experiments are carried out to demonstrate the proposed method, and the results show that this work is capable of providing a smooth and robust 6-DoF motion tracking for mobile AR/VR in real-time. PMID:28475145
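The adaptive balance between jitter and latency described above can be sketched as a complementary filter whose blending gain adapts to the motion state: when the device is nearly static, the smooth inertial prediction dominates (suppressing jitter), while during fast motion the fresher visual estimate dominates (limiting latency). The gain schedule below is a hypothetical illustration, not the authors' filter:

```python
import numpy as np

def adaptive_fuse(visual_pos, inertial_pos, speed, k_lo=0.05, k_hi=0.6):
    """Fuse a visual pose estimate (accurate but jittery and laggy) with an
    inertially predicted pose (smooth but drifting).

    speed : estimated device speed in arbitrary units; gain k ramps linearly
            from k_lo (static, trust prediction) to k_hi (fast, trust vision).
    Returns (fused_position, gain_used).
    """
    t = min(1.0, max(0.0, speed / 1.0))   # normalise speed into [0, 1]
    k = k_lo + t * (k_hi - k_lo)          # motion-adaptive blending gain
    fused = (1.0 - k) * np.asarray(inertial_pos, float) \
            + k * np.asarray(visual_pos, float)
    return fused, k
```

In a real tracker the same idea is applied per axis to position and orientation, and the inertial term is a high-rate prediction propagated from the last fused pose, which is what delivers low-latency poses between camera frames.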
Ben-Moussa, Maher; Rubo, Marius; Debracque, Coralie; Lange, Wolf-Gero
2017-01-01
The present paper explores the benefits and the capabilities of various emerging state-of-the-art interactive 3D and Internet of Things technologies and investigates how these technologies can be exploited to develop a more effective technology supported exposure therapy solution for social anxiety disorder. "DJINNI" is a conceptual design of an in vivo augmented reality (AR) exposure therapy mobile support system that exploits several capturing technologies and integrates the patient's state and situation by vision-based, audio-based, and physiology-based analysis as well as by indoor/outdoor localization techniques. DJINNI also comprises an innovative virtual reality exposure therapy system that is adaptive and customizable to the demands of the in vivo experience and therapeutic progress. DJINNI follows a gamification approach where rewards and achievements are utilized to motivate the patient to progress in her/his treatment. The current paper reviews the state of the art of technologies needed for such a solution and recommends how these technologies could be integrated in the development of an individually tailored and yet feasible and effective AR/virtual reality-based exposure therapy. Finally, the paper outlines how DJINNI could be part of classical cognitive behavioral treatment and how to validate such a setup.
Using virtual robot-mediated play activities to assess cognitive skills.
Encarnação, Pedro; Alvarez, Liliana; Rios, Adriana; Maya, Catarina; Adams, Kim; Cook, Al
2014-05-01
To evaluate the feasibility of using virtual robot-mediated play activities to assess cognitive skills. Children with and without disabilities utilized both a physical robot and a matching virtual robot to perform the same play activities. The activities were designed such that successfully performing them is an indication of understanding of the underlying cognitive skills. Participants' performance with both robots was similar when evaluated by the success rates in each of the activities. Session video analysis encompassing participants' behavioral, interaction and communication aspects revealed differences in sustained attention, visuospatial and temporal perception, and self-regulation, favoring the virtual robot. The study shows that virtual robots are a viable alternative to the use of physical robots for assessing children's cognitive skills, with the potential of overcoming limitations of physical robots such as cost, reliability and the need for on-site technical support. Virtual robots can provide a vehicle for children to demonstrate cognitive understanding. Virtual and physical robots can be used as augmentative manipulation tools allowing children with disabilities to actively participate in play, educational and therapeutic activities. Virtual robots have the potential of overcoming limitations of physical robots such as cost, reliability and the need for on-site technical support.
NASA Astrophysics Data System (ADS)
Canciani, M.; Conigliaro, E.; Del Grasso, M.; Papalini, P.; Saccone, M.
2016-06-01
The development of close-range photogrammetry has opened many new possibilities for studying cultural heritage. 3D data acquired with conventional, low-cost cameras can be used to document and investigate the full appearance, materials and conservation status of an artefact, to support the restoration process, and to identify intervention priorities. At the same time, a 3D survey collects a large amount of three-dimensional data for researchers to analyze, but offers very few possibilities for 3D output. Augmented reality is one such output: a very low-cost technology with very interesting results. Using simple mobile technology (for iPads and Android tablets) and shareware software ("Augment" in the case presented), it is possible to share and visualize a large number of 3D models on one's own device. The case study presented is part of an architecture graduate thesis carried out in Rome at the Department of Architecture of Roma Tre University. We developed a photogrammetric survey to study the Aurelian Wall at Castra Praetoria in Rome. The survey of 8000 square meters of surface allowed us to identify the stratigraphy and construction phases of a complex portion of the Aurelian Wall, especially around the northern door of the Castra. During this study, the data from the 3D survey (photogrammetric and topographic) were stored and used to create a reverse 3D model, or virtual reconstruction, of the northern door of the Castra. This virtual reconstruction shows the door in the Tiberian period; today it is totally hidden by a curtain wall, but small and significant architectural details reveal its original features. The 3D model of the ancient walls was mapped with the exact types of bricks and mortar, oriented and scaled according to the existing wall for use in augmented reality. Finally, two kinds of applications were developed: one on site, where the virtual reconstruction can be seen superimposed on the existing walls using image recognition, and one off-site, using a poster, to show the results during the thesis defence.
Augmenting the access grid using augmented reality
NASA Astrophysics Data System (ADS)
Li, Ying
2012-01-01
The Access Grid (AG) targets an advanced collaboration environment with which groups of people at multiple remote sites can collaborate over high-performance networks. However, the current AG still employs VIC (the Video Conferencing Tool) to offer only pure video for remote communication, while most AG users expect to collaboratively view and manipulate 3D geometric models of grid services' results within the live videos of an AG session. The Augmented Reality (AR) technique can overcome these deficiencies through its characteristic combination of the virtual and the real, real-time interaction, and 3D registration, so it is worthwhile for the AG to utilize AR to better support the advanced collaboration environment. This paper introduces an effort to augment the AG by adding support for AR capability, encapsulated in the node service infrastructure and named the Augmented Reality Service (ARS). The ARS can merge the 3D geometric models of grid services' results and the real video scene of the AG into one AR environment, and provides the opportunity for distributed AG users to interactively and collaboratively participate in the AR environment with a better experience.
Chan, Teresa; Sennik, Serena; Zaki, Amna; Trotter, Brendon
2015-03-01
Cloud-based applications such as Google Docs, Skype, Dropbox, and SugarSync are revolutionizing the way that we interact with the world. Members of the millennial generation (those born after 1980) are now becoming senior residents and junior attending physicians. We describe a novel technique combining Internet- and cloud-based methods to digitally augment the classic study group used by final-year residents studying for the Royal College of Physicians and Surgeons of Canada examination. This material was developed by residents and improved over the course of 18 months. This is an innovation report about a process for enhanced communication and collaboration as there has been little research to date regarding the augmentation of learner-driven initiatives with virtual resources.
Use of display technologies for augmented reality enhancement
NASA Astrophysics Data System (ADS)
Harding, Kevin
2016-06-01
Augmented reality (AR) is seen as an important tool for the future of user interfaces as well as training applications. An important application area for AR is expected to be the digitization of training and worker instructions used in the Brilliant Factory environment. The transition of work instruction methods from printed pages in a book, or taped to a machine, to virtual simulations is a long step with many challenges along the way. A variety of augmented reality tools are being explored today for industrial applications, ranging from simple programmable projections in the work space to 3D displays and head-mounted gear. This paper reviews where some of these tools stand today and some of the pros and cons being considered for the future worker environment.
Virtual reality and planetary exploration
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1992-01-01
Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.
Applied virtual reality at the Research Triangle Institute
NASA Technical Reports Server (NTRS)
Montoya, R. Jorge
1994-01-01
Virtual Reality (VR) is a way for humans to use computers in visualizing, manipulating and interacting with large geometric databases. This paper describes a VR infrastructure and its application to marketing, modeling, architectural walk-through, and training problems. The VR integration techniques used in these applications are based on a uniform approach which promotes portability and reusability of developed modules. For each problem, a 3D object database is created using data captured by hand or electronically. The objects' realism is enhanced through either procedural or photo textures. The virtual environment is created and populated with the database using software tools which also support interaction with, and immersion in, the environment. These capabilities are augmented by other sensory channels such as voice recognition, 3D sound, and tracking. Four applications are presented: a virtual furniture showroom, virtual reality models of the North Carolina Global TransPark, a walk-through of the Dresden Frauenkirche, and a maintenance training simulator for the National Guard.
Distribution Locational Real-Time Pricing Based Smart Building Control and Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, Jun; Dai, Xiaoxiao; Zhang, Yingchen
This paper proposes a real-virtual parallel computing scheme for smart building operation aimed at augmenting overall social welfare. The University of Denver's campus power grid and the Ritchie fitness center are used to demonstrate the proposed approach. An artificial virtual system is built in parallel to the real physical system to evaluate the overall social cost of building operation, based on a social-science-based working productivity model, a numerical-experiment-based building energy consumption model, and a power-system-based real-time pricing mechanism. Through interactive feedback exchanged between the real and virtual systems, enlarged social welfare, including monetary cost reduction and energy saving as well as working productivity improvements, can be achieved.
Internet virtual studio: low-cost augmented reality system for WebTV
NASA Astrophysics Data System (ADS)
Sitnik, Robert; Pasko, Slawomir; Karaszewski, Maciej; Witkowski, Marcin
2008-02-01
In this paper, the concept of an Internet Virtual Studio, a modern system for producing news, entertainment, educational, and training material, is proposed. The system is based on virtual studio technology integrated with a multimedia database, and was developed for web television content production. Successive subsections discuss the general system architecture as well as the architecture of each module in turn. The authors describe each module with brief information about its working principles and technical limitations, tied closely to a presentation of its capabilities. Results produced by each module are shown as example images. Finally, a short example production is presented and discussed.
Fluet, Gerard G.
2013-01-01
Developments over the past two years in virtual reality (VR) augmented sensorimotor rehabilitation of upper limb use and gait post-stroke were reviewed. Studies were included if they evaluated comparative efficacy between VR and standard of care, and/or differences in VR delivery methods, and were CEBM (Centre for Evidence-Based Medicine) level 2 or higher. Eight upper limb and two gait studies were included and described using the following categories: hardware (input and output), software (virtual task, feedback, and presentation), intervention (progression and dose), and outcomes. Trends in the field were commented on, gaps in knowledge identified, and areas of future research and translation of VR to practice suggested. PMID:24579058
Webizing mobile augmented reality content
NASA Astrophysics Data System (ADS)
Ahn, Sangchul; Ko, Heedong; Yoo, Byounghyun
2014-01-01
This paper presents a content structure for building mobile augmented reality (AR) applications in HTML5 to achieve a clean separation of the mobile AR content and the application logic for scaling as on the Web. We propose that the content structure contains the physical world as well as virtual assets for mobile AR applications as document object model (DOM) elements and that their behaviour and user interactions are controlled through DOM events by representing objects and places with a uniform resource identifier. Our content structure enables mobile AR applications to be seamlessly developed as normal HTML documents under the current Web eco-system.
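The abstract's core idea is that both physical places and virtual assets become DOM elements addressed by URIs, with behaviour driven by DOM events. A minimal Python analogue of that content model is sketched below; all class and method names here are illustrative inventions, not the paper's actual API.

```python
# Illustrative sketch (hypothetical names): the paper models physical
# places and virtual AR assets as DOM elements identified by URIs, with
# behaviour controlled through DOM events. This mimics that structure.

class ARElement:
    """A node identified by a URI, analogous to a DOM element for AR content."""
    def __init__(self, uri, kind):
        self.uri = uri          # uniform resource identifier for the object/place
        self.kind = kind        # "place" (physical world) or "asset" (virtual)
        self.listeners = {}     # event name -> handler list, as with DOM events

    def add_event_listener(self, event, handler):
        self.listeners.setdefault(event, []).append(handler)

    def dispatch_event(self, event, payload=None):
        # Invoke every registered handler, collecting their results.
        return [handler(self, payload) for handler in self.listeners.get(event, [])]

# A physical place and a virtual asset share one uniform content structure.
place = ARElement("urn:example:room/101", "place")
label = ARElement("urn:example:asset/label-1", "asset")
label.add_event_listener("click", lambda el, p: f"selected {el.uri}")
print(label.dispatch_event("click")[0])  # -> selected urn:example:asset/label-1
```

In the paper's actual HTML5 setting, the same pattern would use real DOM nodes and `addEventListener`, which is what lets AR content scale like ordinary Web documents.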
LivePhantom: Retrieving Virtual World Light Data to Real Environments.
Kolivand, Hoshang; Billinghurst, Mark; Sunar, Mohd Shahrizal
2016-01-01
To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real time depth detection to exert virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map for the physical scene mixing into a single real-time transparent tacit surface. Once this is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is shown and the findings are assessed drawing upon qualitative and quantitative methods making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.
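The central compositing step described above is that a Kinect-style depth map tells the renderer where real surfaces exist, so virtual shadows darken only those pixels. The sketch below illustrates that one step under stated assumptions: the array names, the 0.5 darkening factor, and the toy data are hypothetical, not taken from the paper.

```python
import numpy as np

# Hedged sketch of casting virtual shadows onto real geometry: pixels are
# darkened only where the virtual shadow mask overlaps valid real depth.
# Kinect sensors report 0 where depth is unknown, so those pixels are skipped.

def composite_shadows(image, depth, shadow_mask, darken=0.5):
    """Darken real-surface pixels covered by the virtual shadow mask."""
    valid = depth > 0                   # real surface known at this pixel
    shadowed = shadow_mask & valid      # shadow only lands on known real surfaces
    out = image.astype(float)
    out[shadowed] *= darken             # illustrative attenuation factor
    return out.astype(image.dtype)

image = np.full((4, 4), 200, dtype=np.uint8)        # toy camera frame
depth = np.ones((4, 4)); depth[0, 0] = 0            # one pixel with missing depth
mask = np.zeros((4, 4), dtype=bool); mask[:2, :2] = True
out = composite_shadows(image, depth, mask)
print(out[1, 1], out[0, 0])  # -> 100 200 (shadowed pixel vs missing-depth pixel)
```

A full system would compute the shadow mask from the light position and the reconstructed surface; this fragment only shows why the depth map is essential to the effect.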
Virtual reality and physical rehabilitation: a new toy or a new research and rehabilitation tool?
Keshner, Emily A
2004-01-01
Virtual reality (VR) technology is rapidly becoming a popular application for physical rehabilitation and motor control research. But questions remain about whether this technology really extends our ability to influence the nervous system or whether moving within a virtual environment just motivates the individual to perform. I served as guest editor of this month's issue of the Journal of NeuroEngineering and Rehabilitation (JNER) for a group of papers on augmented and virtual reality in rehabilitation. These papers demonstrate a variety of approaches taken for applying VR technology to physical rehabilitation. The papers by Kenyon et al. and Sparto et al. address critical questions about how this technology can be applied to physical rehabilitation and research. The papers by Sveistrup and Viau et al. explore whether action within a virtual environment is equivalent to motor performance within the physical environment. Finally, papers by Riva et al. and Weiss et al. discuss the important characteristics of a virtual environment that will be most effective for obtaining changes in the motor system. PMID:15679943
Combined virtual and real robotic test-bed for single operator control of multiple robots
NASA Astrophysics Data System (ADS)
Lee, Sam Y.-S.; Hunt, Shawn; Cao, Alex; Pandya, Abhilash
2010-04-01
Teams of heterogeneous robots with different dynamics or capabilities can perform a variety of tasks, such as multipoint surveillance, cooperative transport, and exploration of hazardous environments. In this study, we work with heterogeneous teams of semi-autonomous ground and aerial robots for contaminant localization. We developed a human interface system that links every real robot to its virtual counterpart. A novel virtual interface, integrated with augmented reality, monitors the position and video-feed sensory information of the ground and aerial robots in the 3D virtual environment and improves user situational awareness. An operator can efficiently control the real robots by applying the Drag-to-Move method to their virtual counterparts, enabling collaborative control of groups of heterogeneous robots so that more contaminant sources can be pursued simultaneously. An advanced feature of the virtual interface is guarded teleoperation, which can prevent operators from accidentally driving multiple robots into walls and other objects. Moreover, the image-guidance and tracking feature reduces operator workload.
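The guarded teleoperation idea mentioned above can be sketched as a simple velocity veto: a command toward a dragged virtual waypoint is suppressed when a range sensor reports an obstacle inside a safety radius. The function name, the 0.5 m threshold, and the proportional gain below are illustrative assumptions; the paper does not publish this interface.

```python
# Hypothetical sketch of guarded teleoperation for Drag-to-Move control:
# drive toward the dragged waypoint unless an obstacle is too close.

def guarded_command(waypoint, position, nearest_obstacle_m, safety_m=0.5, gain=1.0):
    """Return a (vx, vy) velocity command toward the waypoint, or stop if guarded."""
    if nearest_obstacle_m < safety_m:
        return (0.0, 0.0)               # guard trips: refuse to drive into the wall
    dx = waypoint[0] - position[0]
    dy = waypoint[1] - position[1]
    return (gain * dx, gain * dy)       # simple proportional pursuit of the waypoint

print(guarded_command((2.0, 0.0), (0.0, 0.0), nearest_obstacle_m=3.0))  # -> (2.0, 0.0)
print(guarded_command((2.0, 0.0), (0.0, 0.0), nearest_obstacle_m=0.2))  # -> (0.0, 0.0)
```

Running one such guard per robot is what lets a single operator supervise several platforms at once without micromanaging collision avoidance.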
Bibliography on augmentation of convective heat and mass transfer-II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergles, A.E.; Nirmalan, V.; Junkhan, G.H.
1983-12-01
Heat transfer augmentation has developed into a major specialty area in heat transfer research and development. This report presents an updated bibliography of world literature on augmentation. The literature is classified into passive augmentation techniques, which require no external power, and active techniques, which do require external power. The fifteen techniques are grouped in terms of their applications to the various modes of heat transfer. Mass transfer is included for completeness. Key words are included with each citation for technique/mode identification. The total number of publications cited is 3045, including 135 surveys of various techniques and 86 papers on performance evaluation of passive techniques. Patents are not included, as they are the subject of a separate bibliographic report.
Bibliography on augmentation of convective heat and mass transfer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergles, A.E.; Webb, R.L.; Junkhan, G.H.
1979-05-01
Heat transfer augmentation has developed into a major specialty area in heat transfer research and development. A bibliography of world literature on augmentation is presented. The literature is classified into passive augmentation techniques, which require no external power, and active techniques, which do require external power. The fourteen techniques are grouped in terms of their application to the various modes of heat transfer. Mass transfer is included for completeness. Key words are included with each citation for technique/mode identification. The total number of publications cited is 1,967, including 75 surveys of various techniques and 42 papers on performance evaluation of passive techniques. Patents are not included, as they will be the subject of a future topical report.
Virtual reality gaming in the rehabilitation of the upper extremities post-stroke.
Yates, Michael; Kelemen, Arpad; Sik Lanyi, Cecilia
2016-01-01
Strokes often result in unilateral upper limb dysfunction. Dysfunction of this nature frequently persists and can impose chronic limitations on activities of daily living. Research into applying virtual reality gaming systems for rehabilitation therapy has seen a resurgence. Themes explored in stroke rehabilitation for paretic limbs are action observation and imitation; versatility; intensity and repetition; and preservation of gains. Fifteen articles were ultimately selected for review. The purpose of this literature review is to compare the various virtual reality gaming modalities in the current literature and ascertain their efficacy. The literature supports virtual reality gaming rehabilitation therapy as equivalent to traditional therapies or as a successful augmentation of those therapies. While some degree of rigor was displayed in the literature, small sample sizes, variation in study lengths and therapy durations, and unequal controls reduce generalizability and comparability. Future studies should incorporate larger sample sizes and post-intervention follow-up measures.
An intelligent control and virtual display system for evolutionary space station workstation design
NASA Technical Reports Server (NTRS)
Feng, Xin; Niederjohn, Russell J.; Mcgreevy, Michael W.
1992-01-01
Research and development of the Advanced Display and Computer Augmented Control System (ADCACS) for the space station Body-Ported Cupola Virtual Workstation (BP/VCWS) were pursued. The potential applications of body-ported virtual display and intelligent control technology for human-system interfacing in the space station environment were explored. The new system is designed to enable crew members to control and monitor a variety of space operations with greater flexibility and efficiency than existing fixed consoles. The technologies being studied include helmet-mounted virtual displays, voice and special command input devices, and microprocessor-based intelligent controllers. Several research topics, such as human factors, decision support expert systems, and wide-field-of-view color displays, are being addressed. The study showed the significant advantages of this uniquely integrated display and control system, and its feasibility for human-system interfacing in the space station command and control environment.
Designing for Virtual Windows in a Deep Space Habitat
NASA Technical Reports Server (NTRS)
Howe, A. Scott; Howard, Robert L.; Moore, Nathan; Amoroso, Michael
2013-01-01
This paper discusses configurations and test analogs toward the design of a virtual window capability in a Deep Space Habitat. Long-duration space missions will require crews to remain in the confines of a spacecraft for extended periods of time, with possible harmful effects if a crewmember cannot cope with the small habitable volume. Virtual windows expand perceived volume using a minimal amount of image projection equipment and computing resources, and allow a limited immersion in remote environments. Uses for the virtual window include: live or augmented reality views of the external environment; flight deck, piloting, observation, or other participation in remote missions through live transmission of cameras mounted to remote vehicles; pre-recorded background views of nature areas, seasonal occurrences, or cultural events; and pre-recorded events such as birthdays, anniversaries, and other meaningful events prepared by ground support and families of the crewmembers.
A Process Study of the Development of Virtual Research Environments
NASA Astrophysics Data System (ADS)
Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.
2014-05-01
In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs), such as the Virtual Astronomical Observatory (VAO), to enhance the quality and speed of scientific research and to foster global scientific communities. Our study utilizes process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO, and on analysis of project documents and online resources. These sources are hand-tagged to identify events related to the thematic tracks, yielding a narrative of the project. Results demonstrate how the event series of an organization can be reconstructed through traditional methods augmented by virtual sources.
The assessment of virtual reality for human anatomy instruction
NASA Technical Reports Server (NTRS)
Benn, Karen P.
1994-01-01
This research project seeks to meet the objective of science training by developing, assessing, and validating virtual reality as a human anatomy training medium. In ideal situations, anatomic models, computer-based instruction, and cadaver dissection are utilized to augment the traditional methods of instruction. At many institutions, lack of financial resources limits anatomy instruction to textbooks and lectures. However, human anatomy is three-dimensional, unlike the one-dimensional depiction found in textbooks and the two-dimensional depiction found on the computer. Virtual reality is a breakthrough technology that allows one to step through the computer screen into a three-dimensional world. This technology offers many opportunities to enhance science education. Therefore, a virtual testing environment of the abdominopelvic region of a human cadaver was created to study the placement of body parts within the nine anatomical divisions of the abdominopelvic region and the four abdominal quadrants.
Real-time 3D image reconstruction guidance in liver resection surgery
Nicolau, Stephane; Pessaux, Patrick; Mutter, Didier; Marescaux, Jacques
2014-01-01
Background: Minimally invasive surgery represents one of the main evolutions of surgical techniques. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. Methods: From a patient's medical image [US, computed tomography (CT), or MRI], we have developed an Augmented Reality (AR) system that augments the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We then developed several 3D visualization and surgical planning software tools to combine direct volume rendering and surface rendering. Finally, we developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. Results: From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were performed, demonstrating the accuracy of the 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures were performed, illustrating the potential clinical benefit of such assistance for safety, but also the current limits that automatic augmented reality will overcome. Conclusions: Virtual patient modeling should be mandatory for certain interventions that now have to be defined, such as liver surgery. Augmented reality is clearly the next step in surgical instrumentation but currently remains limited due to the complexity of organ deformations during surgery. Intraoperative medical imaging used in a new generation of automated augmented reality should solve this issue thanks to the development of the hybrid OR. PMID:24812598
Ntourakis, Dimitrios; Memeo, Ricardo; Soler, Luc; Marescaux, Jacques; Mutter, Didier; Pessaux, Patrick
2016-02-01
Modern chemotherapy achieves such shrinking of colorectal cancer liver metastases (CRLM) that they may disappear from radiological imaging. Disappearing CRLM rarely represent a complete pathological remission and carry an important risk of recurrence. Augmented reality (AR) consists of the fusion of real-time patient images with a computer-generated 3D virtual patient model created from pre-operative medical imaging. The aim of this prospective pilot study is to investigate the potential of AR navigation as a tool to help locate and surgically resect missing CRLM. A 3D virtual anatomical model was created from thoracoabdominal CT scans using custom software (VR RENDER®, IRCAD). The virtual model was superimposed on the operative field using an exoscope (VITOM®, Karl Storz, Tuttlingen, Germany). Virtual and real images were manually registered in real time using a video mixer, based on external anatomical landmarks, with an estimated accuracy of 5 mm. This modality was tested in three patients undergoing laparotomy after pre-operative oxaliplatin-based chemotherapy, with four missing CRLM ranging in size from 12 to 24 mm. AR display and fine registration were performed within 6 min. AR helped detect all four missing CRLM and guided their resection. In all cases the planned safety margin of 1 cm was clear, and resections were confirmed to be R0 by pathology. There was no major postoperative morbidity or mortality. No local recurrence occurred in the follow-up period of 6-22 months. This initial experience suggests that AR may be a helpful navigation tool for the resection of missing CRLM.
USDA-ARS?s Scientific Manuscript database
Host defense peptides (HDPs) constitute a large group of natural broad-spectrum antimicrobials and an important first line of immunity in virtually all forms of life. Specific augmentation of synthesis of endogenous HDPs may represent a promising antibiotic-alternative approach to disease control. I...
When Worlds Collide: An Augmented Reality Check
ERIC Educational Resources Information Center
Villano, Matt
2008-01-01
The technology is simple: Mobile technologies such as handheld computers and global positioning systems work in sync to create an alternate, hybrid world that mixes virtual characters with the actual physical environment. The result is a digital simulation that offers powerful game-playing opportunities and allows students to become more engaged…
ERIC Educational Resources Information Center
Achuthan, Krishnashree; Francis, Saneesh P.; Diwakar, Shyam
2017-01-01
Learning theories converge on the principles of reflective learning processes and perceive them as fundamental to effective learning. Traditional laboratory education in science and engineering often happens in highly resource-constrained environments that compromise some of the learning objectives. This paper focuses on characterizing three…
Genome size in Anthurium evaluated in the context of karyotypes and phenotypes
USDA-ARS?s Scientific Manuscript database
Anthurium is an important horticultural flower crop from family Araceae in order Alismatales, a monocot lineage considered to have diverged from other monocots prior to the divergence of the cereals lineage. Currently there is a virtual lack of molecular-genetic resources that would greatly augment...
Incorporating Technology in Teaching Musical Instruments
ERIC Educational Resources Information Center
Prodan, Angelica
2017-01-01
After discussing some of the drawbacks of using Skype for long distance music lessons, Angelica Prodan describes three different types of Artificial Reality (Virtual Reality, Augmented Reality and Mixed or Merged Reality). She goes on to describe the beneficial applications of technology, with results otherwise impossible to achieve in areas such…
NASA Astrophysics Data System (ADS)
De Breuck, Carlos
2018-03-01
The APEX telescope has a range of instruments that are highly complementary to ALMA. The single-pixel heterodyne receivers cover virtually all atmospheric windows from 157 GHz to above 1 THz, augmented by 7-pixel heterodyne arrays covering 280 to 950 GHz, while the bolometer arrays cover the 870, 450, and 350 µm bands.
Implementation of augmented reality to models sultan deli
NASA Astrophysics Data System (ADS)
Syahputra, M. F.; Lumbantobing, N. P.; Siregar, B.; Rahmat, R. F.; Andayani, U.
2018-03-01
Augmented reality is a technology that can provide visualization in the form of a 3D virtual model. Using augmented reality technology, image-based modeling can be applied to restore photographs of the Sultan of Deli as a three-dimensional model presented at Istana Maimun (Maimun Palace). This is needed because the Sultan of Deli, one of the important figures in the history of the development of the city of Medan, is little known by the public: the surviving images of the Sultanate of Deli are unclear and very old. To achieve this goal, augmented reality applications are used, with an image-processing pipeline that turns photographs into 3D models through several toolkits. The output of this method is visitors' photos at Maimun Palace together with the 3D model of the Sultan of Deli, with marker detection at distances of 20-60 cm, making it easy for the public to recognize the Sultan of Deli who once ruled at Maimun Palace.
Jiang, Taoran; Zhu, Ming; Zan, Tao; Gu, Bin; Li, Qingfeng
2017-08-01
In perforator flap transplantation, dissection of the perforator is an important but difficult procedure because of the high variability in vascular anatomy. Preoperative imaging techniques can provide substantial information about vascular anatomy; however, they cannot provide direct guidance for surgeons during the operation. In this study, a navigation system (NS) was established to overlay a vascular map on the surgical site and thereby provide direct guidance for perforator flap transplantation. The NS was established based on computed tomographic angiography (CTA) and augmented reality techniques. A virtual vascular map was reconstructed from the CTA data and projected onto real patient images using ARToolKit software. Additionally, a screw-fixation marker holder was created to facilitate registration. Using a tracking and display system, we tested the NS on an animal model and measured the system error on a rapid prototyping model. NS assistance allowed correct identification, as well as safe and precise dissection, of the perforator. The mean system error was 3.474 ± 1.546 mm. An augmented reality-based NS can provide precise navigation information by directly displaying a 3-dimensional, individual anatomical virtual model onto the operative field in real time. It will allow rapid identification and safe dissection of a perforator in free flap transplantation surgery.
Hybrid diffractive-refractive optical system design of head-mounted display for augmented reality
NASA Astrophysics Data System (ADS)
Zhang, Huijuan
2005-02-01
An optical see-through head-mounted display for augmented reality is designed in this paper. Considering factors such as optical performance, the energy utilization ratios for the real and virtual worlds, and user comfort when wearing the device, an optical see-through structure is adopted. Because of its distinctive negative dispersion and its ability to realize arbitrary phase modulation, a diffractive surface helps the optical system reduce weight and simplify its structure, so a diffractive surface is introduced into our optical system. The designed optical system has a 25 mm eye relief, a 12 mm exit pupil, and a 20° (H) × 15.4° (V) field of view. The energy utilization ratios for the real and virtual worlds are 1/4 and 1/2, respectively. The angular resolution of the display is 0.27 mrad, less than the resolution limit of the human eye. The diameter of the system is less than 46 mm, and the design is binocular. This diffractive-refractive optical system for a see-through head-mounted display not only satisfies user requirements in its structure, but also offers high resolution with very small chromatic aberration and distortion, satisfying the needs of augmented reality. Finally, the parameters of the diffractive surface are discussed.
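The claim that 0.27 mrad per display pixel is below the eye's resolution limit can be checked numerically: the human eye resolves roughly 1 arcminute, and converting that to milliradians shows the display sits just under the limit. The 1-arcminute figure is the standard textbook approximation, not a value from the paper.

```python
import math

# Sanity check of the angular-resolution claim: convert the eye's
# conventional ~1 arcminute resolution limit to milliradians and compare
# with the display's 0.27 mrad per pixel.
one_arcmin_mrad = math.radians(1 / 60) * 1000   # 1 arcmin in mrad
print(round(one_arcmin_mrad, 3))  # -> 0.291

display_mrad = 0.27
print(display_mrad < one_arcmin_mrad)  # -> True: pixels are invisible to the eye
```

So at the stated 25 mm eye relief, individual pixels subtend slightly less than the eye can resolve, which is what makes the virtual image appear continuous.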
On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial.
Andress, Sebastian; Johnson, Alex; Unberath, Mathias; Winkler, Alexander Felix; Yu, Kevin; Fotouhi, Javad; Weidert, Simon; Osgood, Greg; Navab, Nassir
2018-04-01
Fluoroscopic x-ray guidance is a cornerstone of percutaneous orthopedic surgical procedures. However, two-dimensional (2-D) observations of the three-dimensional (3-D) anatomy suffer from the effects of projective simplification. Consequently, many x-ray images from various orientations need to be acquired for the surgeon to accurately assess the spatial relations between the patient's anatomy and the surgical tools. We present an on-the-fly surgical support system that provides guidance using augmented reality and can be used in quasi-unprepared operating rooms. The proposed system builds upon a multimodality marker and a simultaneous localization and mapping technique to co-calibrate an optical see-through head-mounted display to a C-arm fluoroscopy system. Annotations on the 2-D x-ray images can then be rendered as virtual objects in 3-D, providing surgical guidance. We quantitatively evaluate the components of the proposed system and, finally, design a feasibility study on a semi-anthropomorphic phantom. The accuracy of our system was comparable to the traditional image-guided technique while substantially reducing the number of acquired x-ray images as well as procedure time. Our promising results encourage further research on the interaction between virtual and real objects that we believe will directly benefit the proposed method. Further, we would like to explore the capabilities of our on-the-fly augmented reality support system in a larger study directed toward common orthopedic interventions.
Some tests on small-scale rectangular throat ejector. [thrust augmentation for V/STOL aircraft
NASA Technical Reports Server (NTRS)
Dean, W. N., Jr.; Franke, M. E.
1979-01-01
A small scale rectangular throat ejector with plane slot nozzles and a fixed throat area was tested to determine the effects of diffuser sidewall length, diffuser area ratio, and sidewall nozzle position on thrust and mass augmentation. The thrust augmentation ratio varied from approximately 0.9 to 1.1. Although the ejector did not have good thrust augmentation performance, the effects of the parameters studied are believed to indicate probable trends in thrust augmenting ejectors.
Adaptive space warping to enhance passive haptics in an arthroscopy surgical simulator.
Spillmann, Jonas; Tuchschmid, Stefan; Harders, Matthias
2013-04-01
Passive haptics, also known as tactile augmentation, denotes the use of a physical counterpart to a virtual environment to provide tactile feedback. Employing passive haptics can result in more realistic touch sensations than active force feedback provides, especially for rigid contacts. However, changes in the virtual environment would necessitate modifications of the physical counterparts. In recent work, space warping has been proposed as one solution to overcome this limitation. In this technique, virtual space is distorted so that a variety of virtual models can be mapped onto one single physical object. In this paper, we propose adaptive space warping as an extension; we show how this technique can be employed in a mixed-reality surgical training simulator to map different virtual patients onto one physical anatomical model. We developed methods to warp different organ geometries onto one physical mock-up, to handle different mechanical behaviors of the virtual patients, and to allow interactive modifications of the virtual structures while the physical counterparts remain unchanged. Various practical examples underline the wide applicability of our approach. To the best of our knowledge this is the first practical usage of such a technique in the specific context of interactive medical training.
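The space-warping idea described above can be illustrated with a deliberately simplified one-dimensional warp: corresponding virtual and physical anchor depths define a piecewise-linear mapping, so different virtual surfaces all coincide with the single physical mock-up at contact. The anchor values and function names below are illustrative assumptions, not the simulator's actual implementation (which warps 3D space).

```python
import numpy as np

# Minimal 1-D sketch of space warping: virtual tool depth is remapped so
# that the virtual organ surface (at 3 cm here) coincides with the physical
# mock-up surface (at 5 cm). Anchor depths are illustrative.

virtual_anchors = np.array([0.0, 3.0, 10.0])    # virtual surface depths (cm)
physical_anchors = np.array([0.0, 5.0, 10.0])   # matching physical mock-up depths

def warp_to_physical(virtual_depth):
    """Piecewise-linear warp from virtual tool depth to physical tool depth."""
    return float(np.interp(virtual_depth, virtual_anchors, physical_anchors))

# Virtual contact at 3 cm lands exactly on the physical surface at 5 cm,
# so the trainee feels the mock-up the moment the virtual tool touches tissue.
print(warp_to_physical(3.0))  # -> 5.0
print(warp_to_physical(1.5))  # -> 2.5
```

Swapping in a different virtual patient only changes the anchor table, which is exactly what lets one physical model serve many virtual anatomies.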
Augmented Reality-Guided Lumbar Facet Joint Injections.
Agten, Christoph A; Dennler, Cyrill; Rosskopf, Andrea B; Jaberg, Laurenz; Pfirrmann, Christian W A; Farshad, Mazda
2018-05-08
The aim of this study was to assess feasibility and accuracy of augmented reality-guided lumbar facet joint injections. A spine phantom completely embedded in hardened opaque agar with 3 ring markers was built. A 3-dimensional model of the phantom was uploaded to an augmented reality headset (Microsoft HoloLens). Two radiologists independently performed 20 augmented reality-guided and 20 computed tomography (CT)-guided facet joint injections each: for each augmented reality-guided injection, the hologram was manually aligned with the phantom container using the ring markers. The radiologists targeted the virtual facet joint and tried to place the needle tip in the holographic joint space. Computed tomography was performed after each needle placement to document final needle tip position. Time needed from grabbing the needle to final needle placement was measured for each simulated injection. An independent radiologist rated images of all needle placements in a randomized order blinded to modality (augmented reality vs CT) and performer as perfect, acceptable, incorrect, or unsafe. Accuracy and time to place needles were compared between augmented reality-guided and CT-guided facet joint injections. In total, 39/40 (97.5%) of augmented reality-guided needle placements were either perfect or acceptable compared with 40/40 (100%) CT-guided needle placements (P = 0.5). One augmented reality-guided injection missed the facet joint space by 2 mm. No unsafe needle placements occurred. Time to final needle placement was substantially faster with augmented reality guidance (mean 14 ± 6 seconds vs 39 ± 15 seconds, P < 0.001 for both readers). Augmented reality-guided facet joint injections are feasible and accurate without potentially harmful needle placement in an experimental setting.
Stability-Augmentation Devices for Miniature Aircraft
NASA Technical Reports Server (NTRS)
Wood, Richard M.
2005-01-01
Non-aerodynamic mechanical devices are under consideration as means to augment the stability of miniature autonomous and remotely controlled aircraft. Such aircraft can be used for diverse purposes, including military reconnaissance, radio communications, and safety-related monitoring of wide areas. The need for stability-augmentation devices arises because adverse meteorological conditions generally affect smaller aircraft more strongly than they affect larger aircraft: Miniature aircraft often become uncontrollable under conditions that would not be considered severe enough to warrant grounding of larger aircraft. The need for the stability-augmentation devices to be non-aerodynamic arises because there is no known way to create controlled aerodynamic forces sufficient to counteract the uncontrollable meteorological forces on miniature aircraft. A stability-augmentation device of the type under consideration includes a mass pod (a counterweight) at the outer end of a telescoping shaft, plus associated equipment to support the operation of the aircraft. The telescoping shaft and mass pod are stowed in the rear of the aircraft. When deployed, they extend below the aircraft. Optionally, an antenna for radio communication can be integrated into the shaft. At the time of writing this article, the deployment of the telescoping shaft and mass pod was characterized as passive and automatic, but information about the deployment mechanism(s) was not available. The feasibility of this stability-augmentation concept was demonstrated in flights of hand-launched prototype aircraft.
Feasibility of Virtual Reality Augmented Cycling for Health Promotion of People Post-Stroke
Deutsch, Judith E; Myslinski, Mary Jane; Kafri, Michal; Ranky, Richard; Sivak, Mark; Mavroidis, Constantinos; Lewis, Jeffrey A
2013-01-01
Background and Purpose A virtual reality (VR) augmented cycling kit (VRACK) was developed to address motor control and fitness deficits of individuals with chronic stroke. In this paper we report on the safety, feasibility and efficacy of using the VRACK to train cardio-respiratory (CR) fitness of individuals in the chronic phase poststroke. Methods Four individuals with chronic stroke (47–65 years old and three or more years post-stroke), with residual lower extremity impairments (Fugl-Meyer 24–26/34) who were limited community ambulators (gait speed range 0.56 to 1.1 m/s) participated in this study. Safety was defined as the absence of adverse events. Feasibility was measured using attendance, total exercise time, and “involvement” measured with the Presence Questionnaire (PQ). Efficacy of CR fitness was evaluated using a sub-maximal bicycle ergometer test before and after an 8-week training program. Results The intervention was safe and feasible with participants having 1 adverse event, 100% adherence, achieving between 90 and 125 minutes of cycling each week and a mean PQ score of 39 (SD 3.3). There was a statistically significant 13% (p = 0.035) improvement in peak VO2 with a range of 6–24.5 %. Discussion and Conclusion For these individuals post-stroke, VR augmented cycling, using their heart rate to set their avatar’s speed, fostered training of sufficient duration and intensity to promote CR fitness. In addition, there was a transfer of training from the bicycle to walking endurance. VR augmented cycling may be an addition to the therapist’s tools for concurrent training of mobility and health promotion of individuals post-stroke. Video Abstract available (see Video, Supplemental Digital Content 1) for more insights from the authors. PMID:23863828
A preliminary look at control augmented dynamic response of structures
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Jewell, R. E.
1983-01-01
The augmentation of structural characteristics (mass, damping, and stiffness) through the use of control theory, in lieu of structural redesign or augmentation, is reported. A treatment of the standard single-degree-of-freedom system is followed by a treatment of the same system using control augmentation. The approach is then extended to elastic structures using single- and multi-sensor approaches, and the report concludes with a brief discussion of potential applications to large orbiting space structures.
Augmented Reality Based Doppler Lidar Data Visualization: Promises and Challenges
NASA Astrophysics Data System (ADS)
Cherukuru, N. W.; Calhoun, R.
2016-06-01
Augmented reality (AR) is a technology that enables the user to view virtual content as if it existed in the real world. We are exploring the possibility of using this technology to view radial velocities or processed wind vectors from a Doppler wind lidar, thus giving the user the ability to see the wind in a literal sense. This approach could find applications in aviation safety and atmospheric data visualization, as well as in weather education and public outreach. As a proof of concept, we used lidar data from a recent field campaign and developed a smartphone application to view the lidar scan in augmented reality. In this paper, we give a brief methodology of this feasibility study and present the challenges and promises of using AR technology in conjunction with Doppler wind lidars.
Focus, locus, and sensus: the three dimensions of virtual experience.
Waterworth, E L; Waterworth, J A
2001-04-01
A model of virtual/physical experience is presented, which provides a three-dimensional conceptual space for virtual and augmented reality (VR and AR) comprising the dimensions of focus, locus, and sensus. Focus is most closely related to what is generally termed presence in the VR literature. When in a virtual environment, presence is typically shared between the VR and the physical world. "Breaks in presence" are actually shifts of presence away from the VR and toward the external environment. But we can also have "breaks in presence" when attention moves toward absence--when an observer is not attending to stimuli present in the virtual environment, nor to stimuli present in the surrounding physical environment--when the observer is present in neither the virtual nor the physical world. We thus have two dimensions of presence: focus of attention (between presence and absence) and the locus of attention (the virtual vs. the physical world). A third dimension is the sensus of attention--the level of arousal determining whether the observer is highly conscious or relatively unconscious while interacting with the environment. After expanding on each of these three dimensions of experience in relation to VR, we present a couple of educational examples as illustrations, and also relate our model to a suggested spectrum of evaluation methods for virtual environments.
Chen, Xiaojun; Xu, Lu; Wang, Yiping; Wang, Huixiang; Wang, Fang; Zeng, Xiangsen; Wang, Qiugen; Egger, Jan
2015-06-01
The surgical navigation system has experienced tremendous development over the past decades for minimizing the risks and improving the precision of surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In an AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) is developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of surgery. With the use of this system, comprising the calibration of instruments, registration, and the calibration of the HMD, the 3D virtual critical anatomical structures in the head-mounted display are aligned with the actual structures of the patient in the real-world scenario during the intra-operative motion-tracking process. The accuracy verification experiment demonstrated that the mean distance and angular errors were 0.809 ± 0.05 mm and 1.038° ± 0.05°, respectively, which is sufficient to meet clinical requirements. Copyright © 2015 Elsevier Inc. All rights reserved.
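The distance-error figure reported above can be illustrated with a minimal sketch (not the authors' code) of how such an accuracy metric is typically computed from corresponding tracked and ground-truth landmark positions:

```python
import numpy as np

# Illustrative only: accuracy figures like a mean distance error of
# 0.809 mm are typically the mean Euclidean distance between measured
# and reference landmark positions, in the units of the coordinates.
def mean_distance_error(measured, reference):
    """Mean Euclidean distance between corresponding Nx3 point sets."""
    return float(np.mean(np.linalg.norm(measured - reference, axis=1)))
```

For example, two landmarks offset from their references by 1 mm and 2 mm yield a mean error of 1.5 mm.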
Anesthesiology training using 3D imaging and virtual reality
NASA Astrophysics Data System (ADS)
Blezek, Daniel J.; Robb, Richard A.; Camp, Jon J.; Nauss, Lee A.
1996-04-01
Current training for regional nerve block procedures by anesthesiology residents requires expert supervision and the use of cadavers; both of which are relatively expensive commodities in today's cost-conscious medical environment. We are developing methods to augment and eventually replace these training procedures with real-time and realistic computer visualizations and manipulations of the anatomical structures involved in anesthesiology procedures, such as nerve plexus injections (e.g., celiac blocks). The initial work is focused on visualizations: both static images and rotational renderings. From the initial results, a coherent paradigm for virtual patient and scene representation will be developed.
Virtual reality applications for motor rehabilitation after stroke.
Sisto, Sue Ann; Forrest, Gail F; Glendinning, Diana
2002-01-01
Hemiparesis is the primary physical impairment underlying functional disability after stroke. A goal of rehabilitation is to enhance motor skill acquisition, which is a direct result of practice. However, frequency and duration of practice are limited in rehabilitation. Virtual reality (VR) is a computer technology that simulates real-life learning while providing augmented feedback and increased frequency, duration, and intensity of practiced tasks. The rate and extent of relearning of motor tasks could affect the duration, effectiveness, and cost of patient care. The purpose of this article is to review the use of VR training for motor rehabilitation after stroke.
H2OTSTUF: Appropriate Operating Regimes for Magnetohydrodynamic Augmentation
NASA Technical Reports Server (NTRS)
Jones, Jonathan E.; Hawk, Clark W.
1998-01-01
A trade study of magnetohydrodynamic (MHD) augmented propulsion reveals a unique operating regime at lower thrust levels. Substantial mass savings are realized over conventional chemical, solar, and electrical propulsion concepts when MHD augmentation is used to obtain optimal specific impulse (Isp). However, trip times for the most conservative estimates of power plant specific impulse and accelerator efficiency may be prohibitively long. Quasi-one-dimensional calculations show that a solar or nuclear thermal system augmented by MHD can provide competitive performance while utilizing a diverse range of propellants including water, which is available from the Space Shuttle, the Moon, asteroids, and various moons and planets within our solar system. The use of in-situ propellants will reduce the costs of space operations as well as enable human exploration of our Solar System. The following conclusions can be drawn from the results of the mission trade study: (1) There exists a maximum thrust or mass flow rate above which MHD augmentation increases the initial mass in low earth orbit (LEO); (2) Mass savings of over 50% can be realized for unique combinations of solar/MHD systems; (3) Trip times for systems utilizing current power supply technology may be prohibitively long. Theoretical predictions of MHD performance for in-space propulsion systems show that improved efficiencies can reduce trip times to acceptable levels; (4) Long trip times indicative of low-thrust systems can be shortened by an increase in the MHD accelerator efficiency or a decrease in the specific mass of the power supply and power processing unit; and (5) As for all propulsion concepts, missions with larger Δv's benefit more from the increased specific impulse resulting from MHD augmentation. Using a quasi-one-dimensional analysis, the required operating conditions for an MHD accelerator to reach acceptable efficiencies are outlined. This analysis shows that substantial non-equilibrium ionization is desirable.
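Conclusion (5), that larger Δv missions benefit more from the higher specific impulse of MHD augmentation, follows directly from the Tsiolkovsky rocket equation. The sketch below uses illustrative Isp values, not figures from the study:

```python
import math

# Propellant mass fraction from the Tsiolkovsky rocket equation:
# delta_v = Isp * g0 * ln(m0 / m_dry)  =>  m_p/m0 = 1 - exp(-delta_v/(Isp*g0))
def propellant_fraction(delta_v, isp, g0=9.80665):
    """Propellant mass fraction m_p/m_0 for delta_v in m/s and Isp in s."""
    return 1.0 - math.exp(-delta_v / (isp * g0))

# Illustrative 5 km/s mission: chemical-class Isp of 450 s vs a
# hypothetical MHD-augmented 1500 s (assumed values for demonstration).
low_isp_fraction  = propellant_fraction(5000.0, 450.0)   # ~0.68
high_isp_fraction = propellant_fraction(5000.0, 1500.0)  # ~0.29
```

The gap between the two fractions widens as Δv grows, which is why high-Δv missions gain the most from increased specific impulse.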
Deutsch, Judith E
2009-01-01
Improving walking for individuals with musculoskeletal and neuromuscular conditions is an important aspect of rehabilitation. The capabilities of clinicians who address these rehabilitation issues could be augmented with innovations such as virtual reality gaming-based technologies. The chapter provides an overview of virtual reality gaming-based technologies currently being developed and tested to improve the motor and cognitive elements required for ambulation and mobility in different patient populations. Also included is a detailed description of a single VR system, covering the rationale for its development and the iterative refinement of the system based on clinical science. The underlying concepts include neural plasticity, part-task training, whole-task training, task-specific training, principles of exercise and motor learning, sensorimotor integration, and visual-spatial processing.
ERIC Educational Resources Information Center
Brown, Abbie Howard
1999-01-01
Describes and discusses how simulation activities can be used in teacher education to augment the traditional field-experience approach, focusing on artificial intelligence, virtual reality, and intelligent tutoring systems. Includes an overview of simulation as a teaching and learning strategy and specific examples of high-technology simulations…
The Effectiveness of Virtual and Augmented Reality in Health Sciences and Medical Anatomy
ERIC Educational Resources Information Center
Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan
2017-01-01
Although cadavers constitute the gold standard for teaching anatomy to medical and health science students, there are substantial financial, ethical, and supervisory constraints on their use. In addition, although anatomy remains one of the fundamental areas of medical education, universities have decreased the hours allocated to teaching gross…
A Review on Making Things See: Augmented Reality for Futuristic Virtual Educator
ERIC Educational Resources Information Center
Iqbal, Javid; Sidhu, Manjit Singh
2017-01-01
In the past few years many choreographers have focused upon implementation of computer technology to enhance their artistic skills. Computer vision technology presents new methods for learning, instructing, developing, and assessing physical movements as well as provides scope to expand dance resources and rediscover the learning process. This…
ERIC Educational Resources Information Center
Duncan, Mike R.; Birrell, Bob; Williams, Toni
2005-01-01
Virtual Reality (VR) is primarily a visual technology. Elements such as haptics (touch feedback) and sound can augment an experience, but the visual cues are the prime driver of what an audience will experience from a VR presentation. At its inception in 2001 the Centre for Advanced Visualization (CFAV) at Niagara College of Arts and Technology…
The Role of Immersive Media in Online Education
ERIC Educational Resources Information Center
Bronack, Stephen C.
2011-01-01
An increasing number of educators are integrating immersive media into core course offerings. Virtual worlds, serious games, simulations, and augmented reality are enabling students and instructors to connect with content and with one another in novel ways. As a result, many are investigating the new affordances these media provide and the impact…
Best-Practice Model for Technology Enhanced Learning in the Creative Arts
ERIC Educational Resources Information Center
Power, Jess; Kannara, Vidya
2016-01-01
This paper presents a best-practice model for the redesign of virtual learning environments (VLEs) within creative arts to augment blended learning. In considering a blended learning best-practice model, three factors should be considered: the conscious and active human intervention, good learning design and pedagogical input, and the sensitive…
ERIC Educational Resources Information Center
Chang, Hsin-Yi
2017-01-01
Two investigations were conducted in this study. In the first experiment, the effects of two types of interactivity with a computer simulation were compared: experimentation versus observation interactivity. Experimentation interactivity allows students to use simulations to conduct virtual experiments, whereas observation interactivity allows…
Augmented reality in the surgery of cerebral aneurysms: a technical report.
Cabrilo, Ivan; Bijlenga, Philippe; Schaller, Karl
2014-06-01
Augmented reality is the overlay of computer-generated images on real-world structures. It has previously been used for image guidance during surgical procedures, but it has never been used in the surgery of cerebral aneurysms. To report our experience of cerebral aneurysm surgery aided by augmented reality. Twenty-eight patients with 39 unruptured aneurysms were operated on in a prospective manner with augmented reality. Preoperative 3-dimensional image data sets (angio-magnetic resonance imaging, angio-computed tomography, and 3-dimensional digital subtraction angiography) were used to create virtual segmentations of patients' vessels, aneurysms, aneurysm necks, skulls, and heads. These images were injected intraoperatively into the eyepiece of the operating microscope. An example case of an unruptured posterior communicating artery aneurysm clipping is illustrated in a video. The described operating procedure allowed continuous monitoring of the accuracy of patient registration with neuronavigation data and assisted in the performance of tailored surgical approaches and optimal clipping with minimized exposure. Augmented reality may add to the performance of a minimally invasive approach, although further studies are needed to evaluate whether certain groups of aneurysms are more likely to benefit from it. Further technological development is required to improve its user friendliness.
A spatially augmented reality sketching interface for architectural daylighting design.
Sheng, Yu; Yapo, Theodore C; Young, Christopher; Cutler, Barbara
2011-01-01
We present an application of interactive global illumination and spatially augmented reality to architectural daylight modeling that allows designers to explore alternative designs and new technologies for improving the sustainability of their buildings. Images of a model in the real world, captured by a camera above the scene, are processed to construct a virtual 3D model. To achieve interactive rendering rates, we use a hybrid rendering technique, leveraging radiosity to simulate the interreflectance between diffuse patches, and shadow volumes to generate per-pixel direct illumination. The rendered images are then projected on the real model by four calibrated projectors to help users study the daylighting illumination. The virtual heliodon is a physical design environment in which multiple designers, a designer and a client, or a teacher and students can gather to experience animated visualizations of the natural illumination within a proposed design by controlling the time of day, season, and climate. Furthermore, participants may interactively redesign the geometry and materials of the space by manipulating physical design elements and see the updated lighting simulation. © 2011 IEEE. Published by the IEEE Computer Society.
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparatively lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling, and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Designing and researching of the virtual display system based on the prism elements
NASA Astrophysics Data System (ADS)
Vasilev, V. N.; Grimm, V. A.; Romanova, G. E.; Smirnov, S. A.; Bakholdin, A. V.; Grishina, N. Y.
2014-05-01
Problems in the design of virtual display systems for augmented reality placed near the observer's eye (so-called head-worn displays) with light-guide prismatic elements are considered. An augmented reality system is a complex consisting of an image generator (most often a microdisplay with an illumination system, if the display is not self-luminous), an objective that forms the display image practically at infinity, and a combiner that splits the light so that an observer can see the information on the microdisplay and the surrounding environment as the background at the same time. This work deals with a system whose combiner is based on a composite structure of prism elements. Three cases of the prism combiner design are considered, and the results of modeling with optical design software are presented. In the model, the question of the large pupil zone was analyzed, and the discontinuous character (mosaic structure) of the angular field in the transmission of information from the microdisplay to the observer's eye with the prismatic structure is discussed.
Explore and experience: mobile augmented reality for medical training.
Albrecht, Urs-Vito; Noll, Christoph; von Jan, Ute
2013-01-01
In medicine, especially in basic education, it may sometimes be inappropriate to integrate real patients into classes because of ethical concerns. Nevertheless, the quality of medical education may suffer without the use of real cases. This is especially true of medical specialties such as legal medicine: survivors of a crime are already subjected to procedures that constitute a severe emotional burden and may cause additional distress even without the added presence of students. Using augmented reality-based applications may alleviate this ethical dilemma by giving students the opportunity to practice the necessary skills on virtual but nevertheless almost realistic cases. The app "mARble®" presented in this paper follows this approach. The currently available learning module for legal medicine gives users an opportunity to learn about various wound patterns by virtually overlaying them on their own skin, and it is applicable in different learning settings. Preliminary evaluation results covering learning efficiency and the emotional components of the learning process are promising. Content modules for other medical specialties are currently under construction.
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.; Brumana, R.; Oreni, D.; Previtali, M.; Roncoroni, F.
2015-08-01
This paper describes a procedure for the generation of a detailed HBIM which is then turned into a model for mobile apps based on augmented and virtual reality. Starting from laser point clouds, photogrammetric data and additional information, a geometric reconstruction with a high level of detail can be carried out by considering the basic requirements of BIM projects (parametric modelling, object relations, attributes). The work aims at demonstrating that a complex HBIM can be managed in portable devices to extract useful information not only for expert operators, but also towards a wider user community interested in cultural tourism.
Riva, Giuseppe; Baños, Rosa M.; Botella, Cristina; Mantovani, Fabrizia; Gaggioli, Andrea
2016-01-01
During life, many personal changes occur. These include changing house, school, work, and even friends and partners. However, the daily experience shows clearly that, in some situations, subjects are unable to change even if they want to. The recent advances in psychology and neuroscience are now providing a better view of personal change, the change affecting our assumptive world: (a) the focus of personal change is reducing the distance between self and reality (conflict); (b) this reduction is achieved through (1) an intense focus on the particular experience creating the conflict or (2) an internal or external reorganization of this experience; (c) personal change requires a progression through a series of different stages that however happen in discontinuous and non-linear ways; and (d) clinical psychology is often used to facilitate personal change when subjects are unable to move forward. Starting from these premises, the aim of this paper is to review the potential of virtuality for enhancing the processes of personal and clinical change. First, the paper focuses on the two leading virtual technologies – augmented reality (AR) and virtual reality (VR) – exploring their current uses in behavioral health and the outcomes of the 28 available systematic reviews and meta-analyses. Then the paper discusses the added value provided by VR and AR in transforming our external experience by focusing on the high level of personal efficacy and self-reflectiveness generated by their sense of presence and emotional engagement. Finally, it outlines the potential future use of virtuality for transforming our inner experience by structuring, altering, and/or replacing our bodily self-consciousness. The final outcome may be a new generation of transformative experiences that provide knowledge that is epistemically inaccessible to the individual until he or she has that experience, while at the same time transforming the individual’s worldview. PMID:27746747
A new approach to enforce element-wise mass/species balance using the augmented Lagrangian method
NASA Astrophysics Data System (ADS)
Chang, J.; Nakshatrala, K.
2015-12-01
The least-squares finite element method (LSFEM) is one of many ways in which one can discretize and express a set of first-order partial differential equations as a mixed formulation. However, the standard LSFEM is not locally conservative by design. The absence of this physical property can have serious implications in the numerical simulation of subsurface flow and transport. Two commonly employed ways to circumvent this issue are the Lagrange multiplier method, which explicitly satisfies the element-wise divergence constraint by introducing new unknowns, and appending a penalty factor to the continuity constraint, which reduces the violation of the mass balance. However, these methodologies have some well-known drawbacks. Herein, we propose a new approach to improve the local species/mass balance. The approach augments constraints to a least-squares functional through a novel mathematical construction of the local species/mass balance, which differs from the conventional approaches. The resulting constrained optimization problem is solved using the augmented Lagrangian method, which corrects the balance errors in an iterative fashion. The advantages of this methodology are that the problem size is not increased (thus preserving symmetry and positive definiteness) and that one need not provide an accurate guess for the initial penalty to reach a prescribed mass balance tolerance. We derive the least-squares weighting needed to ensure accurate solutions. We also demonstrate the robustness of the weighted LSFEM coupled with the augmented Lagrangian by solving large-scale heterogeneous and variably saturated flow through porous media problems. The performance of the iterative solvers with respect to various user-defined augmented Lagrangian parameters is documented.
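As a hedged illustration of the iterative balance correction described above (not the authors' finite-element formulation), the method of multipliers for a linearly constrained least-squares problem shows how a constraint violation can be driven to tolerance without enlarging the system or requiring an accurate initial penalty:

```python
import numpy as np

# Sketch of the augmented Lagrangian (method of multipliers) for
# minimizing (1/2)||A x - b||^2 subject to a linear "balance" constraint
# C x = d. The subproblem keeps the size of the original system, and the
# multiplier update corrects the balance error iteratively.
def augmented_lagrangian_lsq(A, b, C, d, rho=10.0, tol=1e-10, max_iter=200):
    lam = np.zeros(C.shape[0])           # Lagrange multiplier estimate
    r = np.full(C.shape[0], np.inf)      # constraint (balance) violation
    for _ in range(max_iter):
        # Stationarity of (1/2)||Ax-b||^2 + lam^T(Cx-d) + (rho/2)||Cx-d||^2
        H = A.T @ A + rho * C.T @ C      # same size as the unconstrained problem
        g = A.T @ b + C.T @ (rho * d - lam)
        x = np.linalg.solve(H, g)
        r = C @ x - d
        if np.linalg.norm(r) < tol:
            break
        lam = lam + rho * r              # multiplier update drives r -> 0
    return x, np.linalg.norm(r)
```

The penalty rho here is modest and fixed; the multiplier updates, not an ever-growing penalty, reduce the violation to the requested tolerance, mirroring the advantage claimed in the abstract.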
Mobile viewer system for virtual 3D space using infrared LED point markers and camera
NASA Astrophysics Data System (ADS)
Sakamoto, Kunio; Taneji, Shoto
2006-09-01
The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position detecting system for a see-through 3D viewer. A 3D display system is a useful technology for virtual reality, mixed reality and augmented reality. We have researched spatial imaging and interaction systems, and have previously proposed 3D displays using a slit as a parallax barrier, a lenticular screen, and holographic optical elements (HOEs) for displaying active images [1-4]. The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world when the user watches the screen of a see-through 3D viewer. The goal of our research is to build a display system in which, when users see the real world through the mobile viewer, the system presents virtual 3D images floating in the air that observers can touch and interact with, much as children mold modeling clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by an improved parallax-barrier 3D display. Here the authors discuss the measuring method for the mobile viewer using infrared LED point markers and a camera in the 3D workspace (augmented reality world). The authors present the geometric analysis of the proposed measuring method, which is the simplest method using a single camera rather than a stereo camera, and the results of our viewer system.
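The geometric core of such single-camera marker measurement can be sketched with the standard pinhole projection that maps a marker's 3D position in camera coordinates to pixel coordinates. The intrinsics below (focal length in pixels, principal point) are hypothetical values for illustration, not parameters from the paper:

```python
import numpy as np

# Minimal pinhole-camera sketch: project Nx3 camera-frame marker positions
# (e.g. infrared LED point markers) to 2D pixel coordinates. f is the focal
# length in pixels; (cx, cy) is the principal point. Values are illustrative.
def project(points_cam, f=800.0, cx=320.0, cy=240.0):
    X, Y, Z = points_cam.T               # unpack coordinates; Z is depth
    u = f * X / Z + cx
    v = f * Y / Z + cy
    return np.stack([u, v], axis=1)
```

A marker on the optical axis lands at the principal point; markers off-axis shift in the image in proportion to X/Z and Y/Z, which is the relation a single-camera measuring method inverts (with known marker geometry) to recover the viewer's pose.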
Kong, Seong-Ho; Haouchine, Nazim; Soares, Renato; Klymchenko, Andrey; Andreiuk, Bohdan; Marques, Bruno; Shabat, Galyna; Piechaud, Thierry; Diana, Michele; Cotin, Stéphane; Marescaux, Jacques
2017-07-01
Augmented reality (AR) is the fusion of computer-generated and real-time images. AR can be used in surgery as a navigation tool, by creating a patient-specific virtual model through 3D software manipulation of DICOM imaging (e.g., CT scan). The virtual model can be superimposed on real-time images, enabling transparent visualization of internal anatomy and accurate localization of tumors. However, the 3D model is rigid and does not take into account deformations of inner structures. We present a concept of automated AR registration, while the organs undergo deformation during surgical manipulation, based on finite element modeling (FEM) coupled with optical imaging of fluorescent surface fiducials. Two 10 × 1 mm wires (pseudo-tumors) and six 10 × 0.9 mm fluorescent fiducials were placed in ex vivo porcine kidneys (n = 10). Biomechanical FEM-based models were generated from CT scans. Kidneys were deformed and the shape changes were identified by tracking the fiducials, using a near-infrared optical system. The changes were registered automatically with the virtual model, which was deformed accordingly. Accuracy of the prediction of the pseudo-tumors' location was evaluated with a CT scan in the deformed state (ground truth). In vivo, fluorescent fiducials were inserted under ultrasound guidance into the kidney of one pig, followed by a CT scan. The FEM-based virtual model was superimposed on laparoscopic images by automatic registration of the fiducials. Biomechanical models were successfully generated and accurately superimposed on optical images. The mean measured distance between the tumor estimated by biomechanical propagation and the scanned tumor (ground truth) was 0.84 ± 0.42 mm. All fiducials were successfully placed in the in vivo kidney and were well visualized in near-infrared mode, enabling accurate automatic registration of the virtual model on the laparoscopic images.
Our preliminary experiments showed the potential of a biomechanical model with fluorescent fiducials to propagate the deformation of solid organs' surface to their inner structures including tumors with good accuracy and automatized robust tracking.
E-Learning Application of Tarsier with Virtual Reality using Android Platform
NASA Astrophysics Data System (ADS)
Oroh, H. N.; Munir, R.; Paseru, D.
2017-01-01
The Spectral Tarsier is a primitive primate found only in the province of North Sulawesi. An existing e-learning application lets users study these primates with Augmented Reality technology, in which a marker held before a computer's camera lets the user interact with a three-dimensional Tarsier object. However, that application shows the tarsier object in three dimensions without its habitat, and it requires substantial resources because it runs on a Personal Computer. Virtual Reality can also display three-dimensional objects, with the added benefit of immersing the user in a virtual world, and it can run on the Android platform with fewer resources. We therefore apply Virtual Reality technology on the Android platform so that users can view and interact not only with the tarsiers but also with their habitat. The results of this research indicate that users can learn about the Tarsier and its habitat well. Thus, the use of Virtual Reality technology in the e-learning application of tarsiers can help people to see, know, and learn about the Spectral Tarsier.
Stereoscopic augmented reality with pseudo-realistic global illumination effects
NASA Astrophysics Data System (ADS)
de Sorbier, Francois; Saito, Hideo
2014-03-01
Recently, augmented reality has become very popular and has appeared in our daily life through gaming, guidance systems and mobile phone applications. However, inserting objects so that their appearance seems natural is still an issue, especially in an unknown environment. This paper presents a framework that demonstrates the capabilities of the Kinect for convincing augmented reality in an unknown environment. Rather than pre-computing a reconstruction of the scene, as most previous methods propose, we propose a dynamic capture of the scene that adapts to live changes of the environment. Our approach, based on the update of an environment map, can also detect the positions of the light sources. Combining information from the environment map, the light sources and the camera tracking, we can display virtual objects on stereoscopic devices with global illumination effects such as diffuse and mirror reflections, refractions and shadows in real time.
Augmented reality social story for autism spectrum disorder
NASA Astrophysics Data System (ADS)
Syahputra, M. F.; Arisandi, D.; Lumbanbatu, A. F.; Kemit, L. F.; Nababan, E. B.; Sheta, O.
2018-03-01
Augmented Reality is a technique that can bring social story therapy into the virtual world to increase the intrinsic motivation of children with Autism Spectrum Disorder (ASD). Children with ASD have difficulty maintaining focus, and deficits in sensory and motor function make it hard for them to use their hands or other body parts for appropriate activities, and to interpret and understand social situations well enough to respond appropriately. A method is therefore required to deliver social stories in therapy for children with ASD, which we implement with Augmented Reality. The output of this method is a 3D (three-dimensional) animation of a social story, triggered by detecting markers located in a special book, together with some simple games that use a Leap Motion controller to read hand movements in real time.
ChemPreview: an augmented reality-based molecular interface.
Zheng, Min; Waller, Mark P
2017-05-01
Human-computer interfaces make computational science more comprehensible and impactful. Complex 3D structures such as proteins or DNA are magnified by digital representations and displayed on two-dimensional monitors. Augmented reality has recently opened another door to access the virtual three-dimensional world. Herein, we present an augmented reality application called ChemPreview with the potential to manipulate bio-molecular structures at an atomistic level. ChemPreview is available at https://github.com/wallerlab/chem-preview/releases, and is built on top of the Meta 1 platform https://www.metavision.com/. ChemPreview can be used to interact with a protein in an intuitive way using natural hand gestures, thereby making it appealing to computational chemists or structural biologists. The ability to manipulate atoms in the real world could eventually provide new and more efficient ways of extracting structural knowledge, or designing new molecules in silico. Copyright © 2017 Elsevier Inc. All rights reserved.
Culbertson, Heather; Kuchenbecker, Katherine J
2017-01-01
Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.
Helios: a tangible and augmented environment to learn optical phenomena in astronomy
NASA Astrophysics Data System (ADS)
Fleck, Stéphanie; Hachet, Martin
2015-10-01
France is among the few countries that have integrated astronomy into primary school curricula. However, for the past fifteen years, many studies have shown that children have difficulties in understanding elementary astronomical phenomena such as day/night alternation, the seasons, or the phases of the Moon. To understand these phenomena, learners have to mentally construct 3D perceptions of the motions of celestial bodies and to understand how light propagates from an allocentric point of view. Therefore, children in grades 4-5 (8 to 11 years old), who are still developing their spatial cognition, have many difficulties assimilating the geometric optics problems that are linked to astronomy. To make astronomy learning more efficient for young pupils, we have designed an Augmented Inquiry-Based Learning Environment (AIBLE): HELIOS. Because direct manipulations in astronomy are intrinsically impossible, we propose to manipulate the underlying model. With HELIOS, virtual replicas of the Sun, Moon and Earth are controlled directly through tangible manipulation. This digital support combines the possibilities of Augmented Reality (AR) with intuitive interactions following the principles of science didactics. Light properties are taken into account, and the shadows of the Earth and Moon are produced directly by an omnidirectional light source associated with the virtual Sun. This AR environment provides users with experiences they could not otherwise have in the physical world. Our main goal is that students can take active control of their learning, express and support their ideas, make predictions and hypotheses, and test them by conducting investigations.
3D interactive augmented reality-enhanced digital learning systems for mobile devices
NASA Astrophysics Data System (ADS)
Feng, Kai-Ten; Tseng, Po-Hsuan; Chiu, Pei-Shuan; Yang, Jia-Lin; Chiu, Chun-Jie
2013-03-01
With the enhanced processing capability of mobile platforms, augmented reality (AR) has been considered a promising technology for achieving enhanced user experiences (UX). Augmented reality imposes virtual information, e.g., videos and images, onto a live-view digital display. UX of the real-world environment via the display can be effectively enhanced with the adoption of interactive AR technology, and this enhancement can be beneficial for digital learning systems. There are existing research works based on AR targeting the design of e-learning systems. However, none of these works focuses on providing three-dimensional (3-D) object modeling for enhanced UX based on interactive AR techniques. In this paper, 3-D interactive augmented reality-enhanced learning (IARL) systems are proposed to provide enhanced UX for digital learning. The proposed IARL systems consist of two major components: markerless pattern recognition (MPR) for 3-D models and velocity-based object tracking (VOT) algorithms. A realistic implementation of the proposed IARL system is conducted on Android-based mobile platforms. UX in digital learning can be greatly improved with the adoption of the proposed IARL systems.
Visual error augmentation enhances learning in three dimensions.
Sharp, Ian; Huang, Felix; Patton, James
2011-09-02
Because recent preliminary evidence points to the use of error augmentation (EA) for motor learning enhancement, we visually enhanced deviations from a straight-line path while subjects practiced a sensorimotor reversal task, similar to laparoscopic surgery. Our study asked 10 healthy subjects in two groups to perform targeted reaching in a simulated virtual reality environment, where the transformation of the hand position matrix was a complete reversal--rotated 180 degrees about an arbitrary axis (hence 2 of the 3 coordinates are reversed). Our data showed that after 500 practice trials, error-augmentation-trained subjects reached the desired targets more quickly and with lower error (differences of 0.4 seconds and 0.5 cm maximum perpendicular trajectory deviation) when compared to the control group. Furthermore, the manner in which subjects practiced was influenced by the error augmentation, resulting in more continuous motions for this group and smaller errors. Even with the extreme sensory discordance of a reversal, these data further support that distorted reality can promote more complete adaptation/learning when compared to regular training. Lastly, upon removing the flip, all subjects quickly returned to baseline within 6 trials.
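The two manipulations described here, the 180-degree reversal and the visual magnification of path deviation, can be sketched as below; the rotation-by-pi matrix is standard geometry, while the display gain of 2.0 is an illustrative choice, not the study's value.

```python
import numpy as np

# Sketch (not the authors' code) of the two manipulations in the abstract:
# (1) a 180-degree rotation of hand position about an arbitrary axis, which
# reverses exactly two of the three coordinates in the rotation's eigenbasis,
# and (2) visual error augmentation, which magnifies the deviation from the
# straight line between start and target before display.

def reversal_matrix(axis):
    n = np.asarray(axis, float)
    n = n / np.linalg.norm(n)
    return 2.0 * np.outer(n, n) - np.eye(3)   # rotation by pi about `axis`

def augmented_display(hand, start, target, gain=2.0):
    path = target - start
    u = path / np.linalg.norm(path)
    along = start + np.dot(hand - start, u) * u   # projection onto the line
    error = hand - along                          # perpendicular deviation
    return along + gain * error                   # visually enhanced deviation

R = reversal_matrix([0.0, 0.0, 1.0])   # axis = z: the x and y axes reverse
```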
Design of virtual display and testing system for moving mass electromechanical actuator
NASA Astrophysics Data System (ADS)
Gao, Zhigang; Geng, Keda; Zhou, Jun; Li, Peng
2015-12-01
To address the control, measurement, and virtual motion display of a moving mass electromechanical actuator (MMEA), a virtual testing system for the MMEA was developed based on a PC-DAQ architecture and the LabVIEW software platform. It accomplishes comprehensive test tasks such as drive control of the MMEA, measurement of kinematic parameters, measurement of centroid position, and virtual display of movement. The system solves the alignment of acquisition times between multiple measurement channels on different DAQ cards; on this basis, the research focused on dynamic 3D virtual display in LabVIEW, realized both by calling DLLs and by using 3D graph drawing controls. Considering collaboration with the virtual testing system, including the hardware drive and the data acquisition software, the 3D graph drawing controls method was selected, which achieves synchronized measurement, control, and display. The system can measure the dynamic centroid position and kinematic position of the movable mass block while controlling the MMEA, and the 3D virtual display interface renders realistically with smooth motion, solving the problem of display and playback for an MMEA enclosed in a shell.
Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.
Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C
2016-01-01
This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system.
Shema-Shiratzky, Shirley; Brozgol, Marina; Cornejo-Thumm, Pablo; Geva-Dayan, Karen; Rotstein, Michael; Leitner, Yael; Hausdorff, Jeffrey M; Mirelman, Anat
2018-05-17
To examine the feasibility and efficacy of combined motor-cognitive training using virtual reality to enhance behavior, cognitive function and dual-tasking in children with Attention-Deficit/Hyperactivity Disorder (ADHD). Fourteen non-medicated school-aged children with ADHD received 18 training sessions during 6 weeks. Training included walking on a treadmill while negotiating virtual obstacles. Behavioral symptoms, cognition and gait were tested before and after the training and at 6-weeks follow-up. Based on parental report, there was a significant improvement in children's social problems and psychosomatic behavior after the training. Executive function and memory were improved post-training while attention was unchanged. Gait regularity significantly increased during dual-task walking. Long-term training effects were maintained in memory and executive function. Treadmill training augmented with virtual reality is feasible and may be an effective treatment to enhance behavior, cognitive function and dual-tasking in children with ADHD.
Virtual reality and robotics for stroke rehabilitation: where do we go from here?
Wade, Eric; Winstein, Carolee J
2011-01-01
Promoting functional recovery after stroke requires collaborative and innovative approaches to neurorehabilitation research. Task-oriented training (TOT) approaches that include challenging, adaptable, and meaningful activities have led to successful outcomes in several large-scale multisite definitive trials. This, along with recent technological advances of virtual reality and robotics, provides a fertile environment for furthering clinical research in neurorehabilitation. Both virtual reality and robotics make use of multimodal sensory interfaces to affect human behavior. In the therapeutic setting, these systems can be used to quantitatively monitor, manipulate, and augment the users' interaction with their environment, with the goal of promoting functional recovery. This article describes recent advances in virtual reality and robotics and the synergy with best clinical practice. Additionally, we describe the promise shown for automated assessments and in-home activity-based interventions. Finally, we propose a broader approach to ensuring that technology-based assessment and intervention complement evidence-based practice and maintain a patient-centered perspective.
Sarver, Nina Wong; Beidel, Deborah C; Spitalnick, Josh S
2014-01-01
Two significant challenges for the dissemination of social skills training programs are the need to assure generalizability and provide sufficient practice opportunities. In the case of social anxiety disorder, virtual environments may provide one strategy to address these issues. This study evaluated the utility of an interactive virtual school environment for the treatment of social anxiety disorder in preadolescent children. Eleven children with a primary diagnosis of social anxiety disorder, between 8 and 12 years old, participated in this initial feasibility trial. All children were treated with Social Effectiveness Therapy for Children, an empirically supported treatment for children with social anxiety disorder. However, the in vivo peer generalization sessions and standard parent-assisted homework assignments were substituted by practice in a virtual environment. Overall, the virtual environment programs were acceptable, feasible, and credible treatment components. Both children and clinicians were satisfied with using the virtual environment technology, and children believed it was a high-quality program overall. In addition, parents were satisfied with the virtual environment augmented treatment and indicated that they would recommend the program to family and friends. Findings indicate that the virtual environments are viewed as acceptable and credible by potential recipients. Furthermore, they are easy to implement by even novice users and appear to be useful adjunctive elements for the treatment of childhood social anxiety disorder.
Vision-based overlay of a virtual object into real scene for designing room interior
NASA Astrophysics Data System (ADS)
Harasaki, Shunsuke; Saito, Hideo
2001-10-01
In this paper, we introduce a geometric registration method for augmented reality (AR) and an application system, an interior simulator, in which a virtual (CG) object can be overlaid onto a real world space. The interior simulator is developed as an example AR application of the proposed method. Using it, users can visually simulate the placement of virtual furniture and articles in a living room, so that they can easily design the room interior without placing real furniture and articles, viewing it from many different locations and orientations in real time. In our system, two base images of the real world space are captured from two different views to define a projective coordinate system for the 3D object space. Then each projective view of a virtual object in the base images is registered interactively. After this coordinate determination, an image sequence of the real world space is captured by a hand-held camera while tracking non-metric feature points for overlaying the virtual object. Virtual objects can be overlaid onto the image sequence by exploiting the relationships between the images. With the proposed system, 3D position tracking devices, such as magnetic trackers, are not required for the overlay of virtual objects. Experimental results demonstrate that 3D virtual furniture can be overlaid onto an image sequence of a living room scene nearly at video rate (20 frames per second).
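A minimal sketch of the image-to-image transfer idea: tracked feature points relate a base image to the current frame by a homography (estimated here with the standard Direct Linear Transform), which carries the virtual object's anchor points along; all point coordinates below are illustrative, not the paper's data.

```python
import numpy as np

# Homography estimation by the Direct Linear Transform (DLT): each point
# correspondence contributes two rows of the homogeneous system A h = 0,
# and the right singular vector of the smallest singular value solves it.

def homography_dlt(src, dst):
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def transfer(H, pts):
    # apply H in homogeneous coordinates, then dehomogenize
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]

base = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)        # tracked in base image
frame = np.array([[10, 5], [30, 6], [29, 27], [9, 25]], float)  # same points, new frame
H = homography_dlt(base, frame)
anchor = np.array([[0.5, 0.5]])        # virtual furniture anchor in base coordinates
overlay = transfer(H, anchor)          # where to draw it in the current frame
```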
ERIC Educational Resources Information Center
Gilbert, Kristen A.
2017-01-01
Aim/Purpose: Improving public schools is a focus of federal legislation in the United States with much of the burden placed on principals. However, preparing principals for this task has proven elusive despite many changes in programming by institutions of higher learning. Emerging technologies that rely on augmented and virtual realities are…
The Game "Pokemon Go" as a Crosscultural Phenomenon
ERIC Educational Resources Information Center
Koroleva, D. O.; Kochervey, A. I.; Nasonova, K. M.; Shibeko, Yu. V.
2016-01-01
The gaming culture of modern childhood takes place not only in real space, but in a virtual one too. These two components (two planes of development) are often viewed in isolation from one another. This study investigates the completely new phenomenon of augmented reality gaming through the lens of "Pokemon Go," and it describes how the…
Ingress in Geography: Portals to Academic Success?
ERIC Educational Resources Information Center
Davis, Michael
2017-01-01
Niantic Labs has developed an augmented virtual reality mobile app game called Ingress in which agents must seek out and control locations for their designated factions. The app uses the Google Maps interface along with GPS to enhance a geocaching-like experience with elements of other classical games such as capture-the-flag. This study aims to…
Learning Protein Structure with Peers in an AR-Enhanced Learning Environment
ERIC Educational Resources Information Center
Chen, Yu-Chien
2013-01-01
Augmented reality (AR) is an interactive system that allows users to interact with virtual objects and the real world at the same time. The purpose of this dissertation was to explore how AR, as a new visualization tool, that can demonstrate spatial relationships by representing three dimensional objects and animations, facilitates students to…
The development of a virtual camera system for astronaut-rover planetary exploration.
Platt, Donald W; Boy, Guy A
2012-01-01
A virtual assistant is being developed for use by astronauts as they use rovers to explore the surface of other planets. This interactive database, called the Virtual Camera (VC), allows the user to have better situational awareness during exploration. It can be used for training, data analysis and augmentation of actual surface exploration. This paper describes the development efforts and human-computer interaction considerations in implementing a first-generation VC on a tablet mobile computing device. Scenarios for use are presented, along with evaluation and success criteria such as efficiency in terms of processing time and precision of situational awareness, learnability, usability, and robustness. Initial testing and the impact of HCI design considerations on manipulation and improvement in situational awareness using a prototype VC are also discussed.
Virtual reality for health care: a survey.
Moline, J
1997-01-01
This report surveys the state of the art in applications of virtual environments and related technologies for health care. Applications of these technologies are being developed for health care in the following areas: surgical procedures (remote surgery or telepresence, augmented or enhanced surgery, and planning and simulation of procedures before surgery); medical therapy; preventive medicine and patient education; medical education and training; visualization of massive medical databases; skill enhancement and rehabilitation; and architectural design for health-care facilities. To date, such applications have improved the quality of health care, and in the future they will result in substantial cost savings. Tools that respond to the needs of present virtual environment systems are being refined or developed. However, additional large-scale research is necessary in the following areas: user studies, use of robots for telepresence procedures, enhanced system reality, and improved system functionality.
Experimental Characterization of Microfabricated Virtual Impactor Efficiency
The Air-Microfluidics Group is developing a microelectromechanical systems-based direct reading particulate matter (PM) mass sensor. The sensor consists of two main components: a microfabricated virtual impactor (VI) and a PM mass sensor. The VI leverages particle inertia to sepa...
Improved approximations for control augmented structural synthesis
NASA Technical Reports Server (NTRS)
Thomas, H. L.; Schmit, L. A.
1990-01-01
A methodology for control-augmented structural synthesis is presented for structure-control systems which can be modeled as an assemblage of beam, truss, and nonstructural mass elements augmented by a noncollocated direct output feedback control system. Truss areas, beam cross sectional dimensions, nonstructural masses and rotary inertias, and controller position and velocity gains are treated simultaneously as design variables. The structural mass and a control-system performance index can be minimized simultaneously, with design constraints placed on static stresses and displacements, dynamic harmonic displacements and forces, structural frequencies, and closed-loop eigenvalues and damping ratios. Intermediate design-variable and response-quantity concepts are used to generate new approximations for displacements and actuator forces under harmonic dynamic loads and for system complex eigenvalues. This improves the overall efficiency of the procedure by reducing the number of complete analyses required for convergence. Numerical results which illustrate the effectiveness of the method are given.
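As a hedged single-degree-of-freedom illustration of the closed-loop eigenvalue and damping-ratio constraints mentioned above, the sketch below forms the state matrix of a structure under direct position/velocity output feedback; all numbers are illustrative, not from the paper.

```python
import numpy as np

# Hedged sketch: a structural model M x'' + K x = 0 closed by direct output
# feedback with position gains Gp and velocity gains Gv, giving
# M x'' + Gv x' + (K + Gp) x = 0. Design constraints of the kind described
# above would act on the eigenvalues/damping of the first-order state matrix.

def closed_loop_eigs(M, K, Gp, Gv):
    Minv = np.linalg.inv(M)
    n = M.shape[0]
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ (K + Gp), -Minv @ Gv]])
    return np.linalg.eigvals(A)

M = np.eye(1)                  # unit mass (illustrative)
K = np.array([[4.0]])          # stiffness -> open-loop natural freq 2 rad/s
eigs = closed_loop_eigs(M, K, Gp=np.array([[0.0]]), Gv=np.array([[0.4]]))
damping = -eigs.real / np.abs(eigs)    # modal damping ratio of each eigenvalue
```

With zero velocity gain the modes are undamped; the small velocity gain above moves both eigenvalues into the left half-plane with damping ratio 0.1.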
Percutaneous spinal fixation simulation with virtual reality and haptics.
Luciano, Cristian J; Banerjee, P Pat; Sorenson, Jeffery M; Foley, Kevin T; Ansari, Sameer A; Rizzi, Silvio; Germanwala, Anand V; Kranzler, Leonard; Chittiboina, Prashant; Roitberg, Ben Z
2013-01-01
In this study, we evaluated the use of a part-task simulator with 3-dimensional and haptic feedback as a training tool for percutaneous spinal needle placement. The objective was to evaluate learning effectiveness, in terms of entry-point/target-point accuracy, of percutaneous spinal needle placement on a high-performance augmented-reality and haptic technology workstation with the ability to control the duration of computer-simulated fluoroscopic exposure, thereby simulating an actual situation. Sixty-three fellows and residents performed needle placement on the simulator. A virtual needle was percutaneously inserted into a virtual patient's thoracic spine derived from an actual patient computed tomography data set. Ten of 126 needle placement attempts by the 63 participants ended in failure, for a failure rate of 7.93%. Across all 126 needle insertions, the average error (15.69 vs 13.91), average fluoroscopy exposure (4.6 vs 3.92), and average individual performance score (32.39 vs 30.71) improved from the first to the second attempt. Performance accuracy yielded P = .04 from a 2-sample t test in which the rejected null hypothesis assumes no improvement in performance accuracy from the first to the second attempt in the test session. The experiments showed evidence (P = .04) of performance accuracy improvement from the first to the second percutaneous needle placement attempt. This result, combined with previous learning retention and/or face validity results of using the simulator for open thoracic pedicle screw placement and ventriculostomy catheter placement, supports the efficacy of augmented reality and haptics simulation as a learning tool.
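The 2-sample t comparison reported above can be sketched as follows; the attempt scores below are made-up illustrative numbers, not the study's data.

```python
import math

# Pooled two-sample t statistic, as used to compare first-attempt vs
# second-attempt accuracy scores. The score lists are illustrative only.

def pooled_two_sample_t(a, b):
    n1, n2 = len(a), len(b)
    m1, m2 = sum(a) / n1, sum(b) / n2
    v1 = sum((x - m1) ** 2 for x in a) / (n1 - 1)   # sample variances
    v2 = sum((x - m2) ** 2 for x in b) / (n2 - 1)
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)  # pooled variance
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

first_attempt = [15.2, 16.8, 14.9, 17.1, 15.5]   # illustrative error scores
second_attempt = [13.6, 14.2, 13.1, 15.0, 13.8]
t = pooled_two_sample_t(first_attempt, second_attempt)
```

A positive t here indicates lower (better) error scores on the second attempt; the p-value would then be read from the t distribution with n1 + n2 - 2 degrees of freedom.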
Holographic and light-field imaging for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Hong, Jong-Young; Jang, Changwon; Jeong, Jinsoo; Lee, Chang-Kun
2017-02-01
We discuss the recent state of augmented reality (AR) display technology. To realize AR, various see-through three-dimensional (3D) display techniques have been reported. We describe AR displays with 3D functionality, namely light-field displays and holography. See-through light-field displays can be categorized by the optical elements used to achieve the see-through property: optical elements controlling the path of the light fields, and those generating a see-through light field. Holographic displays can also be good candidates for AR displays because they can reconstruct wavefront information and provide realistic virtual information. We introduce see-through holographic displays using various optical techniques.
Can virtual reality be used to conduct mass prophylaxis clinic training? A pilot program.
Yellowlees, Peter; Cook, James N; Marks, Shayna L; Wolfe, Daniel; Mangin, Elanor
2008-03-01
To create and evaluate a pilot bioterrorism defense training environment using virtual reality technology. The present pilot project used Second Life, an internet-based virtual world system, to construct a virtual reality environment to mimic an actual setting that might be used as a Strategic National Stockpile (SNS) distribution site for northern California in the event of a bioterrorist attack. Scripted characters were integrated into the system as mock patients to analyze various clinic workflow scenarios. Users tested the virtual environment over two sessions. Thirteen users who toured the environment were asked to complete an evaluation survey. Respondents reported that the virtual reality system was relevant to their practice and had potential as a method of bioterrorism defense training. Computer simulations of bioterrorism defense training scenarios are feasible with existing personal computer technology. The use of internet-connected virtual environments holds promise for bioterrorism defense training. Recommendations are made for public health agencies regarding the implementation and benefits of using virtual reality for mass prophylaxis clinic training.
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen
2002-02-01
In 2004, the European COLUMBUS module is to be attached to the International Space Station. On the way to the successful planning, deployment, and operation of the module, computer-generated and animated models are being used to optimize performance. Under contract of the German Space Agency DLR, it has become the IRF's task to provide a Projective Virtual Reality system: a virtual world built after the planned layout of the COLUMBUS module that lets astronauts and experimenters practice operational procedures and the handling of experiments. The key features of the system currently being realized comprise the possibility of distributed multi-user access to the virtual lab and the visualization of real-world experiment data. Because the virtual world can be shared, cooperative operations can be practiced easily, and trainers and trainees can work together more effectively in the same virtual environment. The capability to visualize real-world data will be used to introduce measured experiment data into the virtual world online in order to interact realistically with the science reference model hardware: the user's actions in the virtual world are translated into corresponding changes of the inputs of the science reference model hardware; the measured data is then in turn fed back into the virtual world. During the operation of COLUMBUS, the capabilities for distributed access and for visualizing measured data through metaphors and augmentations of the virtual world may be used to provide virtual access to the COLUMBUS module, e.g. via the Internet. Currently, finishing touches are being put on the system. In November 2001 the virtual world shall be operational, so that besides the design and the key ideas, first experimental results can be presented.
Wearable computer for mobile augmented-reality-based controlling of an intelligent robot
NASA Astrophysics Data System (ADS)
Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino
2000-10-01
An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or in conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans is needed. This requires means for controlling the robot from somewhere else, i.e., teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects with the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput, and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.
Speech-Enabled Tools for Augmented Interaction in E-Learning Applications
ERIC Educational Resources Information Center
Selouani, Sid-Ahmed A.; Lê, Tang-Hô; Benahmed, Yacine; O'Shaughnessy, Douglas
2008-01-01
This article presents systems that use speech technology, to emulate the one-on-one interaction a student can get from a virtual instructor. A web-based learning tool, the Learn IN Context (LINC+) system, designed and used in a real mixed-mode learning context for a computer (C++ language) programming course taught at the Université de Moncton…
Augmented Reality, Virtual Reality and Their Effect on Learning Style in the Creative Design Process
ERIC Educational Resources Information Center
Chandrasekera, Tilanka; Yoon, So-Yeon
2018-01-01
Research has shown that user characteristics such as preference for using an interface can result in effective use of the interface. Research has also suggested that there is a relationship between learner preference and creativity. This study uses the VARK learning styles inventory to assess students learning style then explores how this learning…
ERIC Educational Resources Information Center
Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla
2014-01-01
Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…
Camera pose estimation for augmented reality in a small indoor dynamic scene
NASA Astrophysics Data System (ADS)
Frikha, Rawia; Ejbali, Ridha; Zaied, Mourad
2017-09-01
Camera pose estimation remains a challenging task for augmented reality (AR) applications. Simultaneous localization and mapping (SLAM)-based methods are able to estimate the six-degrees-of-freedom camera motion while constructing a map of an unknown environment. However, these methods do not provide any reference for where to insert virtual objects, since they have no information about scene structure, and they may fail in cases of occlusion of three-dimensional (3-D) map points or dynamic objects. This paper presents a real-time monocular piecewise-planar SLAM method using the planar scene assumption. Using planar structures in the mapping process allows rendering virtual objects in a meaningful way on the one hand and, on the other, improving the precision of the camera pose and the quality of the 3-D reconstruction of the environment by adding constraints on 3-D points and poses in the optimization process. We propose to exploit the rigid motion of the 3-D planes in the tracking process to enhance the system's robustness in the case of dynamic scenes. Experimental results show that using a constrained planar scene improves our system's accuracy and robustness compared with classical SLAM systems.
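The rendering step the abstract describes, placing a virtual object on a reconstructed plane, ultimately reduces to projecting a 3-D anchor point through the estimated camera pose with a pinhole model. A minimal sketch follows; the intrinsics and the identity pose are hypothetical values, not taken from the paper:

```python
def mat_vec(M, v):
    # 3x3 matrix times 3-vector
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def project(K, R, t, X):
    """Project a world point X to pixel coordinates for camera pose (R, t)."""
    Xc = [c + ti for c, ti in zip(mat_vec(R, X), t)]  # world -> camera frame
    u = mat_vec(K, Xc)                                # apply intrinsics
    return (u[0] / u[2], u[1] / u[2])                 # perspective divide

# Hypothetical intrinsics (800 px focal length, 320x240 principal point), identity pose
K = [[800, 0, 320], [0, 800, 240], [0, 0, 1]]
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0, 0, 0]

# A virtual anchor 2 m straight ahead projects to the principal point
px = project(K, R, t, [0.0, 0.0, 2.0])  # (320.0, 240.0)
```

In a SLAM pipeline, R and t come from the tracker and X from a point constrained to lie on a reconstructed plane, which is what keeps the rendered object attached to the scene.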
Augmented reality: past, present, future
NASA Astrophysics Data System (ADS)
Inzerillo, Laura
2013-03-01
This research presented an opportunity to carry out a cultural, historical, architectural, and social study of great international interest: the realization of a museum whose main theme is the visit and discovery of a monument of great prestige, the monumental building the "Steri" in Palermo. The museum is divided into sub-themes, including one above all that has aroused such international interest that an application has been submitted to include the museum in the UNESCO cultural heritage list: the realization of a museum path through the cells of the Inquisition, which are located inside some of the buildings of the monumental complex. The project as a whole draws on the various competences involved: history, chemistry, architecture, topography, drawing, representation, virtual communication, and informatics. The resulting museum will be the sum of the results of all these disciplines. This research deals with methodology, implementation, fruition, the virtual museum, goals, 2D graphic restitution, effects on cultural heritage and the landscape environment, augmented reality, 2D and 3D surveying, hi-touch screens, photogrammetric survey, photographic survey, representation, 3D drawing, and more.
Design and evaluation of an augmented reality simulator using leap motion.
Wright, Trinette; de Ribaupierre, Sandrine; Eagleson, Roy
2017-10-01
Advances in virtual and augmented reality (AR) are having an impact on the medical field in areas such as surgical simulation. Improvements to surgical simulation will provide students and residents with additional training and evaluation methods. This is particularly important for procedures such as the endoscopic third ventriculostomy (ETV), which residents perform regularly. Simulators such as NeuroTouch have been designed to aid in training associated with this procedure. The authors have designed an affordable and easily accessible ETV simulator, and compare it with the existing NeuroTouch for its usability and training effectiveness. This simulator was developed using Unity, Vuforia, and the Leap Motion (LM) for an AR environment. The participants, 16 novices and two expert neurosurgeons, were asked to complete 40 targeting tasks. Participants used the NeuroTouch tool or a virtual hand controlled by the LM to select the position and orientation for these tasks. The length of time to complete each task was recorded, and the trajectory log files were used to calculate performance. The resulting data from the novices' and experts' speed and accuracy are compared, and the authors discuss the objective performance of training in terms of the speed and accuracy of targeting for each system.
A second life for eHealth: prospects for the use of 3-D virtual worlds in clinical psychology.
Gorini, Alessandra; Gaggioli, Andrea; Vigna, Cinzia; Riva, Giuseppe
2008-08-05
The aim of the present paper is to describe the role played by three-dimensional (3-D) virtual worlds in eHealth applications, addressing some potential advantages and issues related to the use of this emerging medium in clinical practice. Due to the enormous diffusion of the World Wide Web (WWW), telepsychology, and telehealth in general, have become accepted and validated methods for the treatment of many different health care concerns. The introduction of the Web 2.0 has facilitated the development of new forms of collaborative interaction between multiple users based on 3-D virtual worlds. This paper describes the development and implementation of a form of tailored immersive e-therapy called p-health whose key factor is interreality, that is, the creation of a hybrid augmented experience merging physical and virtual worlds. We suggest that compared with conventional telehealth applications such as emails, chat, and videoconferences, the interaction between real and 3-D virtual worlds may convey greater feelings of presence, facilitate the clinical communication process, positively influence group processes and cohesiveness in group-based therapies, and foster higher levels of interpersonal trust between therapists and patients. However, challenges related to the potentially addictive nature of such virtual worlds and questions related to privacy and personal safety will also be discussed.
Luigi Ingrassia, Pier; Ragazzoni, Luca; Carenzo, Luca; Colombo, Davide; Ripoll Gallardo, Alba; Della Corte, Francesco
2015-04-01
This study tested the hypothesis that virtual reality simulation is equivalent to live simulation for testing naive medical students' abilities to perform mass casualty triage using the Simple Triage and Rapid Treatment (START) algorithm in a simulated disaster scenario and to detect the improvement in these skills after a teaching session. Fifty-six students in their last year of medical school were randomized into two groups (A and B). The same scenario, a car accident, was developed identically on the two simulation methodologies: virtual reality and live simulation. On day 1, group A was exposed to the live scenario and group B was exposed to the virtual reality scenario, aiming to triage 10 victims. On day 2, all students attended a 2-h lecture on mass casualty triage, specifically the START triage method. On day 3, groups A and B were crossed over. The groups' abilities to perform mass casualty triage in terms of triage accuracy, intervention correctness, and speed in the scenarios were assessed. Triage and lifesaving treatment scores were assessed equally by virtual reality and live simulation on day 1 and on day 3. Both simulation methodologies detected an improvement in triage accuracy and treatment correctness from day 1 to day 3 (P<0.001). The decrease in time to complete each scenario from day 1 to day 3 was detected equally by the two methodologies (P<0.05). Virtual reality simulation proved to be a valuable tool, equivalent to live simulation, to test medical students' abilities to perform mass casualty triage and to detect improvement in such skills.
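The START method taught on day 2 is a published decision tree over ambulation, respiration, perfusion, and mental status. The sketch below captures that logic in simplified boolean form; the function signature and the simplifications are ours, not the study's materials:

```python
def start_triage(can_walk, breathing, breathes_after_airway,
                 resp_rate, has_radial_pulse, obeys_commands):
    """Simplified sketch of the START triage decision tree."""
    if can_walk:
        return "minor"                       # green: walking wounded
    if not breathing:
        # reposition the airway; if the victim still does not breathe, expectant
        return "immediate" if breathes_after_airway else "expectant"
    if resp_rate > 30:
        return "immediate"                   # red: tachypnea
    if not has_radial_pulse:
        return "immediate"                   # red: poor perfusion
    if not obeys_commands:
        return "immediate"                   # red: altered mental status
    return "delayed"                         # yellow
```

For example, a non-ambulatory victim breathing at 35 breaths/min is triaged "immediate", while one with normal respiration, a radial pulse, and intact mentation is "delayed".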
Determination of balloon gas mass and revised estimates of drag and virtual mass coefficients
NASA Technical Reports Server (NTRS)
Robbins, E.; Martone, M.
1993-01-01
In support of the NASA Balloon Program, small-scale balloons were flown with varying lifting gas and total system mass. Instrument packages were developed to measure and record acceleration and temperature data during these tests. Top fitting and instrument payload accelerations were measured from launch to steady state ascent and through ballast drop transients. The development of the small lightweight self-powered Stowaway Special instrument packages is discussed along with mathematical models developed to determine gas mass, drag and virtual mass coefficients.
Lin, Yen-Kun; Yau, Hong-Tzong; Wang, I-Chung; Zheng, Cheng; Chung, Kwok-Hung
2015-06-01
The stereoscopic visualization concept combined with head-mounted displays may increase the accuracy of computer-aided implant surgery. The aim of this study was to develop an augmented reality-based dental implant placement system and evaluate the accuracy of the virtually planned versus the actually prepared implant sites created in vitro. Four fully edentulous mandibular and four partially edentulous maxillary duplicated casts were used. Six implants were planned in the mandibular and four in the maxillary casts. A total of 40 osteotomy sites were prepared in the casts using a stereolithographic template integrated with augmented reality-based surgical simulation. During the surgery, the dentist could be guided accurately through a head-mounted display by superimposing the virtual auxiliary line and the drill stop. The deviation between planned and prepared positions of the implants was measured via postoperative computed tomography scan images. Mean and standard deviation of the discrepancy between planned and prepared sites at the entry point, apex, angle, depth, and lateral locations were 0.50 ± 0.33 mm, 0.96 ± 0.36 mm, 2.70 ± 1.55°, 0.33 ± 0.27 mm, and 0.86 ± 0.34 mm, respectively, for the fully edentulous mandible, and 0.46 ± 0.20 mm, 1.23 ± 0.42 mm, 3.33 ± 1.42°, 0.48 ± 0.37 mm, and 1.1 ± 0.39 mm, respectively, for the partially edentulous maxilla. There was a statistically significant difference in the apical deviation between maxilla and mandible in this surgical simulation (p < .05). Deviation of implant placement from the planned position was significantly reduced by integrating a surgical template and augmented reality technology. © 2013 Wiley Periodicals, Inc.
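The reported accuracy measures (entry deviation, apex deviation, and angular deviation) can be derived from the planned and prepared entry/apex coordinates. A sketch under assumed 3-D coordinates; the example values are hypothetical, not the study's measurements:

```python
import math

def deviation_metrics(planned_entry, planned_apex, actual_entry, actual_apex):
    """Entry/apex deviations (input units) and angular deviation (degrees)."""
    planned_axis = [a - e for e, a in zip(planned_entry, planned_apex)]
    actual_axis = [a - e for e, a in zip(actual_entry, actual_apex)]
    dot = sum(p * q for p, q in zip(planned_axis, actual_axis))
    cos_angle = dot / (math.hypot(*planned_axis) * math.hypot(*actual_axis))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))  # clamp for safety
    return (math.dist(planned_entry, actual_entry),
            math.dist(planned_apex, actual_apex),
            angle)

# Hypothetical planned vs. prepared implant sites (mm), apex 10 mm below entry
planned = ((0.0, 0.0, 0.0), (0.0, 0.0, 10.0))
actual = ((0.5, 0.0, 0.0), (1.0, 0.0, 10.0))
entry_dev, apex_dev, angular_dev = deviation_metrics(*planned, *actual)
```

With these values, the prepared site deviates 0.5 mm at entry, 1.0 mm at the apex, and roughly 2.9° in axis direction.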
Registration using natural features for augmented reality systems.
Yuan, M L; Ong, S K; Nee, A Y C
2006-01-01
Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track the natural feature correspondences in the live video. The natural features that have been tracked are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, where the natural features that have been tracked are normalized (translation and scaling) and used as the input data. The estimated projective matrix is used as an initial estimate for a nonlinear optimization method that minimizes the actual residual errors based on the Levenberg-Marquardt (LM) minimization method, thus making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are used for registration in either indoor or outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, and the existence of the corresponding projective matrices in the live video is guaranteed. Meanwhile, the robust method of estimating the projective matrix can obtain stable results even when there are some outliers during the tracking process. 3) Virtual objects can still be superimposed on the specified areas, even if some parts of the areas are occluded during the entire process.
Some indoor and outdoor experiments have been conducted to validate the performance of this proposed method.
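The normalization (translation and scaling) applied to the tracked features before estimating the projective matrix is, as far as the abstract indicates, the standard Hartley-style conditioning: translate the points to their centroid and scale them so their mean distance from the origin is sqrt(2). A minimal sketch of that step:

```python
import math

def normalize_points(pts):
    """Hartley-style conditioning: centroid at origin, mean distance sqrt(2)."""
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    mean_dist = sum(math.hypot(x - cx, y - cy) for x, y in pts) / n
    s = math.sqrt(2) / mean_dist
    # Similarity transform T such that (x', y', 1)^T = T (x, y, 1)^T
    T = [[s, 0.0, -s * cx], [0.0, s, -s * cy], [0.0, 0.0, 1.0]]
    normalized = [((x - cx) * s, (y - cy) * s) for x, y in pts]
    return normalized, T

# Hypothetical tracked feature positions (pixels)
pts = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]
norm, T = normalize_points(pts)
```

The matrix T is kept so that a projective matrix estimated from the conditioned points can be de-normalized afterwards; conditioning of this kind is what makes the subsequent Levenberg-Marquardt refinement numerically stable.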
NASA Astrophysics Data System (ADS)
Cheok, Adrian David
This chapter details the Human Pacman system to illuminate entertainment computing, which ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on the infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human social and mobile gaming that emphasizes collaboration and competition between players across a wide outdoor physical area that allows natural wide-area human physical movement. Pacmen and Ghosts are now real human players in the real world, experiencing a mixed computer-graphics fantasy-reality delivered through the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that is anchored in physicality, mobility, social interaction, and ubiquitous computing.
New tools for sculpting cranial implants in a shared haptic augmented reality environment.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2006-01-01
New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.
Virtual surgery in a (tele-)radiology framework.
Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P
1999-09-01
This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.
Impact of virtual microscopy with conventional microscopy on student learning in dental histology.
Hande, Alka Harish; Lohe, Vidya K; Chaudhary, Minal S; Gawande, Madhuri N; Patil, Swati K; Zade, Prajakta R
2017-01-01
In dental histology, the assimilation of histological features of different dental hard and soft tissues is traditionally done by conventional microscopy. This traditional method of learning prevents students from screening the entire slide and from changing magnification. To address these drawbacks, conventional microscopy has been modified, motivating a change of learning tool. Virtual microscopy is a technique in which the microscopic glass slide is completely digitized and can be analyzed on a computer. This research was designed to evaluate the effectiveness of virtual microscopy versus conventional microscopy on student learning in dental histology. A cohort of 105 students was included and randomized into three groups: A, B, and C. Group A students studied the microscopic features of oral histologic lesions by conventional microscopy, Group B by virtual microscopy, and Group C by both conventional and virtual microscopy. The students' understanding of the subject was evaluated by a prepared questionnaire. The effectiveness of the study designs on knowledge gains and satisfaction levels was assessed by statistical assessment of differences in mean test scores. The difference in scores between Groups A, B, and C at pre- and post-test was highly significant. This enhanced understanding of the subject may be due to the benefits of using virtual microscopy in teaching histology. Augmenting conventional microscopy with virtual microscopy enhances understanding of the subject compared with the use of conventional microscopy or virtual microscopy alone.
Utilization of a virtual patient for advanced assessment of student performance in pain management.
Smith, Michael A; Waite, Laura H
2017-09-01
The objective was to assess student performance and achievement of course objectives following the integration of a virtual patient case designed to promote active, patient-centered learning in a required pharmacy course. DecisionSim™ (Kynectiv, Inc., Chadsford, PA), a dynamic virtual patient platform, was used to implement an interactive patient case to augment pain management material presented during a didactic session in a pharmacotherapy course. Simulation performance data were collected and analyzed. Student exam performance on pain management questions was compared to student exam performance on nearly identical questions from a prior year when a paper-based case was used instead of virtual patient technology. Students who performed well on the virtual patient case performed better on exam questions related to patient assessment (p = 0.0244), primary pharmacological therapy (p = 0.0001), and additional pharmacological therapy (p = 0.0001). Overall exam performance did not differ between the two groups. However, students with exposure to the virtual patient case demonstrated significantly better performance on higher-level Bloom's Taxonomy questions that required them to create pharmacotherapy regimens (p = 0.0005). Students in the previous year (exposed only to a paper patient case) performed better in calculating conversions of opioids for patients (p = 0.0001). Virtual patient technology may enhance student performance on high-level Bloom's Taxonomy examination questions. This study adds to the current literature demonstrating the value of virtual patient technology as an active-learning strategy. Copyright © 2017 Elsevier Inc. All rights reserved.
Augmented reality in neurosurgery: a systematic review.
Meola, Antonio; Cutolo, Fabrizio; Carbone, Marina; Cagnazzo, Federico; Ferrari, Mauro; Ferrari, Vincenzo
2017-10-01
Neuronavigation has become an essential neurosurgical tool in pursuing minimal invasiveness and maximal safety, even though it has several technical limitations. Augmented reality (AR) neuronavigation is a significant advance, providing a real-time updated 3D virtual model of anatomical details, overlaid on the real surgical field. Currently, only a few AR systems have been tested in a clinical setting. The aim of this study is to review such devices. We performed a PubMed search of reports restricted to human studies of in vivo applications of AR in any neurosurgical procedure using the search terms "Augmented reality" and "Neurosurgery." Eligibility assessment was performed independently by two reviewers in an unblinded standardized manner. The systems were qualitatively evaluated on the basis of the following: neurosurgical subspecialty of application, pathology of treated lesions and lesion locations, real data source, virtual data source, tracking modality, registration technique, visualization processing, display type, and perception location. Eighteen studies were included during the period 1996 to September 30, 2015. The AR systems were grouped by the real data source: microscope (8), hand- or head-held cameras (4), direct patient view (2), endoscope (1), X-ray fluoroscopy (1), and head-mounted display (1). A total of 195 lesions were treated: 75 (38.46%) were neoplastic, 77 (39.48%) neurovascular, 1 (0.51%) hydrocephalus, and 42 (21.53%) undetermined. Current literature confirms that AR is a reliable and versatile tool when performing minimally invasive approaches in a wide range of neurosurgical diseases, although prospective randomized studies are not yet available and technical improvements are needed.
Botella, Cristina; Pérez-Ara, M Ángeles; Bretón-López, Juana; Quero, Soledad; García-Palacios, Azucena; Baños, Rosa María
2016-01-01
Although in vivo exposure is the treatment of choice for specific phobias, some acceptability problems have been associated with it. Virtual Reality exposure has been shown to be as effective as in vivo exposure, and it is widely accepted for the treatment of specific phobias, but only preliminary data are available in the literature about the efficacy of Augmented Reality. The purpose of the present study was to examine the efficacy and acceptance of two treatment conditions for specific phobias in which the exposure component was applied in different ways: in vivo exposure (N = 31) versus an Augmented Reality system (N = 32) in a randomized controlled trial. "One-session treatment" guidelines were followed. Participants in the Augmented Reality condition significantly improved on all the outcome measures at post-treatment and follow-ups. When the two treatment conditions were compared, some differences were found at post-treatment, favoring the participants who received in vivo exposure. However, these differences disappeared at the 3- and 6-month follow-ups. Regarding participants' expectations and satisfaction with the treatment, very positive ratings were reported in both conditions. In addition, participants in the in vivo exposure condition considered the treatment more useful for their problem, whereas participants in the Augmented Reality exposure condition considered the treatment less aversive. Results obtained in this study indicate that Augmented Reality exposure is an effective treatment for specific phobias and well accepted by the participants.
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities for distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We address a multiprocessing setting with advanced technologies and distributed applications covering remote ship scenarios and the automation of ship operations.
NASA's Hybrid Reality Lab: One Giant Leap for Full Dive
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2017-01-01
This presentation demonstrates how NASA is using consumer VR headsets, game engine technology and NVIDIA's GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.
Govrin-Yehudain, Orel; Matanis, Yossef; Govrin-Yehudain, Jacky
2018-03-22
The postoperative pain associated with breast augmentation is a top concern of most patients and can affect the decision to undergo surgery. This study aimed to compare the postoperative pain and recovery times of patients undergoing primary breast augmentation with lightweight versus full-mass implants of similar volumes. We hypothesized that the reduced mechanical strain applied by lightweight implants elicits less pain. In this retrospective, observational study, 100 women who had undergone primary breast augmentation with either a lightweight breast implant (LWBI; B-Lite®, G&G Biotechnology Ltd., Haifa, Israel; n=50) or a traditional full-mass silicone implant (n=50) were contacted by phone and asked about their postoperative experiences and overall satisfaction with the outcome. All women were treated by the same surgical team, and the two groups were matched by date of surgery. The majority of patients in the two cohorts had a self-reported preoperative B cup size and relatively high tolerance to pain. On average, LWBI patients were 6 years older than those undergoing full-mass implantation (32.4 ± 8.7 vs. 26.2 ± 8.0; p=0.0004) and more had experienced at least one pregnancy (61.2% vs. 24%, p=0.0002). LWBI patients opted for implants 39 ± 28.4 cc larger than patients in the control group. Subglandular placement was selected in the majority of cases (LWBI: 83.7% and full-mass: 90.0%). Mean postoperative pain was lower in the LWBI cohort (5.5 ± 2.4 vs. 6.5 ± 2.4), and analgesics were required for a shorter duration (3.87 ± 1.77 days vs. 5.26 ± 2.94 days; p=0.009). Age- and parity-adjusted measures demonstrated a respective 2-day and 5-day shorter recovery period and return to normal activities interval in the LWBI versus full-mass implant cohorts (p=0.04 and p=0.002, respectively).
Compared to traditional silicone-filled full-mass implants, breast augmentation with B-Lite lightweight breast implants elicits less postoperative pain and requires less downtime, ultimately meeting patients' desire for the intended breast shape with minimal discomfort.
Duan, Liya; Guan, Tao; Yang, Bo
2009-01-01
Augmented reality (AR) is a field of computer research which deals with the combination of the real world and computer-generated data. Registration is one of the most difficult problems currently limiting the usability of AR systems. In this paper, we propose a novel natural-feature-tracking based registration method for AR applications. The proposed method has the following advantages: (1) It is simple and efficient, as no man-made markers are needed for either indoor or outdoor AR applications; moreover, it can work with arbitrary geometric shapes including planar, near-planar and non-planar structures, which greatly enhances the usability of AR systems. (2) Thanks to the reduced-SIFT based augmented optical flow tracker, the virtual scene can still be augmented on the specified areas even under occlusion and large changes in viewpoint during the entire process. (3) It is easy to use, because the adaptive classification tree based matching strategy gives fast and accurate initialization, even when the initial camera view differs considerably from the reference image. Experimental evaluations validate the performance of the proposed method for online pose tracking and augmentation.
FlyAR: augmented reality supported micro aerial vehicle navigation.
Zollmann, Stefanie; Hoppe, Christof; Langlotz, Tobias; Reitmayr, Gerhard
2014-04-01
Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context, automatic flight path planning and autonomous flying are often applied but so far cannot fully replace the human in the loop, who supervises the flight on-site to ensure that there are no collisions with obstacles. Unfortunately, this workflow yields several issues, such as the need to mentally transfer the aerial vehicle's position between 2D map positions and the physical environment, and the complicated depth perception of objects flying in the distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user's view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints supporting the user in understanding the spatial relationship of virtual waypoints in the physical world and investigate the effect of these visualization techniques on spatial understanding.
Fast Markerless Tracking for Augmented Reality in Planar Environment
NASA Astrophysics Data System (ADS)
Basori, Ahmad Hoirul; Afif, Fadhil Noer; Almazyad, Abdulaziz S.; AbuJabal, Hamza Ali S.; Rehman, Amjad; Alkawaz, Mohammed Hazim
2015-12-01
Markerless tracking for augmented reality should not only be accurate but also fast enough to provide seamless synchronization between real and virtual elements. Reported methods show that vision-based tracking is accurate but requires high computational power. This paper proposes a real-time hybrid method for tracking unknown environments in markerless augmented reality. The proposed method combines a vision-based approach with accelerometer and gyroscope sensors as a camera pose predictor. To align the augmentation relative to camera motion, feature-based camera estimation is substituted with a combination of inertial sensors and a complementary filter, providing a more dynamic response. The proposed method managed to track an unknown environment with faster processing time than available feature-based approaches. Moreover, it can sustain its estimation in situations where feature-based tracking loses track. The sensor-assisted tracking performed at about 22.97 FPS, up to five times faster than the feature-based tracking method used for comparison. Therefore, the proposed method can be used to track unknown environments without depending on the number of features in the scene, while requiring lower computational cost.
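The complementary filter named in the abstract blends fast-but-drifting gyroscope integration with noisy-but-drift-free accelerometer angles. The following is a minimal sketch of that idea, with illustrative bias, rate, and blending values rather than the paper's actual implementation:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate integral (trusted on short timescales) with an
    accelerometer-derived angle (trusted on long timescales). alpha close to
    1 weights the gyro path; (1 - alpha) slowly pulls toward the accelerometer."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy run: stationary device, gyro with a constant 0.5 deg/s bias,
# accelerometer reading the true angle (0 deg). Pure gyro integration
# would drift to 5 deg over 10 s; the filter stays bounded near the
# steady state alpha*bias*dt/(1-alpha) = 0.245 deg.
angle = 0.0
for _ in range(1000):  # 10 s at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=0.0, dt=0.01)
print(round(angle, 3))  # ≈ 0.245
```

The blending factor trades drift suppression against responsiveness: a larger alpha tracks fast camera motion better but corrects gyro drift more slowly.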
NASA Astrophysics Data System (ADS)
Damayanti, Latifah Adelina; Ikhsan, Jaslin
2017-05-01
The integration of information technology into education is increasingly realized through learning media. Three-dimensional (3D) molecular modeling implemented in Augmented Reality is a tangible manifestation of this increasingly modern use of technology. Based on augmented reality, a three-dimensional virtual object is projected in real time onto the actual environment. This paper reviews the use of a chemistry learning supplement book on aldehydes and ketones equipped with three-dimensional molecular modeling, by which students can inspect molecules from various viewpoints. To play the 3D illustrations printed in the book, smartphones running open-source Augmented Reality software can be used. The aims of this research were to develop the monograph of aldehydes and ketones with three-dimensional (3D) illustrations, to determine the specification of the monograph, and to determine its quality. The quality of the monograph was evaluated by experienced chemistry teachers on five aspects: contents/materials, presentation, language and images, graphics, and software engineering. The results show that the book has very good quality for use as a chemistry learning supplement book.
Liu, Miao; Yang, Shourui; Wang, Zhangying; Huang, Shujun; Liu, Yue; Niu, Zhenqi; Zhang, Xiaoxuan; Zhu, Jigui; Zhang, Zonghua
2016-05-30
Augmented reality systems can be applied to provide precise guidance for various kinds of manual work. The adaptability and guiding accuracy of such systems are determined by the computational model and the corresponding calibration method. In this paper, a novel type of augmented reality guiding system and the corresponding design scheme are proposed. Guided by external positioning equipment, the proposed system can achieve high relative indication accuracy in a large working space. Meanwhile, the proposed system is realized with a digital projector, and the general back projection model is derived from the geometric relationship between the digitized 3D model and the projector in free space. A corresponding calibration method is also designed to obtain the parameters of the projector. To validate the proposed back projection model, coordinate data collected by 3D positioning equipment are used to calculate and optimize the extrinsic parameters. The final projected indication accuracy of the system is verified with a subpixel pattern projection technique.
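Back projection models of this kind generally build on the standard pinhole relation p ~ K[R|t]X, treating the projector as an inverse camera: the same intrinsic/extrinsic mapping tells you which pixel illuminates a given 3D point. A minimal sketch (the intrinsic values are made up for illustration, not the paper's calibration):

```python
def project_point(K, R, t, X):
    """Pinhole forward model: map a 3-D world point X to projector pixel
    coordinates. A projector is treated as an inverse camera, so the same
    K[R|t] model gives the pixel that illuminates X."""
    # world -> projector frame: x_cam = R @ X + t
    x_cam = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # perspective division, then apply the intrinsic matrix
    xn, yn = x_cam[0] / x_cam[2], x_cam[1] / x_cam[2]
    u = K[0][0] * xn + K[0][2]
    v = K[1][1] * yn + K[1][2]
    return u, v

# Illustrative numbers: 1000 px focal length, principal point (640, 400),
# projector sitting at the world origin with no rotation.
K = [[1000.0, 0.0, 640.0], [0.0, 1000.0, 400.0], [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 0.0]
print(project_point(K, R, t, [0.1, 0.05, 2.0]))  # (690.0, 425.0)
```

Calibration then amounts to estimating K, R, and t by minimizing the reprojection error of known 3D-2D correspondences, which is what the external positioning equipment supplies in the paper's setup.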
An Augmented Reality Nanomanipulator for Learning Nanophysics: The "NanoLearner" Platform
NASA Astrophysics Data System (ADS)
Marchi, Florence; Marliere, Sylvain; Florens, Jean Loup; Luciani, Annie; Chevrier, Joel
This work focuses on the description and evaluation of an augmented reality nanomanipulator, called the "NanoLearner" platform, used as an educational tool in practical nanophysics classes. Through virtual reality associated with multisensory renderings, students are immersed in the nanoworld, where they can interact in real time with a sample surface or an object using their senses of hearing, sight, and touch. The role of each sensorial rendering in the understanding and control of the "approach-retract" interaction was determined thanks to statistical studies obtained during the practical classes. Finally, we present two extensions of this innovative tool: investigating nano effects in living organisms, and allowing the general public to gain a natural understanding of nanophenomena.
Near-to-eye electroholography via guided-wave acousto-optics for augmented reality
NASA Astrophysics Data System (ADS)
Jolly, Sundeep; Savidis, Nickolaos; Datta, Bianca; Smalley, Daniel; Bove, V. Michael
2017-03-01
Near-to-eye holographic displays directly project wavefronts into a viewer's eye in order to recreate 3-D scenes for augmented or virtual reality applications. Recently, several solutions for near-to-eye electroholography have been proposed based on digital spatial light modulators (SLMs) in conjunction with supporting optics, such as holographic waveguides for light delivery; however, such schemes are limited by the inherently low space-bandwidth product of current digital SLMs. In this paper, we present a fully monolithic, integrated optical platform for transparent near-to-eye holographic display requiring no supporting optics. Our solution employs a guided-wave acousto-optic spatial light modulator implemented in lithium niobate in conjunction with an integrated Bragg-regime reflection volume hologram.
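As background on why low space-bandwidth product is limiting: for a pixelated SLM, the grating equation bounds the steerable diffraction half-angle by sin(θ) = λ/(2p), where p is the pixel pitch. A quick back-of-the-envelope calculation (the pitch values are typical figures, not taken from this paper):

```python
import math

def max_diffraction_half_angle_deg(wavelength_m, pixel_pitch_m):
    """Grating-equation (Nyquist) bound on the half-angle a pixelated SLM
    can steer light: sin(theta) = lambda / (2 * pitch). Smaller effective
    pixels yield wider steering angles and hence a wider field of view."""
    return math.degrees(math.asin(wavelength_m / (2.0 * pixel_pitch_m)))

# 532 nm light: an 8-micron-pitch digital SLM versus a sub-micron
# effective pitch typical of acousto-optic modulation.
print(round(max_diffraction_half_angle_deg(532e-9, 8e-6), 2))    # ~1.9 degrees
print(round(max_diffraction_half_angle_deg(532e-9, 0.5e-6), 1))  # tens of degrees
```

This is why acousto-optic approaches, whose effective pitch is set by the acoustic wavelength rather than lithographed pixels, can reach much larger diffraction angles than digital SLMs.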
Interactive 3D visualization for theoretical virtual observatories
NASA Astrophysics Data System (ADS)
Dykes, T.; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.
2018-06-01
Virtual observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for analysis and exploration of data sets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via e.g. mock imaging in 2D or volume rendering in 3D. We analyse the current state of 3D visualization for big theoretical astronomical data sets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3D visualization and how it can augment the workflow of users in a virtual observatory context. Finally we showcase a lightweight client-server visualization tool for particle-based data sets, allowing quantitative visualization via data filtering, highlighting two example use cases within the Theoretical Astrophysical Observatory.
NASA Astrophysics Data System (ADS)
Abercrombie, S. P.; Menzies, A.; Goddard, C.
2017-12-01
Virtual and augmented reality enable scientists to visualize environments that are very difficult, or even impossible to visit, such as the surface of Mars. A useful immersive visualization begins with a high quality reconstruction of the environment under study. This presentation will discuss a photogrammetry pipeline developed at the Jet Propulsion Laboratory to reconstruct 3D models of the surface of Mars using stereo images sent back to Earth by the Curiosity Mars rover. The resulting models are used to support a virtual reality tool (OnSight) that allows scientists and engineers to visualize the surface of Mars as if they were standing on the red planet. Images of Mars present challenges to existing scene reconstruction solutions. Surface images of Mars are sparse with minimal overlap, and are often taken from extremely different viewpoints. In addition, the specialized cameras used by Mars rovers are significantly different than consumer cameras, and GPS localization data is not available on Mars. This presentation will discuss scene reconstruction with an emphasis on coping with limited input data, and on creating models suitable for rendering in virtual reality at high frame rate.
Virtual Environments: Issues and Opportunities for Researching Inclusive Educational Practices
NASA Astrophysics Data System (ADS)
Sheehy, Kieron
This chapter argues that virtual environments offer new research areas for those concerned with inclusive education. Further, it proposes that they also present opportunities for developing increasingly inclusive research processes. The chapter considers how researchers might approach researching some of these affordances. It discusses the relationship between specific features of inclusive pedagogy, derived from an international systematic literature review, and the affordances of different forms of virtual characters and environments. Examples are drawn from research in Second LifeTM (SL), virtual tutors and augmented reality. In doing this, the chapter challenges a simplistic notion of isolated physical and virtual worlds and, in the context of inclusion, of separation between the practice of research and the research topic itself. There are a growing number of virtual worlds in which identified educational activities are taking place, or whose activities are being noted for their educational merit. These encompass non-themed worlds such as SL and Active Worlds, game-based worlds such as World of Warcraft and Runescape, and even Club Penguin, a themed virtual world where younger players interact through a variety of penguin-themed environments and activities. It has been argued that these spaces, outside traditional education, can offer pedagogical insights (Twining 2009), and these global virtual communities have been identified as useful creative educational environments (Delwiche 2006; Sheehy 2009). This chapter explores how researchers might use these spaces to investigate and create inclusive educational experiences for learners. In order to do this, the chapter considers three interrelated issues: What is inclusive education? How might inclusive education influence virtual world research? And what might inclusive education look like in virtual worlds?
Human computer confluence applied in healthcare and rehabilitation.
Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen
2012-01-01
Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual and augmented reality, and it offers great potential for applications in medicine and rehabilitation.
Perceptual Issues of Augmented and Virtual Environments
2007-07-01
distinct preference for one eye over the other. This is typically, quickly, and easily found through sighting tests (Peli, 1990). This eye dominance...been researched extensively and for different purposes. The entertainment industry has also experimented with synthetic smell production, in the...form of accompanying smells to enhance the experience of films (Lefcowitz, 2001, Somerson, 2001). In the Aroma Rama and the Smell -o-vision systems
ERIC Educational Resources Information Center
Lee, Mark J. W.; Nikolic, Sasha; Vial, Peter J.; Ritz, Christian H.; Li, Wanqing; Goldfinch, Tom
2016-01-01
Project-based learning is a widely used pedagogical strategy in engineering education shown to be effective in fostering problem-solving, design, and teamwork skills. There are distinct benefits to be gained from giving students autonomy in determining the nature and scope of the projects that they wish to undertake, but a lack of expert guidance…
ERIC Educational Resources Information Center
Borrero, A. Mejias; Marquez, J. M. Andujar
2012-01-01
Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real…
ERIC Educational Resources Information Center
Chatham-Carpenter, April
2017-01-01
As students begin using more virtual and augmented reality tools for education and entertainment, they may come to expect instructors to use similar types of technology in their teaching (Kelly, 2016). Instructors may choose to employ this type of technology in the future, to create more of a real-life feel in the online classroom. The effects of…
NASA Astrophysics Data System (ADS)
Mejías Borrero, A.; Andújar Márquez, J. M.
2012-10-01
Lab practices are an essential part of teaching in Engineering. However, traditional laboratory lessons developed in classroom labs (CL) must be adapted to teaching and learning strategies that go far beyond the common concept of e-learning, in the sense that completely virtualized distance education disconnects teachers and students from the real world, which can generate specific problems in laboratory classes. Current proposals for virtual labs (VL) and remote labs (RL) neither cover new needs properly nor contribute remarkable improvements over traditional labs, except that they favor distance training. Therefore, online teaching and learning in lab practices demand a further step beyond current VL and RL. This paper proposes a new reality and new teaching/learning concepts in the field of lab practices in engineering. The developed augmented reality-based lab system (augmented remote lab, ARL) enables teachers and students to work remotely (Internet/intranet) in current CL, including virtual elements which interact with real ones. An educational experience was conducted to assess the developed ARL with the participation of a group of 10 teachers and another group of 20 students. Both groups completed lab practices covering content from the subjects Digital Systems and Robotics and Industrial Automation, which belong to the second year of the new degree in Electronic Engineering (adapted to the European Space for Higher Education). The labs were carried out in three different ways: CL, VL and ARL. After completion, both groups were asked to fill in questionnaires aimed at measuring the improvement contributed by ARL relative to CL and VL. Except in some specific questions, the opinions of teachers and students were rather similar and positive regarding the use and possibilities of ARL. Although the results are still preliminary and need further study, they suggest that ARL remarkably improves on the possibilities of current VL and RL.
Furthermore, ARL appears to offer more possibilities when used online than traditional laboratory lessons completed in CL.
Plechawska, Małgorzata; Polańska, Joanna
2009-01-01
This article presents a method for processing mass spectrometry data. Mass spectra are modelled with Gaussian Mixture Models. Every peak of the spectrum is represented by a single Gaussian, whose parameters describe the location, height and width of the corresponding peak. An original implementation of the Expectation-Maximisation algorithm was used to perform all calculations. Errors were estimated with a virtual mass spectrometer. The discussed tool was originally designed to generate a set of spectra with defined parameters.
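To illustrate the peak model described above (one Gaussian per peak, parameterized by location, height, and width), here is a minimal sample-based EM for a 1-D mixture. This is a generic sketch, not the authors' implementation: real spectra are intensity-weighted signals, which would need a weighted E-step.

```python
import math
import random

def em_gmm_1d(xs, k=2, iters=100):
    """Minimal EM for a 1-D Gaussian mixture. Each component has a weight,
    mean, and standard deviation, analogous to the height, location, and
    width of a single spectral peak."""
    xs_sorted = sorted(xs)
    # Initialize means at evenly spaced quantiles, widths from the data range.
    mus = [xs_sorted[int((i + 0.5) * len(xs) / k)] for i in range(k)]
    sigmas = [(xs_sorted[-1] - xs_sorted[0]) / (2.0 * k)] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        # (the common 1/sqrt(2*pi) factor cancels in the normalization).
        resp = []
        for x in xs:
            ps = [w * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                  for w, m, s in zip(weights, mus, sigmas)]
            z = sum(ps) or 1e-300  # guard against underflow
            resp.append([p / z for p in ps])
        # M-step: re-estimate weight, mean, and width of every component
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / len(xs)
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            sigmas[j] = max(1e-6, math.sqrt(
                sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, xs)) / nj))
    return weights, mus, sigmas

# Synthetic "spectrum": samples drawn from two well-separated peaks.
rng = random.Random(1)
data = [rng.gauss(100.0, 2.0) for _ in range(300)] + \
       [rng.gauss(160.0, 3.0) for _ in range(300)]
weights, mus, sigmas = em_gmm_1d(data)
print(sorted(round(m, 1) for m in mus))
```

The fitted means recover the two peak locations near 100 and 160; the abstract's "virtual mass spectrometer" plays the role of the synthetic data generator here.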
Augmentation of heat and mass transfer in laminar flow of suspensions: A correlation of data
NASA Astrophysics Data System (ADS)
Ahuja, Avtar S.
1980-01-01
Experimental data from the literature on the augmentation of heat and gas transport in the laminar flow of suspensions of polystyrene spheres have been correlated on common coordinates. The correlation includes the influences of particle size, tube diameter and length, shear rate of flow, transport properties of the diffusing species (heat or gas) in the suspending liquids, and particle interactions on the augmentation of heat or gas transfer in flowing suspensions.
A Second Life for eHealth: Prospects for the Use of 3-D Virtual Worlds in Clinical Psychology
Gaggioli, Andrea; Vigna, Cinzia; Riva, Giuseppe
2008-01-01
The aim of the present paper is to describe the role played by three-dimensional (3-D) virtual worlds in eHealth applications, addressing some potential advantages and issues related to the use of this emerging medium in clinical practice. Due to the enormous diffusion of the World Wide Web (WWW), telepsychology, and telehealth in general, have become accepted and validated methods for the treatment of many different health care concerns. The introduction of the Web 2.0 has facilitated the development of new forms of collaborative interaction between multiple users based on 3-D virtual worlds. This paper describes the development and implementation of a form of tailored immersive e-therapy called p-health whose key factor is interreality, that is, the creation of a hybrid augmented experience merging physical and virtual worlds. We suggest that compared with conventional telehealth applications such as emails, chat, and videoconferences, the interaction between real and 3-D virtual worlds may convey greater feelings of presence, facilitate the clinical communication process, positively influence group processes and cohesiveness in group-based therapies, and foster higher levels of interpersonal trust between therapists and patients. However, challenges related to the potentially addictive nature of such virtual worlds and questions related to privacy and personal safety will also be discussed. PMID:18678557
Seamless 3D interaction for virtual tables, projection planes, and CAVEs
NASA Astrophysics Data System (ADS)
Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III
2000-08-01
The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered, unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations to the application of virtual reality (VR) technology to time-critical applications as well as deployment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally, such scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high degree of flexibility with respect to the system requirements (display and I/O devices), as well as the ability to seamlessly and intuitively switch between different interaction modalities, is sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments, focusing on the Virtual Table visual input device. It introduces new paradigms for 3D interaction in virtual environments (VE) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing on possible use cases for defense applications.
Teaching Basic Field Skills Using Screen-Based Virtual Reality Landscapes
NASA Astrophysics Data System (ADS)
Houghton, J.; Robinson, A.; Gordon, C.; Lloyd, G. E. E.; Morgan, D. J.
2016-12-01
We are using screen-based virtual reality landscapes, created with the Unity 3D game engine, to augment the training geoscience students receive in preparing for fieldwork. Students explore these landscapes as they would real ones, interacting with virtual outcrops to collect data, determine location, and map the geology. Skills for conducting field geological surveys, such as collecting, plotting and interpreting data, time management, and decision making, are introduced interactively and intuitively. As with real landscapes, the virtual landscapes are open-ended terrains with embedded data. This means the game does not structure student interaction with the information; it is through experience that students learn the best methods to work successfully and efficiently. These virtual landscapes are not replacements for geological fieldwork but rather virtual spaces between classroom and field in which to train and reinforce essential skills. Importantly, these virtual landscapes offer accessible parallel provision for students unable to visit, or fully partake in visiting, the field. The project has received positive feedback from both staff and students. Results show students find it easier to focus on learning these basic field skills in a classroom, rather than a field setting, and make the same mistakes as when learning in the field, validating the realistic nature of the virtual experience and providing an opportunity to learn from these mistakes. The approach also saves time, and therefore resources, in the field as basic skills are already embedded. 70% of students report increased confidence with how to map boundaries and 80% have found the virtual training a useful experience. We are also developing landscapes based on real places with 3D photogrammetric outcrops, and a virtual urban landscape in which Engineering Geology students can conduct a site investigation.
This project is a collaboration between the University of Leeds and Leeds College of Art, UK, and all our virtual landscapes are freely available online at www.see.leeds.ac.uk/virtual-landscapes/.
Lohnes, Karen; Quebbemann, Neil R; Liu, Kate; Kobzeff, Fred; Loo, Joseph A; Ogorzalek Loo, Rachel R
2016-07-15
The virtual two-dimensional gel electrophoresis/mass spectrometry (virtual 2D gel/MS) technology combines the premier, high-resolution capabilities of 2D gel electrophoresis with the sensitivity and high mass accuracy of mass spectrometry (MS). Intact proteins separated by isoelectric focusing (IEF) gel electrophoresis are imaged from immobilized pH gradient (IPG) polyacrylamide gels (the first dimension of classic 2D-PAGE) by matrix-assisted laser desorption/ionization (MALDI) MS. Obtaining accurate intact masses from sub-picomole-level proteins embedded in 2D-PAGE gels or in IPG strips is desirable to elucidate how the protein of one spot identified as protein 'A' on a 2D gel differs from the protein of another spot identified as the same protein, whenever tryptic peptide maps fail to resolve the issue. This task, however, has been extremely challenging. Virtual 2D gel/MS provides access to these intact masses. Modifications to our matrix deposition procedure improve the reliability with which IPG gels can be prepared; the new procedure is described. Development of this MALDI MS imaging (MSI) method for high-throughput MS with integrated 'top-down' MS to elucidate protein isoforms from complex biological samples is described, and it is demonstrated that a 4-cm IPG gel segment can now be imaged in approximately 5 min. Gel-wide chemical and enzymatic methods with further interrogation by MALDI MS/MS provide identifications, sequence-related information, and post-translational/transcriptional modification information. The MSI-based virtual 2D gel/MS platform may potentially link the benefits of 'top-down' and 'bottom-up' proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.
Bergmann, Jeannine; Krewer, Carmen; Bauer, Petra; Koenig, Alexander; Riener, Robert; Müller, Friedemann
2018-06-01
Active performance is crucial for motor learning, and, together with motivation, is believed to be associated with a better rehabilitation outcome. Virtual reality (VR) is an innovative approach to engage and motivate patients during training. There is promising evidence for its efficiency in retraining upper limb function. However, there is insufficient proof for its effectiveness in gait training. To evaluate the acceptability of robot-assisted gait training (RAGT) with and without VR and the feasibility of potential outcome measures to guide the planning of a larger randomized controlled trial (RCT). Single-blind randomized controlled pilot trial with two parallel arms. Rehabilitation hospital. Twenty subacute stroke patients (64±9 years) with a Functional Ambulation Classification (FAC) ≤2. Twelve sessions (over 4 weeks) of either VR-augmented RAGT (intervention group) or standard RAGT (control group). Acceptability of the interventions (drop-out rate, questionnaire), patients' motivation (Intrinsic Motivation Inventory [IMI], individual mean walking time), and feasibility of potential outcome measures (completion rate and response to interventions) were determined. We found high acceptability of repetitive VR-augmented RAGT. The drop-out rate was 1/11 in the intervention and 4/14 in the control group. Patients of the intervention group spent significantly more time walking in the robot than the control group (per session and total walking time; P<0.03). In both groups, motivation measured with the IMI was high over the entire intervention period. The felt pressure and tension significantly decreased in the intervention group (P<0.01) and was significantly lower than in the control group at the last therapy session (r=-0.66, P=0.005). The FAC is suggested as a potential primary outcome measure for a definitive RCT, as it could be assessed in all patients and showed significant response to interventions (P<0.01). 
We estimated a sample size of 44 for a future RCT. VR-augmented RAGT resulted in high acceptability and motivation, and in a reduced drop-out rate and an extended training time compared to standard RAGT. This pilot trial provides guidance for a prospective RCT on the effectiveness of VR-augmented RAGT. VR might be a promising approach to enrich and improve gait rehabilitation after stroke.
A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note.
Abe, Yuichiro; Sato, Shigenobu; Kato, Koji; Hyakumachi, Takahiko; Yanagibashi, Yasushi; Ito, Manabu; Abumi, Kuniyoshi
2013-10-01
Augmented reality (AR) is an imaging technology by which virtual objects are overlaid onto images of real objects captured in real time by a tracking camera. This study aimed to introduce a novel AR guidance system called virtual protractor with augmented reality (VIPAR) to visualize a needle trajectory in 3D space during percutaneous vertebroplasty (PVP). The AR system used for this study comprised a head-mounted display (HMD) with a tracking camera and a marker sheet. An augmented scene was created by overlaying the preoperatively generated needle trajectory path onto a marker detected on the patient using AR software, thereby providing the surgeon with augmented views in real time through the HMD. The accuracy of the system was evaluated by using a computer-generated simulation model in a spine phantom and also evaluated clinically in 5 patients. In the 40 spine phantom trials, the error of the insertion angle (EIA), defined as the difference between the attempted angle and the insertion angle, was evaluated using 3D CT scanning. Computed tomography analysis of the 40 spine phantom trials showed that the EIA in the axial plane significantly improved when VIPAR was used compared with when it was not used (0.96° ± 0.61° vs 4.34° ± 2.36°, respectively). The same held true for the EIA in the sagittal plane (0.61° ± 0.70° vs 2.55° ± 1.93°, respectively). In the clinical evaluation of the AR system, 5 patients with osteoporotic vertebral fractures underwent VIPAR-guided PVP from October 2011 to May 2012. The postoperative EIA was evaluated using CT. The clinical results of the 5 patients showed that the EIA in all 10 needle insertions was 2.09° ± 1.3° in the axial plane and 1.98° ± 1.8° in the sagittal plane. There was no pedicle breach or leakage of polymethylmethacrylate. VIPAR was successfully used to assist in needle insertion during PVP by providing the surgeon with an ideal insertion point and needle trajectory through the HMD. 
The findings indicate that AR guidance technology can become a useful assistive device during spine surgeries requiring percutaneous procedures.
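The insertion-angle error used above (the difference between the attempted and achieved needle angles, summarized per plane as mean ± SD) can be sketched in a few lines. The trial angles below are hypothetical illustrations, not data from the study.

```python
import math

def insertion_angle_error(attempted_deg, achieved_deg):
    """Absolute difference between attempted and achieved insertion angles (degrees)."""
    return abs(attempted_deg - achieved_deg)

def mean_and_sd(errors):
    """Mean and sample standard deviation (n-1 denominator) of a list of errors."""
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean, math.sqrt(var)

# Hypothetical axial-plane trials: (attempted, achieved) angles in degrees.
trials = [(20.0, 20.8), (15.0, 14.1), (25.0, 26.2), (18.0, 17.5)]
errors = [insertion_angle_error(a, b) for a, b in trials]
mean, sd = mean_and_sd(errors)
```

Repeating the same computation on angles projected into the sagittal plane yields the second reported figure.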
An optical tracking system for virtual reality
NASA Astrophysics Data System (ADS)
Hrimech, Hamid; Merienne, Frederic
2009-03-01
In this paper we present a low-cost 3D tracking system which we have developed and tested in order to move away from traditional 2D interaction techniques (keyboard and mouse) in an attempt to improve the user's experience while using a CVE. The tracking system is used to implement 3D interaction techniques that augment the user experience, promote the user's sense of transportation into the virtual world, and increase the user's awareness of their partners. It is a passive optical tracking system based on stereoscopy, a technique that allows the reconstruction of three-dimensional information from a pair of images. We have currently deployed our 3D tracking system on a collaborative research platform for investigating 3D interaction techniques in CVEs.
Real-Time View Correction for Mobile Devices.
Schops, Thomas; Oswald, Martin R; Speciale, Pablo; Yang, Shuoran; Pollefeys, Marc
2017-11-01
We present a real-time method for rendering novel virtual camera views from given RGB-D (color and depth) data of a different viewpoint. Missing color and depth information due to incomplete input or disocclusions is efficiently inpainted in a temporally consistent way. The inpainting takes the location of strong image gradients into account as likely depth discontinuities. We present our method in the context of a view correction system for mobile devices, and discuss how to obtain a screen-camera calibration and options for acquiring depth input. Our method has use cases in both augmented and virtual reality applications. We demonstrate the speed of our system and the visual quality of its results in multiple experiments in the paper as well as in the supplementary video.
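A minimal 1D sketch in the spirit of the gradient-aware inpainting described above: missing depth is filled by propagating the last valid value, but propagation stops at strong color gradients, which are treated as likely depth discontinuities. The threshold and data are illustrative assumptions, not the paper's actual method.

```python
def inpaint_depth_row(depth, color, grad_thresh=30):
    """Fill missing depth values (None) in a 1D row by propagating the last
    valid depth from the left; stop at strong color gradients, treated as
    likely depth edges, so depth does not bleed across object boundaries."""
    out = list(depth)
    last = None
    for i in range(len(out)):
        if out[i] is not None:
            last = out[i]
        else:
            # A strong color step suggests a depth discontinuity: reset.
            if i > 0 and abs(color[i] - color[i - 1]) > grad_thresh:
                last = None
            if last is not None:
                out[i] = last
    return out

# Hypothetical row: a hole spanning a color edge is only partially filled.
depth = [1.0, 1.0, None, None, None, 3.0]
color = [100, 100, 102, 180, 181, 182]
filled = inpaint_depth_row(depth, color)
```

The real method operates on full 2D images and additionally enforces temporal consistency across frames; this sketch only shows the gradient-stopping idea.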
Rothbaum, Barbara Olasov; Price, Matthew; Jovanovic, Tanja; Norrholm, Seth D.; Gerardi, Maryrose; Dunlop, Boadie; Davis, Michael; Bradley, Bekh; Duncan, Erica; Rizzo, Albert “Skip”; Ressler, Kerry J.
2014-01-01
Objective To determine the effectiveness of Virtual Reality Exposure (VRE) augmented with D-cycloserine (50 mg) or alprazolam (0.25 mg), compared to placebo, in reducing PTSD due to military trauma in Iraq and Afghanistan. Method A double-blind, placebo-controlled randomized clinical trial comparing augmentation methods for VRE for subjects (n = 156) with PTSD was conducted. Results PTSD symptoms significantly improved from pre- to post-treatment over the 6-session VRE treatment (p<.001) across all conditions and were maintained at 3, 6, and 12 months follow-up. There were no overall differences between the D-cycloserine and placebo groups on symptoms at any time point. The alprazolam and placebo conditions significantly differed on the post-treatment Clinician Administered PTSD scale (p = .006) and the 3-month post-treatment PTSD diagnosis, such that the alprazolam group showed greater rates of PTSD (79.2% alprazolam vs. 47.8% placebo). Between-session extinction learning was a treatment-specific enhancer of outcome for the D-cycloserine group only (p<.005). At post-treatment, the D-cycloserine group was the lowest on cortisol reactivity (p<.05) and startle response during VR scenes (p<.05). Conclusions A small number of VRE sessions were associated with reduced PTSD diagnosis and symptoms in Iraq/Afghanistan veterans, although there was no control condition for the VRE. Overall, there was no advantage of D-cycloserine on PTSD symptoms in primary analyses. In secondary analyses, benzodiazepine use during treatment may impair recovery, and D-cycloserine may enhance VRE in patients who demonstrate within-session learning. D-cycloserine augmentation treatment in PTSD patients may reduce cortisol and startle reactivity compared to the alprazolam and placebo treatment, consistent with the animal literature. PMID:24743802
Tang, Rui; Ma, Long-Fei; Rong, Zhi-Xia; Li, Mo-Dan; Zeng, Jian-Ping; Wang, Xue-Dong; Liao, Hong-En; Dong, Jia-Hong
2018-04-01
Augmented reality (AR) technology is used to reconstruct three-dimensional (3D) images of hepatic and biliary structures from computed tomography and magnetic resonance imaging data, and to superimpose the virtual images onto a view of the surgical field. In liver surgery, these superimposed virtual images help the surgeon to visualize intrahepatic structures and therefore to operate precisely and to improve clinical outcomes. The keywords "augmented reality", "liver", "laparoscopic" and "hepatectomy" were used for searching publications in the PubMed database. The primary source of literature was peer-reviewed journals up to December 2016. Additional articles were identified by manual search of references found in the key articles. In general, AR technology mainly includes 3D reconstruction, display, registration, and tracking techniques, and has gradually been adopted for liver surgeries including laparoscopy and laparotomy, with video-based AR-assisted laparoscopic resection as the main technical application. By applying AR technology, blood vessels and tumor structures in the liver can be displayed during surgery, which permits precise navigation during complex surgical procedures. Liver deformation and registration errors during surgery were the main factors limiting the application of AR technology. With recent advances, AR technologies have the potential to improve hepatobiliary surgical procedures. However, additional clinical studies will be required to evaluate AR as a tool for reducing postoperative morbidity and mortality and for the improvement of long-term clinical outcomes. Future research is needed in the fusion of multiple imaging modalities, improving biomechanical liver modeling, and enhancing image data processing and tracking technologies to increase the accuracy of current AR methods.
Model-based video segmentation for vision-augmented interactive games
NASA Astrophysics Data System (ADS)
Liu, Lurng-Kuo
2000-04-01
This paper presents an architecture and algorithms for model-based video object segmentation and its application to vision-augmented interactive games. We are especially interested in real-time, low-cost, vision-based applications that can be implemented in software on a PC. We use different models for the background and the player object. The segmentation algorithm operates at two levels: the pixel level and the object level. At the pixel level, segmentation is formulated as a maximum a posteriori (MAP) probability problem; the statistical likelihood of each pixel is calculated and used in the MAP problem. Object-level segmentation improves segmentation quality by utilizing information about the spatial and temporal extent of the object. The concept of an active region, defined from a motion histogram and trajectory prediction, is introduced to indicate the likelihood that a region belongs to the video object, for both background and foreground modeling; it also reduces the overall computational complexity. In contrast with other applications, the proposed video object segmentation system is able to create background and foreground models on the fly, even without introductory background frames. Furthermore, we apply different rates of self-tuning to the scene model so that the system can adapt to the environment when there is a scene change. We applied the proposed segmentation algorithms to several prototype vision-augmented interactive games, in which a player can immerse himself or herself inside a game and virtually interact with other animated characters in real time without being constrained by helmets, gloves, special sensing devices, or the background environment. Potential applications of the proposed algorithms include human-computer gesture interfaces and object-based video coding such as MPEG-4.
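The pixel-level MAP formulation described above can be illustrated with a minimal sketch: each pixel is assigned the label that maximizes likelihood × prior under a background and a foreground model. The Gaussian intensity models and their parameters here are illustrative assumptions, not the paper's actual models.

```python
import math

def gaussian_likelihood(x, mean, sd):
    """Likelihood of intensity x under a Gaussian intensity model."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def map_label(pixel, bg_model, fg_model, prior_fg=0.5):
    """Label a pixel by the larger posterior, posterior ∝ likelihood × prior."""
    p_fg = gaussian_likelihood(pixel, *fg_model) * prior_fg
    p_bg = gaussian_likelihood(pixel, *bg_model) * (1.0 - prior_fg)
    return 'foreground' if p_fg > p_bg else 'background'

# Illustrative models: dark background (mean 30), brighter player (mean 150).
bg = (30.0, 20.0)
fg = (150.0, 40.0)
labels = [map_label(x, bg, fg) for x in (25, 35, 140, 200)]
```

The paper's object-level pass would then refine such per-pixel decisions using the spatial and temporal extent of the detected object.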
Combining 3D structure of real video and synthetic objects
NASA Astrophysics Data System (ADS)
Kim, Man-Bae; Song, Mun-Sup; Kim, Do-Kyoon
1998-04-01
This paper presents a new approach to combining real video and synthetic objects. The purpose of this work is to apply the proposed technology in fields such as advanced animation, virtual reality, and games. Computer graphics has long been used in these fields; recently, some applications have added real video to graphic scenes to augment the realism that computer graphics alone lacks. This approach, called augmented or mixed reality, can produce a more realistic environment than the use of computer graphics alone. Our approach differs from virtual reality and augmented reality in that computer-generated graphic objects are combined with a 3D structure extracted from monocular image sequences. The extraction of the 3D structure requires the estimation of 3D depth followed by the construction of a height map; graphic objects are then combined with the height map. Our proposed approach is carried out in the following steps: (1) we derive the 3D structure from test image sequences by estimating depth and constructing a height map (given the contents of the test sequence, the height map represents the 3D structure); (2) the height map is modeled by Delaunay triangulation or a Bezier surface, and each planar surface is texture-mapped; (3) finally, graphic objects are combined with the height map. Because the 3D structure of the height map is already known, step (3) is straightforward. Following this procedure, we produced an animation video demonstrating the combination of the 3D structure and graphic models. Users can navigate the realistic 3D world whose associated image is rendered on the display monitor.
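As a rough illustration of the height-map modeling step, the sketch below triangulates a regular-grid height map by splitting each cell into two triangles. This is a simplified stand-in for the Delaunay triangulation or Bezier surfaces the paper uses, and the grid values are hypothetical.

```python
def triangulate_height_map(heights):
    """Triangulate a regular-grid height map: each grid cell becomes two triangles.

    heights: 2D list of z-values; vertex (r, c) sits at (x=c, y=r, z=heights[r][c]).
    Returns (vertices, triangles), triangles as triples of vertex indices.
    """
    rows, cols = len(heights), len(heights[0])
    vertices = [(c, r, heights[r][c]) for r in range(rows) for c in range(cols)]
    idx = lambda r, c: r * cols + c
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            # Split the cell (r,c)-(r+1,c+1) along one diagonal.
            triangles.append((idx(r, c), idx(r, c + 1), idx(r + 1, c)))
            triangles.append((idx(r, c + 1), idx(r + 1, c + 1), idx(r + 1, c)))
    return vertices, triangles

# A hypothetical 3x3 height map: 9 vertices, 8 triangles.
verts, tris = triangulate_height_map([[0, 1, 0], [1, 2, 1], [0, 1, 0]])
```

Each resulting planar triangle can then be texture-mapped, and synthetic objects placed using the known height at any (x, y).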
Embedded Augmented Reality Training System for Dynamic Human-Robot Cooperation
2009-10-01
through (OST) head-mounted displays (HMDs) still lack in usability and ergonomics because of their size, weight, resolution, and the hard-to-realize...with addressable focal planes [10], for example. Accurate and easy-to-use calibration routines for OST HMDs remain a challenging task; established...methods are based on matching of virtual over real objects [11]; newer approaches use cameras looking directly through the HMD optics to exploit both
Towards surgeon-authored VR training: the scene-development cycle.
Dindar, Saleh; Nguyen, Thien; Peters, Jörg
2016-01-01
Enabling surgeon-educators to themselves create virtual reality (VR) training units promises greater variety, specialization, and relevance of the units. This paper describes a software bridge that semi-automates the scene-generation cycle, a key bottleneck in authoring, modeling, and developing VR units. Augmenting an open source modeling environment with physical behavior attachment and collision specifications yields single-click testing of the full force-feedback enabled anatomical scene.
Skill Transfer and Virtual Training for IND Response Decision-Making: Project Summary and Next Steps
2016-01-01
to be very productive partners—independent video-game developers and academic game degree programs—are not familiar with working with government...experimental validation. • Independent Video-Game Developers. Small companies and individuals that pursue video-game design and development can be inexpensive...Emergency Management Agency (FEMA) that examines alternative mechanisms for training and evaluation of emergency managers (EMs) to augment and
Wasko, Michael J; Pellegrene, Kendy A; Madura, Jeffry D; Surratt, Christopher K
2015-01-01
Hundreds of millions of U.S. dollars are invested in the research and development of a single drug. Lead compound development is an area ripe for new design strategies. Therapeutic lead candidates have been traditionally found using high-throughput in vitro pharmacological screening, a costly method for assaying thousands of compounds. This approach has recently been augmented by virtual screening (VS), which employs computer models of the target protein to narrow the search for possible leads. A variant of VS is fragment-based drug design (FBDD), an emerging in silico lead discovery method that introduces low-molecular weight fragments, rather than intact compounds, into the binding pocket of the receptor model. These fragments serve as starting points for "growing" the lead candidate. Current efforts in virtual FBDD within central nervous system (CNS) targets are reviewed, as is a recent rule-based optimization strategy in which new molecules are generated within a 3D receptor-binding pocket using the fragment as a scaffold. This process not only places special emphasis on creating synthesizable molecules but also exposes computational questions worth addressing. Fragment-based methods provide a viable, relatively low-cost alternative for therapeutic lead discovery and optimization that can be applied to CNS targets to augment current design strategies.
Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye
2016-01-01
This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively. PMID:27271840
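The axis-aligned bounding-box collision detection used in the robotic arm and CNC modules reduces to a per-axis interval-overlap check. A minimal sketch, with hypothetical box extents rather than geometry from the study:

```python
def aabb_overlap(box_a, box_b):
    """Axis-aligned bounding-box overlap test.

    Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z)).
    Two boxes collide iff their extents overlap on every axis.
    """
    (a_min, a_max), (b_min, b_max) = box_a, box_b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

# Hypothetical tool and workpiece boxes (same units, e.g. mm).
tool = ((0, 0, 0), (10, 10, 10))
part = ((5, 5, 5), (20, 20, 20))
far = ((30, 30, 30), (40, 40, 40))
```

The test is cheap enough to run every simulation step, which is why AABBs are a common broad-phase check before finer collision geometry.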
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in the various visualization and interaction modes.
Martin Salzmann-Erikson, R.N.; Henrik Eriksson, R.N.T.
2011-01-01
Earlier research shows that breast augmentation is positively correlated with positive psychological states. The aim of this study was to explore the shared values, feelings, and thoughts within the culture of breast enlargement among women visiting Internet-based forums when considering and/or undergoing esthetic plastic surgery. The study used a netnographic method for gathering and analyzing data. The findings show that the women used the Internet forum to provide emotional support to other women. Through electronic postings, they cared for and nursed each other's anxiety and feelings throughout the whole process. Another central issue was that the women's relationships were frequently discussed: specifically, their relationships with themselves, their environment, and their surgeons. The findings suggest that Internet forums represent a channel through which posters can share values, feelings, and thoughts both as agents of action and as objects of action. These dual positions and the medium endow the women with a virtual nursing competence that would otherwise be unavailable. By introducing the concept of torrenting as a means of sharing important self-care information, the authors provide a concept that can be further explored in relation to postmodern self-care strategies within contemporary nursing theories and practice. PMID:22053162
Miragall, Marta; Baños, Rosa M.; Cebolla, Ausiàs; Botella, Cristina
2015-01-01
This study examines the psychometric properties of the Working Alliance Inventory-Short (WAI-S) adaptation to Virtual Reality (VR) and Augmented Reality (AR) therapies (WAI-VAR). The relationship between the therapeutic alliance (TA) with VR and AR and clinically significant change (CSC) is also explored. Seventy-five patients took part in this study (74.7% women, mean age = 34.41). Fear of flying and adjustment disorder patients received VR therapy, and cockroach phobia patients received AR therapy. Psychometric properties, CSC, one-way ANOVA, Spearman's correlations, and multiple regression were calculated. The WAI-VAR showed a unidimensional structure, high internal consistency, and adequate convergent validity. "Not changed" patients scored lower on the WAI-VAR than "improved" and "recovered" patients. The correlation between the WAI-VAR and CSC was moderate. The best-fitting model for predicting CSC was a linear combination of the TA with the therapist (WAI-S) and the TA with VR and AR (WAI-VAR), as the latter variable slightly increased the percentage of variability accounted for in CSC. The WAI-VAR is the first validated instrument to measure the TA with VR and AR in research and clinical practice. This study reveals the importance of the quality of the TA with technologies in achieving positive outcomes in therapy. PMID:26500589
Review of high energy diffraction in real and virtual photon-proton scattering at HERA
NASA Astrophysics Data System (ADS)
Wolf, G.
2010-11-01
The electron-proton collider HERA at DESY opened the door for the study of diffraction in real and virtual photon-proton scattering at centre-of-mass energies W up to 250 GeV and for large negative mass squared -Q² of the virtual photon up to Q² = 1600 GeV². At W = 220 GeV and Q² = 4 GeV², diffraction accounts for about 15% of the total virtual photon-proton cross section, decreasing to ≈5% at Q² = 200 GeV². An overview of the results obtained by the experiments H1 and ZEUS on the production of neutral vector mesons and on inclusive diffraction up to the year 2008 is presented.
Pose tracking for augmented reality applications in outdoor archaeological sites
NASA Astrophysics Data System (ADS)
Younes, Georges; Asmar, Daniel; Elhajj, Imad; Al-Harithy, Howayda
2017-01-01
In recent years, agencies around the world have invested huge amounts of effort toward digitizing many aspects of the world's cultural heritage. Of particular importance is the digitization of outdoor archaeological sites. In the spirit of valorization of this digital information, many groups have developed virtual or augmented reality (AR) computer applications themed around a particular archaeological object. The problem of pose tracking in outdoor AR applications is addressed. Different positional systems are analyzed, resulting in the selection of a monocular camera-based user tracker. The limitations that challenge this technique from map generation, scale, anchoring, to lighting conditions are analyzed and systematically addressed. Finally, as a case study, our pose tracking system is implemented within an AR experience in the Byblos Roman theater in Lebanon.
A see through future: augmented reality and health information systems.
Monkman, Helen; Kushniruk, Andre W
2015-01-01
Augmented Reality (AR) is a method whereby virtual objects are superimposed on the real world. AR technology is becoming increasingly accessible and affordable, and it has many potential health applications. This paper discusses current research on AR health applications such as medical education and medical practice. Some of the potential future uses for this technology (e.g., health information systems, consumer health applications) are also presented, along with a discussion of some of the usability and human factors challenges associated with AR in healthcare. It is expected that AR will become increasingly prevalent in healthcare; however, further investigation is required to demonstrate that AR applications provide benefits over traditional methods. Moreover, AR applications must be thoroughly tested to ensure they do not introduce new errors into practice or compromise patient safety.
Visualizing DOM super-spectrum covariance in vanKrevelen space
NASA Astrophysics Data System (ADS)
Fatland, D. R.; Kalawe, J.; Stubbins, A.; Spencer, R. G.; Sleighter, R. L.; Abdulla, H. A.; Dittmar, T.
2011-12-01
We investigate the fate of terrigenous dissolved organic matter (DOM) exported to the coastal marine environment. Many methods (fluorescence, FT-ICR-MS, NMR, 13C, lignin, etc.) help characterize this DOM. We define a 'super spectrum' as an amalgamation of analyses into a data stack, and we search for physically significant patterns therein, beginning with covariance across 31 samples from six circum-Arctic rivers (the Ob, Kolyma, Mackenzie, Yukon, Lena, and Yenisey), each sampled five times throughout the year. A van Krevelen diagram is a convenient way to view the distributions of molecules provided by Fourier Transform Ion Cyclotron Resonance Mass Spectrometry (FT-ICR-MS). We augment this distribution space in the vertical dimension, for example to show peak height, molecular mass, principal component weighting, or covariance. We use Worldwide Telescope, a virtual globe with strong data support from Microsoft Research, to explore covariance results along 3+ dimensions (adding brightness, color, and a parameter slider). The results show interesting covariance, e.g. between molecules and PARAFAC peaks, a step towards fluorophore and cohort identification in the terrigenous DOM spectrum. Given the geoscience explosion in data volume and complexity, we feel these results should survive beyond the end point of a journal article. We are building a cloud-based library on the Microsoft Azure platform to support this and subsequent analyses, enabling data and methods to carry over and benefit other research groups and objectives.
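The covariance plotted in the augmented van Krevelen space is the usual sample covariance computed across samples. A minimal sketch; the peak and PARAFAC intensities below are hypothetical values, not data from the study.

```python
def covariance(xs, ys):
    """Sample covariance (n-1 denominator) between two aligned series,
    e.g. one molecular peak's intensity and one PARAFAC component's
    loading, each measured across the same set of river samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

# Hypothetical intensities across five samples.
peak = [1.0, 2.0, 3.0, 4.0, 5.0]
parafac = [2.0, 4.0, 6.0, 8.0, 10.0]
cov = covariance(peak, parafac)
```

Computing this for every FT-ICR-MS peak against a chosen PARAFAC component gives one covariance value per molecule, which can then be rendered as the vertical dimension over the H/C vs. O/C plane.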
Ribozyme-mediated signal augmentation on a mass-sensitive biosensor.
Knudsen, Scott M; Lee, Joonhyung; Ellington, Andrew D; Savran, Cagri A
2006-12-20
Mass-based detection methods such as the quartz crystal microbalance (QCM) offer an attractive alternative to label-based methods; however, their sensitivity is generally lower by comparison. In particular, low-molecular-weight analytes can be difficult to detect based on mass addition alone. In this communication, we present the use of effector-dependent ribozymes (aptazymes) as reagents for augmenting small-ligand detection on a mass-sensitive device. Two distinct aptazymes were chosen: an L1-ligase-based aptazyme (L1-Rev), which is activated by a small peptide (MW approximately 2.4 kDa) from the HIV-1 Rev protein, and a hammerhead cleavase-based aptazyme (HH-theo3) activated by theophylline (MW = 180 Da). Aptazyme activity was observed in real time, and low-molecular-weight analyte detection was successfully demonstrated with both aptazymes.
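For context on why small analytes are hard to detect by mass alone, QCM sensitivity is commonly estimated with the Sauerbrey relation. This is standard background, not part of the authors' aptazyme scheme, which augments the mass signal rather than changing this relation.

```python
import math

# Sauerbrey relation for a rigid film on a QCM (standard approximation).
RHO_Q = 2.648    # quartz density, g/cm^3
MU_Q = 2.947e11  # quartz shear modulus, g/(cm*s^2)

def sauerbrey_mass(delta_f_hz, f0_hz, area_cm2):
    """Adsorbed mass (g) for a frequency decrease delta_f_hz (positive)."""
    c_f = 2.0 * f0_hz**2 / math.sqrt(RHO_Q * MU_Q)  # Hz per (g/cm^2)
    return delta_f_hz * area_cm2 / c_f

# A 5 MHz crystal is sensitive to ~56.6 Hz per ug/cm^2, so a 180 Da ligand
# such as theophylline must bind in huge numbers to shift the frequency.
mass = sauerbrey_mass(delta_f_hz=56.6, f0_hz=5e6, area_cm2=1.0)
print(f"{mass * 1e6:.2f} ug")  # 1.00 ug
```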
Haptic feedback in OP:Sense - augmented reality in telemanipulated robotic surgery.
Beyl, T; Nicolai, P; Mönnich, H; Raczkowksy, J; Wörn, H
2012-01-01
In current research, haptic feedback in robot-assisted interventions plays an important role. However, most approaches to haptic feedback only consider mapping the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated into our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore enriches the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from preoperatively generated medical images, or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest and helps reduce the risk of damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS systems.
Virtual reality and brain computer interface in neurorehabilitation
Dahdah, Marie; Driver, Simon; Parsons, Thomas D.; Richter, Kathleen M.
2016-01-01
The potential benefit of technology to enhance recovery after central nervous system injuries is an area of increasing interest and exploration. The primary emphasis to date has been motor recovery/augmentation and communication. This paper introduces two original studies to demonstrate how advanced technology may be integrated into subacute rehabilitation. The first study addresses the feasibility of brain computer interface with patients on an inpatient spinal cord injury unit. The second study explores the validity of two virtual environments with acquired brain injury as part of an intensive outpatient neurorehabilitation program. These preliminary studies support the feasibility of advanced technologies in the subacute stage of neurorehabilitation. These modalities were well tolerated by participants and could be incorporated into patients' inpatient and outpatient rehabilitation regimens without schedule disruptions. This paper expands the limited literature base regarding the use of advanced technologies in the early stages of recovery for neurorehabilitation populations and speaks favorably to the potential integration of brain computer interface and virtual reality technologies as part of a multidisciplinary treatment program. PMID:27034541
NASA Astrophysics Data System (ADS)
Rahman, Hameedur; Arshad, Haslina; Mahmud, Rozi; Mahayuddin, Zainal Rasyid
2017-10-01
The number of breast cancer patients who require breast biopsy has increased over the past years. Augmented reality guided core biopsy of the breast has become the method of choice for researchers. However, this cancer visualization has been limited to superimposing the 3D imaging data only. In this paper, we introduce an augmented reality visualization framework that enables breast cancer biopsy image guidance by using an X-ray vision technique on a mobile display. The framework consists of four phases: it first acquires images from CT/MRI and processes the medical images into 3D slices; second, it refines these 3D grayscale slices into a 3D breast tumor model using a 3D model reconstruction technique. In the visualization processing, this virtual 3D breast tumor model is enhanced using the X-ray vision technique to see through the skin of the phantom, and the final composition is displayed on a handheld device to optimize the accuracy of the visualization in six degrees of freedom. The framework is perceived as an improved visualization experience because the augmented reality X-ray vision allows direct understanding of the breast tumor beyond the visible surface and direct guidance toward accurate biopsy targets.
Projector-Based Augmented Reality for Quality Inspection of Scanned Objects
NASA Astrophysics Data System (ADS)
Kern, J.; Weinmann, M.; Wursthorn, S.
2017-09-01
After scanning or reconstructing the geometry of objects, we need to inspect the result of our work. Are there any parts missing? Is every detail covered in the desired quality? We typically do this by looking at the resulting point clouds or meshes of our objects on-screen. What if we could see the information directly visualized on the object itself? Augmented reality is the generic term for bringing virtual information into our real environment. In our paper, we show how we can project any 3D information, such as thematic visualizations or specific monitoring information with reference to our object, onto the object's surface itself, thus augmenting it with additional information. For small objects that could, for instance, be scanned in a laboratory, we propose a low-cost method involving a projector-camera system to solve this task. The user only needs a calibration board with coded fiducial markers to calibrate the system and to estimate the projector's pose later on for projecting textures with information onto the object's surface. Changes within the projected 3D information or of the projector's pose are applied in real time. Our results clearly show that such a simple setup delivers a good quality of augmented information.
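The projection step described above can be sketched with a plain pinhole model: a projector is an inverse camera, so once its intrinsics K and pose [R|t] are known, every 3D point on the object maps to a projector pixel. The intrinsics and pose below are assumed placeholder values, not the paper's calibrated system.

```python
import numpy as np

# Assumed projector intrinsics (focal lengths and principal point, pixels).
K = np.array([[1400.0,    0.0, 960.0],
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                  # assumed pose: no rotation ...
t = np.array([0.0, 0.0, 1.0])  # ... object 1 m in front of the projector

def project(points_3d):
    """Map Nx3 object-space points to Nx2 projector pixel coordinates."""
    cam = points_3d @ R.T + t   # world -> projector coordinates
    pix = cam @ K.T             # apply intrinsics
    return pix[:, :2] / pix[:, 2:3]  # perspective divide

pts = np.array([[0.0, 0.0, 0.0],   # point on the optical axis
                [0.1, 0.0, 0.0]])  # point offset 10 cm along x
print(project(pts))
```

Rendering the information texture from this virtual viewpoint is what makes the projected overlay line up with the physical surface; re-estimating [R|t] from the fiducial markers keeps it aligned as the projector moves.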
Developing an Augmented Reality Environment for Earth Science Education
NASA Astrophysics Data System (ADS)
Pratt, M. J.; Skemer, P. A.; Arvidson, R. E.
2017-12-01
The emerging field of augmented reality (AR) provides new and exciting ways to explore geologic phenomena for research and education. The primary advantage of AR is that it allows users to physically explore complex three-dimensional structures that were previously inaccessible, for example a remote geologic outcrop or a mineral structure at the atomic scale. AR is used, for example, with the OnSight software during tactical operations to plan the Mars Curiosity rover's traverses, providing virtual views for walking through the terrain and around the rover at true scale. This mode of physical exploration allows users more freedom to investigate and understand the 3D structure than is possible on a flat computer screen, or within a static PowerPoint presentation during a classroom lecture. The Microsoft HoloLens headset provides the most advanced mobile AR platform currently available to developers. The Fossett Laboratory for Virtual Planetary Exploration at Washington University in St. Louis has applied this technology, coupled with photogrammetric software and the Unity 3D gaming engine, to develop photorealistic environments of 3D geologic outcrops from around the world. The untethered HoloLens provides an ideal platform for a classroom setting, as it allows for shared experiences of the holograms of interest, projecting them in the same location for all users to explore. Furthermore, the HoloLens allows for face-to-face communication during use, a feature important in teaching that virtual reality does not allow. Our development of an AR application includes the design of an online database of photogrammetric outcrop models curated for the current limitations of AR technology. This database will be accessible to those wishing to submit models and free to those wishing to use the application for teaching, outreach, or research purposes.
Olfactory Stimuli Increase Presence in Virtual Environments
Munyan, Benson G.; Neer, Sandra M.; Beidel, Deborah C.; Jentsch, Florian
2016-01-01
Background Exposure therapy (EXP) is the most empirically supported treatment for anxiety and trauma-related disorders. EXP consists of repeated exposure to a feared object or situation in the absence of the feared outcome in order to extinguish associated anxiety. Key to the success of EXP is presenting the feared object/event/situation in as much detail as possible and utilizing as many sensory modalities as possible, in order to augment the sense of presence during exposure sessions. Various technologies have been used to augment the exposure therapy process by presenting multi-sensory cues (e.g., sights, smells, sounds). Studies have shown that scents can elicit emotionally charged memories, but no prior research has examined the effect of olfactory stimuli on the patient's sense of presence during simulated exposure tasks. Methods Sixty adult participants navigated a mildly anxiety-producing virtual environment (VE) similar to those used in the treatment of anxiety disorders. Participants had no autobiographical memory associated with the VE. State anxiety, presence ratings, and electrodermal activity (EDA) were collected throughout the experiment. Results Utilizing a Bonferroni-corrected linear mixed model, our results showed statistically significant relationships between olfactory stimuli and presence as assessed by both the Igroup Presence Questionnaire (IPQ: R2 = 0.85, F(3,52) = 6.625, p = 0.0007) and a single-item visual-analogue scale (R2 = 0.85, F(3,52) = 5.382, p = 0.0027). State anxiety was unaffected by the presence or absence of olfactory cues. EDA was unaffected by experimental condition. Conclusion Olfactory stimuli increase presence in virtual environments that approximate those typical of exposure therapy, but did not increase EDA. Additionally, once administered, the removal of scents resulted in a disproportionate decrease in presence. Implications for incorporating the use of scents to increase the efficacy of exposure therapy are discussed. PMID:27310253
Vroom: designing an augmented environment for remote collaboration in digital cinema production
NASA Astrophysics Data System (ADS)
Margolis, Todd; Cornish, Tracy
2013-03-01
As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile, Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large format interactive display surfaces, video teleconferencing and spatialized audio built on a high-speed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production.
This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.
White, Ian; Buchberg, Brian; Tsikitis, V Liana; Herzig, Daniel O; Vetto, John T; Lu, Kim C
2014-06-01
Colorectal cancer is the second most common cause of cancer-related death in the USA. The need for screening colonoscopies, and thus adequately trained endoscopists, particularly in rural areas, is on the rise. Recent increases in the endoscopic cases required for surgical resident graduation by the Surgery Residency Review Committee (RRC) further emphasize the need for more effective endoscopic training during residency. The aim of this study was to determine whether a virtual reality colonoscopy simulator enhances surgical resident endoscopic education by detecting improvement in colonoscopy skills before and after 6 weeks of formal clinical endoscopic training. We conducted a retrospective review of prospectively collected surgery resident data on an endoscopy simulator. Residents performed four different clinical scenarios on the endoscopic simulator before and after a 6-week endoscopic training course. Data were collected over a 5-year period from 94 different residents performing a total of 795 colonoscopic simulation scenarios. Main outcome measures included time to cecal intubation, "red out" time, and severity of simulated patient discomfort (mild, moderate, severe, extreme) during colonoscopy scenarios. Average time to intubation of the cecum was 6.8 min for residents who had not undergone endoscopic training versus 4.4 min for those who had (p < 0.001). For residents who could be compared against themselves (pre- vs. post-training), cecal intubation times decreased from 7.1 to 4.3 min (p < 0.001). Post-endoscopy rotation residents caused less severe discomfort during simulated colonoscopy than pre-endoscopy rotation residents (4 vs. 10%; p = 0.004). Virtual reality endoscopic simulation is an effective tool both for augmenting surgical resident endoscopy cancer education and for measuring improvement in resident performance after formal clinical endoscopic training.
Davis, Matthew Christopher; Can, Dang D; Pindrik, Jonathan; Rocque, Brandon G; Johnston, James M
2016-02-01
Technology allowing a remote, experienced surgeon to provide real-time guidance to local surgeons has great potential for training and capacity building in medical centers worldwide. Virtual interactive presence and augmented reality (VIPAR), an iPad-based tool, allows surgeons to provide long-distance, virtual assistance wherever a wireless internet connection is available. Local and remote surgeons view a composite image of video feeds at each station, allowing for intraoperative telecollaboration in real time. Local and remote stations were established in Ho Chi Minh City, Vietnam, and Birmingham, Alabama, as part of ongoing neurosurgical collaboration. Endoscopic third ventriculostomy with choroid plexus coagulation with VIPAR was used for subjective and objective evaluation of system performance. VIPAR allowed both surgeons to engage in complex visual and verbal communication during the procedure. Analysis of 5 video clips revealed a video delay of 237 milliseconds (range, 93-391 milliseconds) relative to the audio signal. Excellent image resolution allowed the remote neurosurgeon to visualize all critical anatomy. The remote neurosurgeon could gesture to structures with no detectable difference in accuracy between stations, allowing for submillimeter precision. Fifteen endoscopic third ventriculostomy with choroid plexus coagulation procedures have been performed with the use of VIPAR between Vietnam and the United States, with no significant complications. Eighty percent of these patients remain shunt-free. Evolving technologies that allow long-distance intraoperative guidance and knowledge transfer hold great potential for highly efficient international neurosurgical education. VIPAR is one example of an inexpensive, scalable platform for increasing global neurosurgical capacity. Efforts to create a network of Vietnamese neurosurgeons who use VIPAR for collaboration are underway. Copyright © 2016 Elsevier Inc. All rights reserved.
Perform light and optic experiments in Augmented Reality
NASA Astrophysics Data System (ADS)
Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan; Javahiraly, Nicolas; Israel, Kai
2015-10-01
In many scientific studies, lens experiments are part of the curriculum. The conducted experiments are meant to give students a basic understanding of the laws of optics and their applications. Most of the experiments need special hardware such as an optical bench, light sources, apertures, and different lens types. Therefore it is not possible for the students to conduct any of the experiments outside the university's laboratory. Simple optical software simulators enabling students to virtually perform lens experiments already exist, but they are mostly desktop or web browser based. Augmented reality (AR) is a special case of mediated and mixed reality concepts, where computers are used to add, subtract, or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality can readily be used to visualize a simulated optical bench. Students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index, and the positions of the instruments in space. Light rays can be visualized to promote an additional understanding of the laws of optics. An AR application like this is ideally suited to prepare for the actual laboratory sessions and/or recap the teaching content. The authors will present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
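A simulator of the kind described would rest on the standard thin-lens relations connecting the properties students can modify (curvature, refractive index) to the resulting image. This is a minimal sketch with illustrative values, not the authors' application:

```python
# Standard thin-lens optics, as a virtual optical bench would compute them.

def lensmaker_focal_length(n, r1, r2):
    """Thin lens in air: 1/f = (n - 1) * (1/R1 - 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

def image_distance(f, d_object):
    """Thin-lens imaging equation 1/f = 1/d_o + 1/d_i, solved for d_i."""
    return 1.0 / (1.0 / f - 1.0 / d_object)

# A symmetric biconvex lens (R1 = +100 mm, R2 = -100 mm) of n = 1.5 glass
# has a 100 mm focal length; an object at 300 mm images at about 150 mm.
f = lensmaker_focal_length(n=1.5, r1=100.0, r2=-100.0)
print(f)  # 100.0 (mm)
print(image_distance(f, 300.0))  # about 150 (mm)
```

Changing the lens curvature or refractive index in the AR scene amounts to re-evaluating these two functions and redrawing the ray paths.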
Besharati Tabrizi, Leila; Mahvash, Mehran
2015-07-01
An augmented reality system has been developed for image-guided neurosurgery to project images with regions of interest onto the patient's head, skull, or brain surface in real time. The aim of this study was to evaluate system accuracy and to perform the first intraoperative application. Images of segmented brain tumors in different localizations and sizes were created in 10 cases and were projected onto a head phantom using a video projector. Registration was performed using 5 fiducial markers. After each registration, the distance of the 5 fiducial markers from the visualized tumor borders was measured on the virtual image and on the phantom; the difference was considered the projection error. Moreover, the image projection technique was applied intraoperatively in 5 patients and compared with a standard navigation system. Augmented reality visualization of the tumors succeeded in all cases. The mean time for registration was 3.8 minutes (range 2-7 minutes). The mean projection error was 0.8 ± 0.25 mm. There were no significant differences in accuracy according to the localization and size of the tumor. Clinical feasibility and reliability of the augmented reality system were demonstrated intraoperatively in 5 patients (projection error 1.2 ± 0.54 mm). The augmented reality system is accurate and reliable for the intraoperative projection of images onto the head, skull, and brain surface. The ergonomic advantage of this technique improves the planning of neurosurgical procedures and enables the surgeon to use direct visualization for image-guided neurosurgery.
Rehm, Imogen C.; Foenander, Emily; Wallace, Klaire; Abbott, Jo-Anne M.; Kyrios, Michael; Thomas, Neil
2016-01-01
In the burgeoning field of e-mental health interventions, avatars are increasingly being utilized to facilitate online communication between clients and therapists, and among peers. Avatars are digital self-representations, which enable individuals to interact with each other in computer-based virtual environments. In this narrative review, we examine the psychotherapeutic applications of avatars that have been investigated and trialed to date. Five key applications were identified (1) in the formation of online peer support communities; (2) replicating traditional modes of psychotherapy by using avatars as a vehicle to communicate within a wholly virtual environment; (3) using avatar technology to facilitate or augment face-to-face treatment; (4) as part of serious games; and (5) communication with an autonomous virtual therapist. Across these applications, avatars appeared to serve several functions conducive to treatment engagement by (1) facilitating the development of a virtual therapeutic alliance; (2) reducing communication barriers; (3) promoting treatment-seeking through anonymity; (4) promoting expression and exploration of client identity; and (5) enabling therapists to control and manipulate treatment stimuli. Further research into the feasibility and ethical implementation of avatar-based psychotherapies is required. PMID:27917128
Virtual reality and hallucination: a technoetic perspective
NASA Astrophysics Data System (ADS)
Slattery, Diana R.
2008-02-01
Virtual Reality (VR), especially in a technologically focused discourse, is defined by a class of hardware and software, among them head-mounted displays (HMDs), navigation and pointing devices, and stereoscopic imaging. This presentation examines the experiential aspect of VR. Putting "virtual" in front of "reality" modifies the ontological status of a class of experience: that of "reality." Reality has also been modified [by artists, new media theorists, technologists and philosophers] as augmented, mixed, simulated, artificial, layered, and enhanced. Modifications of reality are closely tied to modifications of perception. Media theorist Roy Ascott creates a model of three "VRs": Verifiable Reality, Virtual Reality, and Vegetal (entheogenically induced) Reality. The ways in which we shift our perceptual assumptions, create and verify illusions, and enter "the willing suspension of disbelief" that allows us entry into imaginal worlds are central to the experience of VR worlds, whether those worlds are explicitly representational (robotic manipulations by VR) or explicitly imaginal (VR artistic creations). The early rhetoric surrounding VR was interwoven with psychedelics, a perception amplified by Timothy Leary's presence on the historic SIGGRAPH panel and the Wall Street Journal's tag of VR as "electronic LSD." This paper discusses the philosophical, social-historical, and psychological-perceptual connections between these two domains.
Weisman, David
2010-01-01
Face-to-face bioinformatics courses commonly include a weekly, in-person computer lab to facilitate active learning, reinforce conceptual material, and teach practical skills. Similarly, fully online bioinformatics courses employ hands-on exercises to achieve these outcomes, although students typically perform this work offsite. Combining a face-to-face lecture course with a web-based virtual laboratory presents new opportunities for collaborative learning of the conceptual material, and for fostering peer support of technical bioinformatics questions. To explore this combination, an in-person lecture-only undergraduate bioinformatics course was augmented with a remote web-based laboratory, and tested with a large class. This study hypothesized that the collaborative virtual lab would foster active learning and peer support, and tested this hypothesis by conducting a student survey near the end of the semester. Respondents broadly reported strong benefits from the online laboratory, and strong benefits from peer-provided technical support. In comparison with traditional in-person teaching labs, students preferred the virtual lab by a factor of two. Key aspects of the course architecture and design are described to encourage further experimentation in teaching collaborative online bioinformatics laboratories. Copyright © 2010 International Union of Biochemistry and Molecular Biology, Inc.
Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback
NASA Astrophysics Data System (ADS)
Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve
2011-03-01
Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.
An optical brain computer interface for environmental control.
Ayaz, Hasan; Shewokis, Patricia A; Bunce, Scott; Onaral, Banu
2011-01-01
A brain computer interface (BCI) is a system that translates neurophysiological signals detected from the brain to supply input to a computer or to control a device. Volitional control of neural activity and its real-time detection through neuroimaging modalities are key constituents of BCI systems. The purpose of this study was to develop and test a new BCI design that utilizes intention-related cognitive activity within the dorsolateral prefrontal cortex using functional near infrared (fNIR) spectroscopy. fNIR is a noninvasive, safe, portable and affordable optical technique with which to monitor hemodynamic changes in the brain's cerebral cortex. Because of its portability and ease of use, fNIR is amenable to deployment in ecologically valid natural working environments. We integrated a control paradigm in a computerized 3D virtual environment to augment interactivity. Ten healthy participants volunteered for a two-day study in which they navigated a virtual environment with keyboard inputs, but were required to use the fNIR-BCI for interaction with virtual objects. Results showed that participants consistently utilized the fNIR-BCI with an overall success rate of 84% and volitionally increased their cerebral oxygenation level to trigger actions within the virtual environment.
Haddad, Tarek; Himes, Adam; Thompson, Laura; Irony, Telba; Nair, Rajesh
2017-01-01
Evaluation of medical devices via clinical trial is often a necessary step in the process of bringing a new product to market. In recent years, device manufacturers are increasingly using stochastic engineering models during the product development process. These models have the capability to simulate virtual patient outcomes. This article presents a novel method based on the power prior for augmenting a clinical trial using virtual patient data. To properly inform clinical evaluation, the virtual patient model must simulate the clinical outcome of interest, incorporating patient variability, as well as the uncertainty in the engineering model and in its input parameters. The number of virtual patients is controlled by a discount function which uses the similarity between modeled and observed data. This method is illustrated by a case study of cardiac lead fracture. Different discount functions are used to cover a wide range of scenarios in which the type I error rates and power vary for the same number of enrolled patients. Incorporation of engineering models as prior knowledge in a Bayesian clinical trial design can provide benefits of decreased sample size and trial length while still controlling type I error rate and power.
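The power-prior mechanics described above can be sketched as follows. The normal endpoint, the exponential discount function, and all sample values are illustrative assumptions, not the article's actual trial design; the key idea is that virtual patients enter the analysis downweighted by a factor alpha in [0, 1] derived from model-data agreement.

```python
import numpy as np

def discount(observed, virtual):
    """Map the standardized mean difference between observed and virtual
    data to a borrowing weight in [0, 1] (illustrative discount function)."""
    z = abs(observed.mean() - virtual.mean()) / np.sqrt(
        observed.var(ddof=1) / len(observed)
        + virtual.var(ddof=1) / len(virtual))
    return float(np.exp(-z))  # more disagreement -> smaller weight

def power_prior_posterior_mean(observed, virtual, alpha):
    """Posterior mean of a normal mean under a flat initial prior: the
    virtual cohort counts as alpha * len(virtual) effective patients."""
    n_eff = len(observed) + alpha * len(virtual)
    return (observed.sum() + alpha * virtual.sum()) / n_eff

rng = np.random.default_rng(1)
obs = rng.normal(0.0, 1.0, 50)   # enrolled patients (synthetic)
vir = rng.normal(0.2, 1.0, 200)  # virtual patients from a slightly biased model

a = discount(obs, vir)
est = power_prior_posterior_mean(obs, vir, a)
```

A flatter discount function borrows more virtual patients (smaller trials, shorter duration) at the cost of type I error inflation when the engineering model is biased, which is the trade-off the article's scenarios explore.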
Yang, Yea-Ru; Tsai, Meng-Pin; Chuang, Tien-Yow; Sung, Wen-Hsu; Wang, Ray-Yau
2008-08-01
This single-blind randomized controlled trial examined the effect of virtual reality-based training on community ambulation in individuals with stroke. Twenty subjects with stroke were assigned randomly to either the control group (n=9) or the experimental group (n=11). Subjects in the control group received treadmill training. Subjects in the experimental group underwent virtual reality-based treadmill training. Walking speed, community walking time, the walking ability questionnaire (WAQ), and the activities-specific balance confidence (ABC) scale were evaluated. Subjects in the experimental group improved significantly in walking speed, community walking time, and WAQ score at the posttraining and 1-month follow-up periods. Their ABC score also increased significantly at posttraining, but this gain was not maintained at the follow-up period. Regarding the between-group comparisons, the experimental group improved significantly more than the control group in walking speed (P=0.03) and community walking time (P=0.04) at the posttraining period and in WAQ score (P=0.03) at the follow-up period. Our results support the benefits of gait training programs that incorporate virtual reality to augment community ambulation in individuals with stroke.
van Duren, B H; Sugand, K; Wescott, R; Carrington, R; Hart, A
2018-05-01
Hip fractures contribute to a significant clinical burden globally with over 1.6 million cases per annum and up to 30% mortality rate within the first year. Insertion of a dynamic hip screw (DHS) is a frequently performed procedure to treat extracapsular neck of femur fractures. Poorly performed DHS fixation of extracapsular neck of femur fractures can result in poor mobilisation, chronic pain, and increased cut-out rate requiring revision surgery. A realistic, affordable, and portable fluoroscopic simulation system can improve performance metrics in trainees, including the tip-apex distance (the only clinically validated outcome), and improve outcomes. We developed a digital fluoroscopic imaging simulator using orthogonal cameras to track coloured markers attached to the guide-wire which created a virtual overlay on fluoroscopic images of the hip. To test the accuracy with which the augmented reality system could track a guide-wire, a standard workshop femur was used to calibrate the system with a positional marker fixed to indicate the apex; this allowed for comparison between guide-wire tip-apex distance (TAD) calculated by the system to be compared to that physically measured. Tests were undertaken to determine: (1) how well the apex could be targeted; (2) the accuracy of the calculated TAD. (3) The number of iterations through the algorithm giving the optimal accuracy-time relationship. The calculated TAD was found to have an average root mean square error of 4.2 mm. The accuracy of the algorithm was shown to increase with the number of iterations up to 20 beyond which the error asymptotically converged to an error of 2 mm. This work demonstrates a novel augmented reality simulation of guide-wire insertion in DHS surgery. To our knowledge this has not been previously achieved. 
In contrast to virtual reality, augmented reality is able to simulate fluoroscopy while allowing the trainee to interact with real instrumentation and to perform the procedure on workshop bone models. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.
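The tip-apex distance used as the validated metric above is conventionally computed (per the standard Baumgaertner definition) by summing the tip-to-apex distances measured on the AP and lateral radiographs, each corrected for magnification using the known true guide-wire diameter. A minimal sketch of that standard calculation, not the authors' implementation; all measurement values in the example are hypothetical:

```python
def tip_apex_distance(d_ap_mm, d_lat_mm, true_wire_mm, wire_ap_mm, wire_lat_mm):
    """Tip-apex distance: sum of tip-to-apex distances on the AP and
    lateral views, each de-magnified via the known guide-wire diameter."""
    tad_ap = d_ap_mm * (true_wire_mm / wire_ap_mm)    # AP view, corrected
    tad_lat = d_lat_mm * (true_wire_mm / wire_lat_mm)  # lateral view, corrected
    return tad_ap + tad_lat

# Hypothetical measurements: 15 mm and 12 mm apparent distances, with a
# 3.2 mm guide-wire imaged at 3.6 mm (AP) and 4.0 mm (lateral).
tad = tip_apex_distance(15.0, 12.0, 3.2, 3.6, 4.0)  # about 22.9 mm
```

A TAD above roughly 25 mm is the commonly cited clinical threshold associated with increased cut-out risk, which is why simulator accuracy of a few millimetres matters.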
Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface
Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea
2017-01-01
A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source. PMID:28961198
NASA Astrophysics Data System (ADS)
Goma, Sergio R.
2015-03-01
In current times, mobile technologies are ubiquitous and the complexity of problems is continuously increasing. In the context of the advancement of engineering, we explore in this paper possible reasons that could cause a saturation in technology evolution, namely the ability to solve problems based on previous results and the ability to express solutions in a more efficient way. We conclude that 'thinking outside of the brain', as in solving engineering problems that are expressed in a virtual medium due to their complexity, would benefit from mobile technology augmentation. This could be the necessary evolutionary step that would provide the efficiency required to solve new complex problems (addressing the 'running out of time' issue) and remove the barrier to communicating results (addressing the human 'perception/expression imbalance' issue). Some consequences are discussed: in this context, artificial intelligence becomes an automation aid rather than a necessary next evolutionary step. The paper concludes that research in modeling as a problem-solving aid and in data visualization as a perception aid, augmented with mobile technologies, could be the path to an evolutionary step in advancing engineering.
Boosting physics education through mobile augmented reality
NASA Astrophysics Data System (ADS)
Crǎciun, Dana; Bunoiu, Mǎdǎlin
2017-12-01
The integration of collaborative applications based on modern learning technologies and the Internet, of various visualization techniques, and of digital strategies into open, flexible, modern learning environments that facilitate access to resources represents a challenge for physics teachers in Romania in general, and for novice teachers in particular. Although large efforts have been made worldwide to invest in educational technologies, their impact on students' learning outcomes is quite modest. In this paper, we describe and analyze various curricular and extracurricular activities specifically designed for and undertaken by pre-service physics teachers. These activities employ a new educational technology, mobile augmented reality (MAR), and are based on modern teaching and learning theories. MAR is an extension of augmented reality to mobile devices: an interactive, real-time combination of real and virtual objects overlaid on the real environment. The results obtained show that pre-service physics teachers are confident in using MAR in their teaching and learning activities, and consider that the activities performed helped them develop the skills necessary for science teachers in a technology-based society and reflect upon the role of technology in the current Romanian educational context.
Augmented reality in medical education?
Kamphuis, Carolien; Barsom, Esther; Schijven, Marlies; Christoph, Noor
2014-09-01
Learning in the medical domain is to a large extent workplace learning and involves mastery of complex skills that require performance up to professional standards in the work environment. Since training in this real-life context is not always possible for reasons of safety, costs, or didactics, alternative ways are needed to achieve clinical excellence. Educational technology and more specifically augmented reality (AR) has the potential to offer a highly realistic situated learning experience supportive of complex medical learning and transfer. AR is a technology that adds virtual content to the physical real world, thereby augmenting the perception of reality. Three examples of dedicated AR learning environments for the medical domain are described. Five types of research questions are identified that may guide empirical research into the effects of these learning environments. Up to now, empirical research mainly appears to focus on the development, usability and initial implementation of AR for learning. Limited review results reflect the motivational value of AR, its potential for training psychomotor skills and the capacity to visualize the invisible, possibly leading to enhanced conceptual understanding of complex causality.
Kin, Taichi; Nakatomi, Hirofumi; Shono, Naoyuki; Nomura, Seiji; Saito, Toki; Oyama, Hiroshi; Saito, Nobuhito
2017-10-15
Simulation and planning of surgery using a virtual reality model is becoming common with advances in computer technology. In this study, we conducted a literature search to find trends in virtual simulation of surgery for brain tumors. A MEDLINE search for "neurosurgery AND (simulation OR virtual reality)" retrieved a total of 1,298 articles published in the past 10 years. After eliminating studies designed solely for education and training purposes, 28 articles about clinical application remained. The finding that the vast majority of the articles were about education and training rather than clinical applications suggests that several issues need to be addressed before surgical simulation can be applied clinically. In addition, 10 of the 28 articles were from Japanese groups. In general, the 28 articles demonstrated clinical benefits of virtual surgical simulation. Simulation was particularly useful in better understanding complicated spatial relations of anatomical landmarks and in examining surgical approaches. In some studies, virtual reality models were used with either a surgical navigation system or augmented reality technology, which projects virtual reality images onto the operating field. Reported problems were difficulties in standardized, objective evaluation of surgical simulation systems; inability to respond to tissue deformation caused by surgical maneuvers; absence of system functionality to reflect tissue features (e.g., hardness and adhesion); and many problems with image processing. Descriptions of image processing tended to be insufficient, indicating that the level of evidence, risk of bias, precision, and reproducibility need to be addressed for further advances and ultimately for full clinical application.
On-Board GPS Clock Monitoring for Signal Integrity
2010-11-01
to-alert requirements to permit primary reliance for safety-of-life applications. Augmentation systems are being developed and deployed to address... virtually no false alerts from the combined system. With three running, on-board AFSs, occasional breaks of the error threshold can be allowed if the... system can be assured of transfer to another AFS within a period shorter than the required TTA. With only two AFSs on board running and measured
2012-06-01
virtual augmentation of the antenna size to the equivalent distance travelled by the...
2006-06-01
allowing substantial see-around capability. Regions of visual suppression due to binocular rivalry (luning) are shown along the shaded flanks of... that the visual suppression of binocular rivalry, luning (Velger, 1998, p. 56-58), associated with the partial overlap conditions did not materially... tags were displayed. Thus, the frequency of conflicting binocular contours was reduced. In any case, luning does not seem to introduce major
Human Factors Issues in the Use of Virtual and Augmented Reality for Military Purposes - USA
2005-12-01
and provide a means of output, MOVES has built a prototype system and continues research into the artificial intelligence and other factors required... role in any attempt to create automaton warriors. Indeed game-theoretic notions have been utilized in applications of artificial intelligence to... Review Board at the Defense Intelligence Agency (DIA). AFRL was notified that DIA will sponsor DTNG for Certification and Accreditation. Det 4 is expected
Lam, Chee Kiang; Sundaraj, Kenneth; Sulaiman, Mohd Nazri; Qamarruddin, Fazilawati A
2016-06-14
Computer-based surgical training is believed to be capable of providing a controlled virtual environment in which medical professionals can conduct standardized training or new experimental procedures on virtual human body parts, generated and visualised three-dimensionally on a digital display unit. The main objective of this study was to conduct virtual phacoemulsification cataract surgery to compare the performance of users with different proficiency on a virtual reality platform equipped with a visual guidance system and a set of performance parameters. Ten experienced ophthalmologists and six medical residents were invited to perform the virtual surgery of the four main phacoemulsification cataract surgery procedures: 1) corneal incision (CI), 2) capsulorhexis (C), 3) phacoemulsification (P), and 4) intraocular lens implantation (IOL). Each participant was required to perform the complete phacoemulsification cataract surgery using the simulator for three consecutive trials (a standardized 30-min session). The performance of the participants during the three trials was supported using a visual guidance system and evaluated by referring to a set of parameters implemented in the performance evaluation system of the simulator. Subjects with greater experience obtained significantly higher scores in the main procedures: CI1 (p = 0.038), CI2 (p = 0.041), C1 (p = 0.032), P2 (p = 0.035) and IOL1 (p = 0.011). It was also found that experience improved the completion times in all modules: CI4 (p = 0.026), C4 (p = 0.018), P6 (p = 0.028) and IOL4 (p = 0.029). Positive correlation was observed between experience and anti-tremor, C2 (p = 0.026), P3 (p = 0.015), P4 (p = 0.042) and IOL2 (p = 0.048), and similarly with anti-rupture, CI3 (p = 0.013), C3 (p = 0.027), P5 (p = 0.021) and IOL3 (p = 0.041). No significant difference was observed between the groups with regard to P1 (p = 0.077).
Statistical analysis of the results obtained from repetitive trials between the two groups of users reveals that augmented virtual reality (VR) simulators have the potential and capability to be used as a feasible proficiency assessment tool for the complete four main procedures of phacoemulsification cataract surgery (p < 0.05), indicating the construct validity of the modules simulated with augmented visual guidance and assessed through performance parameters.
NASA Astrophysics Data System (ADS)
Qu, Junbo; Yan, Tie; Sun, Xiaofeng; Chen, Ye; Pan, Yi
2017-10-01
With the development of drilling technology toward deeper strata, overflow, and gas cut in particular, occurs frequently, and the flow regime in the wellbore annulus changes from single-phase drilling-fluid flow into gas-liquid two-phase flow. Averaged two-fluid model equations and basic principles of fluid mechanics were used to establish the continuity and momentum conservation equations for the gas and liquid phases. The relationship between the pressure and density of the gas and liquid was introduced to obtain a hyperbolic equation; the expression for the dimensionless eigenvalue of the equation was derived using the method of characteristics, and the wellbore flow regime was analyzed to obtain the critical gas content under different virtual mass force coefficients. Results show that the range of the equation's eigenvalues becomes smaller and smaller as gas content increases. When the gas content reaches the critical point, the dimensionless eigenvalue of the equation has no real solution, and the wellbore flow regime changes from bubble flow to slug flow. When the virtual mass force coefficients are 0.50, 0.60, 0.70 and 0.80, the critical gas contents are 0.32, 0.34, 0.37 and 0.39, respectively. The higher the virtual mass force coefficient, the higher the wellbore gas content corresponding to the critical point of the flow regime transition, which agrees well with previous experimental results. Therefore, whether the dimensionless eigenvalue of the equation has a real solution can be determined from the virtual mass force coefficient and the wellbore gas content, from which the critical condition for the wellbore flow regime transformation can be obtained. This can provide theoretical support for accurate judgment of the annular flow regime.
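The four reported coefficient/critical-content pairs suggest a roughly linear empirical trend. A small illustrative least-squares fit of those reported values only; the actual relation comes from the eigenvalue (hyperbolicity) analysis, and this fit is merely a summary of the quoted numbers:

```python
import numpy as np

# Virtual mass force coefficients and critical gas contents reported above
cvm = np.array([0.50, 0.60, 0.70, 0.80])
alpha_c = np.array([0.32, 0.34, 0.37, 0.39])

# Least-squares line: alpha_c ~ slope * cvm + intercept
slope, intercept = np.polyfit(cvm, alpha_c, 1)  # slope ~ 0.24, intercept ~ 0.199

def critical_gas_content(c):
    """Interpolated critical gas content for a given coefficient."""
    return intercept + slope * c
```

Within the reported range, the fit reproduces the quoted values to within about 0.005, consistent with the stated monotonic increase of critical gas content with the virtual mass force coefficient.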
A novel upper limb rehabilitation system with self-driven virtual arm illusion.
Aung, Yee Mon; Al-Jumaily, Adel; Anam, Khairul
2014-01-01
This paper proposes a novel upper extremity rehabilitation system with a virtual arm illusion. It aims at fast recovery of upper limb functions lost as a result of stroke, providing a novel rehabilitation system for paralyzed patients. The system integrates a number of technologies: Augmented Reality (AR) to develop game-like exercises, computer vision to create the illusion scene, 3D modeling and model simulation, and signal processing to detect user intention via the EMG signal. The effectiveness of the developed system was evaluated via a usability study and questionnaires, with results presented graphically and analytically. The evaluation yielded positive results, indicating that the developed system has potential as an effective rehabilitation system for upper limb impairment.
Design of a 3D Navigation Technique Supporting VR Interaction
NASA Astrophysics Data System (ADS)
Boudoin, Pierre; Otmane, Samir; Mallem, Malik
2008-06-01
Multimodality is a powerful paradigm for increasing the realism and ease of interaction in Virtual Environments (VEs). In particular, the search for new metaphors and techniques for 3D interaction adapted to the navigation task is an important stage in the realization of future 3D interaction systems that support multimodality, in order to increase efficiency and usability. In this paper we propose a new multimodal 3D interaction model called Fly Over, devoted especially to the navigation task. We present a qualitative comparison between Fly Over and a classical navigation technique called gaze-directed steering. Results from a preliminary evaluation on the IBISC semi-immersive Virtual Reality/Augmented Reality EVR@ platform show that Fly Over is a user-friendly and efficient navigation technique.
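Gaze-directed steering, the baseline technique used for comparison above, simply translates the viewpoint along the current view direction at a chosen speed. A minimal sketch of that classical baseline (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def gaze_steer(position, gaze_dir, speed, dt):
    """Gaze-directed steering: advance the viewpoint along the
    normalized gaze direction at a fixed speed for one time step."""
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)  # steering follows where the user looks
    return np.asarray(position, dtype=float) + speed * dt * d

# One 0.1 s step at 1.5 m/s looking straight down +z moves 0.15 m along +z
new_pos = gaze_steer([0.0, 0.0, 0.0], [0.0, 0.0, 2.0], speed=1.5, dt=0.1)
```

The known drawback of this baseline, and a motivation for alternatives like Fly Over, is that view direction and travel direction are coupled, so the user cannot look around while moving.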
Watson, Alice J.; Grant, Richard W.; Bello, Heather; Hoch, Daniel B.
2008-01-01
New technologies, such as online networking tools, offer innovative ways to engage patients in their diabetes care. Second Life (SL) is one such virtual world that allows patients to interact in a 3D environment with peers and healthcare providers. This article presents a framework that demonstrates how applications within SL can be constructed to meet the needs of patients with diabetes, allowing them to attend group visits, learn more about lifestyle changes, and foster a sense of support and emotional well-being. This experiential approach to education may prove more engaging, and therefore successful, than existing strategies. Addressing concerns relating to privacy and liability is a necessary first step to engage providers in this new approach to patient care. PMID:19885247
2D virtual texture on 3D real object with coded structured light
NASA Astrophysics Data System (ADS)
Molinier, Thierry; Fofi, David; Salvi, Joaquim; Gorria, Patrick
2008-02-01
Augmented reality is used to improve color segmentation on the human body or on precious artifacts that must not be touched. We propose a technique to project a synthesized texture onto a real object without contact; it can be used in medical or archaeological applications. By projecting a suitable set of light patterns onto the surface of a 3D real object and capturing images with a camera, a large number of correspondences can be found and the 3D points can be reconstructed. We aim to determine these points of correspondence between cameras and projector from a scene without explicit points and normals. We then project an adjusted texture onto the real object surface. We propose a global and automatic method to virtually texture a 3D real object.
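Coded structured light typically recovers camera-projector correspondences by projecting a stack of binary stripe patterns and decoding, per camera pixel, which projector column illuminated it. A hedged sketch of the decoding step, assuming Gray-coded stripes (the abstract does not specify which code the authors use):

```python
import numpy as np

def decode_gray_patterns(captures, threshold=0.5):
    """captures: (N, H, W) array of normalized camera images of N
    Gray-code stripe patterns, most significant bit first. Returns an
    (H, W) array giving the projector column index seen at each pixel."""
    bits = (np.asarray(captures) > threshold).astype(np.uint32)  # binarize
    # Gray -> binary: b[0] = g[0]; b[i] = b[i-1] XOR g[i]
    binary = np.zeros_like(bits)
    binary[0] = bits[0]
    for i in range(1, bits.shape[0]):
        binary[i] = binary[i - 1] ^ bits[i]
    n = bits.shape[0]
    weights = (1 << np.arange(n - 1, -1, -1)).astype(np.uint32)
    return np.tensordot(weights, binary, axes=1)  # weighted sum of bit planes
```

With the camera calibrated against the projector, each decoded column index yields a correspondence that can be triangulated into a 3D point, after which the synthesized texture can be warped onto the recovered surface.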
Lee, Hanju; Kanakogi, Yasuhiro; Hiraki, Kazuo
2015-01-01
Animated pedagogical agents are lifelike virtual characters designed to augment learning. A review of the developmental psychology literature led to the hypothesis that the temporal contingency of such agents would promote human learning. We developed a Pedagogical Agent with Gaze Interaction (PAGI), an experimental animated pedagogical agent that engages in gaze interaction with students. In this study, university students learned words of a foreign language with either the temporally contingent PAGI (live group) or a recorded version of PAGI (recorded group), which played pre-recorded sequences from live sessions. The results revealed that students in the live group scored considerably better than those in the recorded group. This finding indicates that incorporating the temporal contingency of gaze interaction from a pedagogical agent has a positive effect on learning. PMID:26064584
VRACK: measuring pedal kinematics during stationary bike cycling.
Farjadian, Amir B; Kong, Qingchao; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos
2013-06-01
Ankle impairment and lower limb asymmetries in strength and coordination are common symptoms in individuals with selected musculoskeletal and neurological impairments. The virtual reality augmented cycling kit (VRACK) was designed as a compact mechatronic system for lower limb and mobility rehabilitation. The system measures interaction forces and cardiac activity during cycling in a virtual environment, and kinematics measurement was added to it. Given the constrained problem definition, a combination of an inertial measurement unit (IMU) and Kalman filtering was employed to compute the optimal pedal angular displacement during dynamic cycling exercise. Using a novel benchmarking method, the accuracy of the IMU-based kinematics measurement was evaluated, and relatively accurate angular measurements were achieved. The enhanced VRACK system can serve as a rehabilitation device for monitoring biomechanical and physiological variables during cycling on a stationary bike.
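The fusion step can be illustrated with a complementary filter, a simpler stand-in for the Kalman filtering the authors describe: integrate the gyroscope rate for smooth short-term tracking while pulling the estimate toward the accelerometer-derived angle to bound drift. A sketch under that assumption (not the VRACK implementation):

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98, theta0=0.0):
    """Fuse gyroscope rate (rad/s) with accelerometer-derived angle (rad):
    high-pass the integrated gyro, low-pass the accelerometer estimate."""
    theta = theta0
    estimates = []
    for omega, accel_theta in zip(gyro_rates, accel_angles):
        theta = alpha * (theta + omega * dt) + (1.0 - alpha) * accel_theta
        estimates.append(theta)
    return estimates

# Pedal rotating at a constant 1 rad/s for 1 s, sampled at 100 Hz,
# with a noiseless accelerometer reference angle
dt = 0.01
gyro = [1.0] * 100
accel = [(k + 1) * dt for k in range(100)]
angles = complementary_filter(gyro, accel, dt)  # final estimate ~ 1.0 rad
```

A full Kalman filter additionally weights the two sources by their modeled noise covariances, which matters once real accelerometer vibration during pedaling is present.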
Immersive Environments - A Connectivist Approach
NASA Astrophysics Data System (ADS)
Loureiro, Ana; Bettencourt, Teresa
We are conducting a research project with the aim of achieving better and more efficient ways to facilitate teaching and learning in Higher Level Education. We have chosen virtual environments, with particular emphasis on the Second Life® platform augmented by web 2.0 tools, to develop the study. The Second Life® environment has some interesting characteristics that captured our attention: it is immersive; it is a real-world simulator; it is a social network; it allows real-time communication, cooperation, collaboration and interaction; and it is a safe and controlled environment. We specifically chose tools from web 2.0 that enable sharing and collaborative ways of learning. Through understanding the characteristics of this learning environment, we believe that immersive learning, along with other virtual tools, can be integrated into today's pedagogical practices.
Augmented standard model and the simplest scenario
NASA Astrophysics Data System (ADS)
Wu, Tai Tsun; Wu, Sau Lan
2015-11-01
The experimental discovery of the Higgs particle in 2012 by the ATLAS Collaboration and the CMS Collaboration at CERN ushers in a new era of particle physics. On the basis of these data, scalar quarks and scalar leptons are added to each generation of quarks and leptons. The resulting augmented standard model has fermion-boson symmetry for each of the three generations, but only one Higgs doublet giving masses to all the elementary particles. A specific special case, the simplest scenario, is studied in detail. In this case, there are twenty-six quadratic divergences, and all of these divergences are cancelled provided that one single relation between the masses is satisfied. This mass relation contains a great deal of information; in particular, it determines the masses of all the right-handed scalar quarks and scalar leptons, while giving relations for the masses of the left-handed ones. An alternative procedure is also given, with a different starting point and less reliance on the experimental data. The result is of course the same.
NASA Astrophysics Data System (ADS)
Chen, Jiao-Kai
2018-04-01
We present one reduction of the Bethe-Salpeter equation for the bound states composed of two off-mass-shell constituents. Both the relativistic effects and the virtuality effects can be considered in the obtained spinless virtuality distribution equation. The eigenvalues of the spinless virtuality distribution equation are perturbatively calculated and the bound states e+e-, μ+μ-, τ+τ-, μ+e-, and τ+e- are discussed.
Marker Registration Technique for Handwritten Text Marker in Augmented Reality Applications
NASA Astrophysics Data System (ADS)
Thanaborvornwiwat, N.; Patanukhom, K.
2018-04-01
Marker registration is a fundamental process for estimating camera poses in marker-based Augmented Reality (AR) systems. We developed an AR system that overlays corresponding virtual objects on handwritten text markers. This paper presents a new registration method that is robust to low-content text markers, variation of camera poses, and variation of handwriting styles. The proposed method uses Maximally Stable Extremal Regions (MSER) and polygon simplification for feature point extraction. Experiments show that extracting only five feature points per image provides the best registration results. An exhaustive search is used to find the best matching pattern of the feature points in two images. We also compared the performance of the proposed method with some existing registration methods and found that the proposed method provides better accuracy and time efficiency.
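With only five feature points, the exhaustive search is cheap: there are 5! = 120 orderings, so one can fit a 2D similarity transform to every permutation and keep the ordering with the smallest residual. A sketch of that idea in pure NumPy, treating 2D points as complex numbers so the similarity fit becomes a one-line least-squares problem (the paper's exact cost function is not specified):

```python
import numpy as np
from itertools import permutations

def best_match(src, dst):
    """Exhaustively match two small point sets: for each permutation of
    dst, fit a 2D similarity transform (rotation + scale + translation)
    by complex least squares and keep the ordering with minimal residual."""
    a = src[:, 0] + 1j * src[:, 1]
    best_resid, best_perm = np.inf, None
    for perm in permutations(range(len(dst))):
        b = dst[list(perm), 0] + 1j * dst[list(perm), 1]
        ac, bc = a - a.mean(), b - b.mean()         # remove translation
        s = (np.conj(ac) @ bc) / (np.conj(ac) @ ac)  # complex scale+rotation
        resid = np.abs(bc - s * ac).sum()
        if resid < best_resid:
            best_resid, best_perm = resid, perm
    return best_perm
```

For five points the 120 fits run in well under a millisecond, which is consistent with the time-efficiency claim; the same brute-force approach would be impractical for large feature sets.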
Augmented Reality in Neurosurgery: A Review of Current Concepts and Emerging Applications.
Guha, Daipayan; Alotaibi, Naif M; Nguyen, Nhu; Gupta, Shaurya; McFaul, Christopher; Yang, Victor X D
2017-05-01
Augmented reality (AR) superimposes computer-generated virtual objects onto the user's view of the real world. Among medical disciplines, neurosurgery has long been at the forefront of image-guided surgery, and it continues to push the frontiers of AR technology in the operating room. In this systematic review, we explore the history of AR in neurosurgery and examine the literature on current neurosurgical applications of AR. Significant challenges to surgical AR exist, including compounded sources of registration error, impaired depth perception, visual and tactile temporal asynchrony, and operator inattentional blindness. Nevertheless, the ability to accurately display multiple three-dimensional datasets congruently over the area where they are most useful, coupled with future advances in imaging, registration, display technology, and robotic actuation, portend a promising role for AR in the neurosurgical operating room.
Using Skype to support palliative care surveillance.
Jones, Jacqueline
2014-02-01
The aim of this article is to demonstrate how a novel yet important tool can facilitate family involvement in person-centred care, despite geographical distance. The author presents a case study as an in-depth example of the use of Skype in the context of palliative care at home. Skype enhanced family surveillance and symptom management, augmented shared decision making, provided a space for virtual bedside vigil, and ultimately provided the rapport necessary for optimal end of life care.
The segmentation of the HMD market: optics for smart glasses, smart eyewear, AR and VR headsets
NASA Astrophysics Data System (ADS)
Kress, Bernard; Saeedi, Ehsan; Brac-de-la-Perriere, Vincent
2014-09-01
This paper reviews the various optical technologies that have been developed to implement HMDs (Head Mounted Displays), as AR (Augmented Reality) devices, VR (Virtual Reality) devices and, more recently, as smart glasses, smart eyewear or connected glasses. We review the typical requirements and optical performance of such devices, categorize them into distinct groups suited to different (and constantly evolving) market segments, and analyze this market segmentation.
2015-09-01
Training System ARB Aircraft Recovery Bulletins AR Augmented Reality CAG Carrier Air Group CATCC Carrier Air Traffic Control Center COTS... in integration of optical lens systems into the aircraft carrier. The current generation of optical lens systems integrated into aircraft... The use of MOVLAS on an aircraft carrier represents a direct communication link between the LSO and pilot. As a backup landing aid system to
An Augmented γ-Spray System to Visualize Biological Effects for Human Body
NASA Astrophysics Data System (ADS)
Manabe, Seiya; Tenzou, Hideki; Kasuga, Takaaki; Iwakura, Yukiko; Johnston, Robert
2017-09-01
The purpose of this study was to develop a new educational system with an easy-to-use interface to support comprehension of the biological effects of radiation on the human body within a short period of time. A paint spray-gun was used as a gamma-ray source mock-up for the system. The application screen shows the figure of a human body for radiation deposition using the γ-Sprayer, a virtual radiation source, as well as the equivalent dose and a panel for setting the irradiation conditions. While the learner stands in front of the PC monitor, the virtual radiation source is used to deposit radiation on the displayed graphic of the human body. While the learner pulls the trigger, tissue damage is calculated by interpolation, with respect to irradiation time, incident position, and distance from the screen, from data computed in advance with the PHITS simulation code. It was confirmed that the damage was well represented by the interpolation method. The augmented γ-Spray system was assessed by questionnaire: a pre/post questionnaire was administered to 41 of our students at the National Institute of Technology, Kagawa College. It was confirmed that the system is capable of teaching basic radiation protection concepts, a quantitative feeling for radiation dose, and the biological effects of radiation.
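The real-time lookup the system performs can be illustrated with plain bilinear interpolation over a precomputed table, here indexed by distance and irradiation time. This is a generic sketch: the actual PHITS table's axes, shape, and units are not given in the abstract:

```python
import numpy as np

def interp_dose(dose_grid, d_axis, t_axis, d, t):
    """Bilinear interpolation of a precomputed dose table dose_grid
    defined on grid axes d_axis (distance) and t_axis (time)."""
    # Locate the enclosing grid cell (clipped to the table's edges)
    i = int(np.clip(np.searchsorted(d_axis, d) - 1, 0, len(d_axis) - 2))
    j = int(np.clip(np.searchsorted(t_axis, t) - 1, 0, len(t_axis) - 2))
    fd = (d - d_axis[i]) / (d_axis[i + 1] - d_axis[i])
    ft = (t - t_axis[j]) / (t_axis[j + 1] - t_axis[j])
    # Weighted average of the four corner values
    return ((1 - fd) * (1 - ft) * dose_grid[i, j]
            + fd * (1 - ft) * dose_grid[i + 1, j]
            + (1 - fd) * ft * dose_grid[i, j + 1]
            + fd * ft * dose_grid[i + 1, j + 1])
```

Precomputing the table offline and interpolating at runtime is what makes the interactive spray-gun interface feasible, since a full transport simulation per trigger pull would be far too slow.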
Augmented reality 3D display based on integral imaging
NASA Astrophysics Data System (ADS)
Deng, Huan; Zhang, Han-Le; He, Min-Yang; Wang, Qiong-Hua
2017-02-01
Integral imaging (II) is a good candidate for augmented reality (AR) display, since it provides various physiological depth cues so that viewers can freely change the accommodation and convergence between the virtual three-dimensional (3D) images and the real-world scene without feeling any visual discomfort. We propose two AR 3D display systems based on the theory of II. In the first AR system, a micro II display unit reconstructs a micro 3D image, and the micro 3D image is magnified by a convex lens. The lateral and depth distortions of the magnified 3D image are analyzed and resolved by pitch scaling and depth scaling. The magnified 3D image and the real 3D scene are overlapped by using a half-mirror to realize the AR 3D display. The second AR system uses a micro-lens array holographic optical element (HOE) as an image combiner. The HOE is a volume holographic grating that functions as a micro-lens array for Bragg-matched light and as transparent glass for Bragg-mismatched light. A reference beam can reproduce a virtual 3D image from one side, and a reference beam with conjugated phase can reproduce a second 3D image from the other side of the micro-lens array HOE, providing a double-sided 3D display.
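The need for separate pitch and depth scaling follows from basic thin-lens imaging: lateral size scales with the transverse magnification M = v/u, while axial depth scales approximately with M², so an uncorrected magnified reconstruction is stretched in depth. A sketch of those standard relations (the paper's exact correction formulas are not reproduced here):

```python
def thin_lens_magnifications(u, f):
    """For object distance u and focal length f (thin lens, 1/u + 1/v = 1/f),
    return the image distance v, the lateral magnification M = v/u, and the
    longitudinal (depth) magnification, approximately M**2."""
    v = 1.0 / (1.0 / f - 1.0 / u)  # real image for u > f
    m = v / u
    return v, m, m * m

# Hypothetical example: micro 3D image 15 cm from a 10 cm focal-length lens
v, m_lat, m_depth = thin_lens_magnifications(15.0, 10.0)  # ~ (30, 2, 4)
```

In this example the image is magnified 2x laterally but 4x in depth, which is exactly the kind of anisotropic distortion the pitch scaling and depth scaling are introduced to compensate.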
Analysis of Mental Workload in Online Shopping: Are Augmented and Virtual Reality Consistent?
Zhao, Xiaojun; Shi, Changxiu; You, Xuqun; Zong, Chenming
2017-01-01
A market research company (Nielsen) reported that consumers in the Asia-Pacific region have become the most active group in online shopping. Focusing on augmented reality (AR), one of three major techniques expected to change the method of shopping in the future, this study used a mixed design to examine the influences of the method of online shopping, user gender, cognitive style, product value, and sensory channel on mental workload in virtual reality (VR) and AR situations. The results showed that males' mental workloads were significantly higher than females'. For males, the mental workload for high-value products was significantly higher than that for low-value products. In the VR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, but the difference was reduced under audio-visual conditions. In the AR situation, the visual mental workload of field-independent and field-dependent consumers showed a significant difference, and the difference increased under audio-visual conditions. This work provides a psychological study of online shopping with AR and VR technology, with applications in the future. From the perspective of embodied cognition, AR online shopping may be a potential focus of research and market application. This study also provides a reference for the future design of online shopping platforms and the updating of the user experience.
Zhu, Ming; Chai, Gang; Lin, Li; Xin, Yu; Tan, Andy; Bogari, Melia; Zhang, Yan; Li, Qingfeng
2016-12-01
Augmented reality (AR) technology can superimpose a computer-generated virtual image onto the real operating field to present an integrated image and enhance surgical safety. The purpose of our study was to develop a novel AR-based navigation system for craniofacial surgery. We focused on orbital hypertelorism correction, because the surgery requires high precision and is considered difficult even for senior craniofacial surgeons. Twelve patients with orbital hypertelorism were selected. The preoperative computed tomography data were imported into a 3-dimensional platform for preoperative design. The position and orientation of the virtual information and the real world were aligned by an image registration process. AR toolkits were used to realize the integrated image. Computed tomography was repeated after the operation to compare the preoperative plan with the actual operative outcome. Our AR-based navigation system was successfully used in these patients, directly displaying 3-dimensional navigational information on the surgical field. All patients achieved an improved appearance under the guidance of the navigation images. The differences in interdacryon distance and in the position of the dacryon point on each side between the preoperative plan and the actual surgical outcome were not significant (P > 0.05). This study reports an effective visualized approach for guiding orbital hypertelorism correction. Our AR-based navigation system may lay a foundation for craniofacial surgical navigation, and AR technology can be considered a helpful tool for precise osteotomy in craniofacial surgery.
van Oosterom, Matthias N; van der Poel, Henk G; Navab, Nassir; van de Velde, Cornelis J H; van Leeuwen, Fijs W B
2018-03-01
To provide an overview of the developments made in virtual- and augmented-reality navigation procedures for urological interventions/surgery. Navigation efforts have demonstrated potential in the field of urology by supporting guidance for various disorders. The navigation approaches differ between individual indications but seem interchangeable to a certain extent. An increasing number of pre- and intra-operative imaging modalities have been used to create detailed surgical roadmaps, namely (cone-beam) computed tomography, MRI, ultrasound, and single-photon emission computed tomography. Registration of these surgical roadmaps with the real-life surgical view has occurred in different forms (e.g. electromagnetic, mechanical, vision-based, or near-infrared optical-based), whereby the combination of approaches was suggested to provide superior outcomes. Soft-tissue deformations demand the use of confirmatory interventional (imaging) modalities. This has resulted in the introduction of new intraoperative modalities such as drop-in ultrasound, transurethral ultrasound, (drop-in) gamma probes, and fluorescence cameras. These noninvasive modalities provide an alternative to invasive technologies that expose patients to X-ray doses. Whereas some reports have indicated that navigation setups provide equal or better results than conventional approaches, most trials have been performed in relatively small patient groups, and clear follow-up data are missing. The reported computer-assisted surgery research concepts provide a glimpse into the future application of navigation technologies in the field of urology.
Borrel, Alexandre; Fourches, Denis
2017-12-01
There is growing interest in the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education. However, preparing 3D models ready for use in AR and VR is time-consuming and requires technical expertise that severely limits the development of new content of potential interest to structural biologists, medicinal chemists, molecular modellers, and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high-quality 3D models directly compatible with AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful for universally calling and anchoring a particular 3D model in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR content for bioinformatics and cheminformatics applications. http://www.realityconvert.com. dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Augmented reality-assisted bypass surgery: embracing minimal invasiveness.
Cabrilo, Ivan; Schaller, Karl; Bijlenga, Philippe
2015-04-01
The overlay of virtual images on the surgical field, defined as augmented reality, has been used for image guidance during various neurosurgical procedures. Although this technology could conceivably address certain inherent problems of extracranial-to-intracranial bypass procedures, this potential has not been explored to date. We evaluate the usefulness of an augmented reality-based setup, which could help in harvesting donor vessels through their precise localization in real time, in performing tailored craniotomies, and in identifying preoperatively selected recipient vessels for the purpose of anastomosis. Our method was applied to 3 patients with moyamoya disease who underwent superficial temporal artery-to-middle cerebral artery anastomoses and 1 patient who underwent an occipital artery-to-posteroinferior cerebellar artery bypass because of a dissecting aneurysm of the vertebral artery. Patients' heads, skulls, and extracranial and intracranial vessels were segmented preoperatively from 3-dimensional image data sets (3-dimensional digital subtraction angiography, angio-magnetic resonance imaging, angio-computed tomography) and injected intraoperatively into the operating microscope's eyepiece for image guidance. In each case, the described setup helped in precisely localizing donor and recipient vessels and in tailoring craniotomies to the injected images. The presented system based on augmented reality can optimize the workflow of extracranial-to-intracranial bypass procedures by providing essential anatomical information, fully integrated into the surgical field, and help to perform minimally invasive procedures. Copyright © 2015 Elsevier Inc. All rights reserved.
Vemuri, Anant S; Wu, Jungle Chi-Hsiang; Liu, Kai-Che; Wu, Hurng-Sheng
2012-12-01
Surgical procedures have undergone considerable advancement during the last few decades. More recently, the intraoperative availability of some imaging methods has added a new dimension to minimally invasive techniques. Augmented reality in surgery has been a topic of intense interest and research. Augmented reality involves the use of computer vision algorithms on video from endoscopic cameras, or from cameras mounted in the operating room, to provide the surgeon with additional information that he or she would otherwise have to recognize intuitively. One such technique combines a virtual preoperative model of the patient with the endoscope camera, using natural or artificial landmarks, to provide an augmented reality view in the operating room. The authors' approach is to provide this with the fewest possible changes to the operating room. A software architecture is presented that provides interactive adjustment of the registration between a three-dimensional (3D) model and the endoscope video. Augmented reality was used to perform 12 surgeries, including adrenalectomy, repair of ureteropelvic junction obstruction and retrocaval ureter, and pancreatic procedures. The general feedback from the surgeons has been very positive, not only for deciding the positions of insertion points but also for recognizing even slight changes in anatomy. The approach provides a deformable 3D model architecture and applies it in the operating room: a 3D model with a deformable structure is needed to show the shape changes of soft tissue during surgery, and the presented software architecture allows interactive adjustment of the registration between the endoscope video and each adjustable 3D model.
Percussive Augmenter of Rotary Drills for Operating as a Rotary-Hammer Drill
NASA Technical Reports Server (NTRS)
Aldrich, Jack Barron (Inventor); Bar-Cohen, Yoseph (Inventor); Sherrit, Stewart (Inventor); Badescu, Mircea (Inventor); Bao, Xiaoqi (Inventor); Scott, James Samson (Inventor)
2014-01-01
A percussive augmenter bit includes a connection shaft for mounting the bit onto a rotary drill. In the first modality, an actuator percussively drives the bit, and an electric slip ring provides power to the actuator while it is rotated by the drill. Hammering action from the actuator and rotation from the drill are applied directly to the material being drilled. In the second modality, the percussive augmenter includes an actuator that operates as a hammering mechanism, driving a free mass into the bit and creating stress pulses that fracture material in contact with the bit.
Shinbane, Jerold S; Saxon, Leslie A
Advances in imaging technology have led to a paradigm shift from planning cardiovascular procedures and surgeries with the actual patient in a "brick and mortar" hospital to utilizing the digitalized patient in the virtual hospital. A digitalized 3-D representation of individual patient anatomy and physiology, derived from cardiovascular computed tomographic angiography (CCTA) and cardiovascular magnetic resonance (CMR), serves as an avatar allowing virtual delineation of the most optimal approaches to cardiovascular procedures and surgeries prior to actual hospitalization. Pre-hospitalization reconstruction and analysis of anatomy and pathophysiology, previously accessible only during the actual procedure, could potentially limit the intrinsic risks related to time in the operating room, the cardiac procedural laboratory, and the overall hospital environment. Although applications are specific to areas of cardiovascular specialty focus, there are unifying themes in how the technologies are utilized. The virtual patient avatar can also be used for procedural planning, computational modeling of anatomy, simulation of the predicted therapeutic result, printing of 3-D models, and augmentation of real-time procedural performance. Examples of the above techniques are at various stages of development for application to the spectrum of cardiovascular disease processes, including percutaneous, surgical, and hybrid minimally invasive interventions. A multidisciplinary approach within medicine and engineering is necessary for the creation of robust algorithms for maximal utilization of the virtual patient avatar in the digital medical center. Utilization of the virtual advanced cardiac imaging patient avatar will play an important role in the virtual health care system.
Although there has been a rapid proliferation of early data, advanced imaging applications require further assessment and validation of accuracy, reproducibility, standardization, safety, efficacy, quality, cost effectiveness, and overall value to medical care. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
Open multi-agent control architecture to support virtual-reality-based man-machine interfaces
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel
2001-10-01
Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task-deduction component, and automatic action-planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems controlled by Virtual Reality-based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations that facilitate the user's job of commanding and supervising a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate, in real time, information from sensors at different levels of abstraction helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture is described comprehensively, its main building blocks are discussed, and one realization built on an open-source real-time operating system is presented. The software design and the features that make the architecture generally applicable to the distributed control of automation agents in real-world applications are explained. Its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is described as one example.
NASA Astrophysics Data System (ADS)
Palestini, C.; Basso, A.
2017-11-01
In recent years, increased international investment in hardware and software to support programs that adopt algorithms for photomodeling or for managing laser scanner data has significantly reduced the cost of operations supporting Augmented Reality and Virtual Reality, which are designed to generate real-time explorable digital environments integrated with virtual stereoscopic headsets. The research analyzes transversal methodologies related to the acquisition of these technologies in order to intervene directly on the phenomenon of adopting current VR tools within a specific workflow, in light of issues related to the intensive use of such devices, and outlines a quick overview of the possible "virtual migration" phenomenon, assuming a possible integration with new high-speed internet systems capable of triggering a massive cyberspace colonization process that would paradoxically also affect everyday life and, more generally, human spatial perception. The contribution aims to analyze the application systems used for low-cost 3D photogrammetry by means of a precise pipeline, clarifying how a 3D model is generated, automatically retopologized, textured by color painting or photo-cloning techniques, and optimized for parametric insertion into virtual exploration platforms. The workflow analysis follows case studies related to photomodeling, digital retopology, and "virtual 3D transfer" of some small archaeological artifacts and an architectural compartment corresponding to the pronaos of the Aurum, a building designed in the 1940s by Michelucci. All operations are conducted on cheap or free-licensed software that today offers almost the same performance as paid counterparts, progressively improving in data processing speed and management.
Virtual Globes, where we were, are and will be
NASA Astrophysics Data System (ADS)
Dehn, J.; Webley, P. W.; Worden, A. K.
2016-12-01
Ten years ago, Google Earth was new, and the first "Virtual Globes" session was held at AGU. Only a few of us realized the potential of this technology at the time, but the idea quickly caught on. At that time a virtual globe came in two flavors: a complex GIS system that was utterly impenetrable to the public, or a more accessible version with limited functionality and layers, available on a desktop computer with a good internet connection. Google Earth's use of the Keyhole Markup Language (KML) opened the door for scientists and the public to share data and visualizations across disciplines and revolutionized how everyone uses geographic data. In the following 10 years, KML became more advanced, virtual globes moved to mobile and handheld platforms, and the Google Earth Engine allowed more complex data sharing among scientists. Virtual globe images went from a rare commodity to being everywhere in our lives, from weather forecasts to our cars and smartphones, and they shape how we receive and process data. This is a fantastic tool for education, and with newer technologies it can reach the remote corners of the world and developing countries. New and emerging technologies allow augmented reality to be merged with the globes, and real-time data to be integrated from sensors built into mobile devices or add-ons. This presentation will follow the history of virtual globes in the geosciences, show how robust technologies can be used in the field and classroom today, and make some suggestions for the future.
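KML, which the abstract credits with opening virtual-globe data sharing to scientists and the public, is a simple XML dialect. For illustration, a minimal KML file placing a single marker (the name and coordinates here are arbitrary examples) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Sample station</name>
    <description>An arbitrary example point.</description>
    <!-- KML coordinates are longitude,latitude[,altitude in meters] -->
    <Point><coordinates>-147.72,64.86,0</coordinates></Point>
  </Placemark>
</kml>
```

Saved with a `.kml` extension, a file like this loads directly into Google Earth and most other virtual-globe clients.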
Virtual Surgery for Conduit Reconstruction of the Right Ventricular Outflow Tract.
Ong, Chin Siang; Loke, Yue-Hin; Opfermann, Justin; Olivieri, Laura; Vricella, Luca; Krieger, Axel; Hibino, Narutoshi
2017-05-01
Virtual surgery involves the planning and simulation of surgical reconstruction using three-dimensional (3D) modeling based upon individual patient data, augmented by simulation of planned surgical alterations including implantation of devices or grafts. Here we describe a case in which virtual cardiac surgery aided us in determining the optimal conduit size to use for the reconstruction of the right ventricular outflow tract. The patient is a young adolescent male with a history of tetralogy of Fallot with pulmonary atresia, requiring right ventricle-to-pulmonary artery (RV-PA) conduit replacement. Utilizing preoperative magnetic resonance imaging data, virtual surgery was undertaken to construct his heart in 3D and to simulate the implantation of three different sizes of RV-PA conduit (18, 20, and 22 mm). Virtual cardiac surgery allowed us to predict the ability to implant a conduit of a size that would likely remain adequate in the face of continued somatic growth and also allow for the possibility of transcatheter pulmonary valve implantation at some time in the future. Subsequently, the patient underwent uneventful conduit change surgery with implantation of a 22-mm Hancock valved conduit. As predicted, the intrathoracic space was sufficient to accommodate the relatively large conduit size without geometric distortion or sternal compression. Virtual cardiac surgery gives surgeons the ability to simulate the implantation of prostheses of different sizes in relation to the dimensions of a specific patient's own heart and thoracic cavity in 3D prior to surgery. This can be very helpful in predicting optimal conduit size, determining appropriate timing of surgery, and patient education.
Fusion interfaces for tactical environments: An application of virtual reality technology
NASA Technical Reports Server (NTRS)
Haas, Michael W.
1994-01-01
The term Fusion Interface is defined as a class of interface that integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory, virtually-augmented synthetic environment. A new facility dedicated to exploratory development of fusion interface concepts has been developed within the Human Engineering Division of the Armstrong Laboratory. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, and localized auditory presentations, drive haptic displays on the stick and rudder pedals, and execute weapons models, aerodynamic models, and threat models.
Moving forward in treatment of posttraumatic stress disorder: innovations to exposure-based therapy.
Nijdam, Mirjam J; Vermetten, Eric
2018-01-01
The field of treatment of posttraumatic stress disorder (PTSD) has been a pacesetter for the changing face of psychotherapy, as illustrated by the introduction of Virtual Reality Exposure Therapy. This paper outlines a novel approach that builds on cognitive-motor interaction in a virtual interactive environment. It is based on the theory of memory reconsolidation and the embodiment of cognition. The framework we envision allows the patient to 'step into the past' by using forward motion as an essential ingredient to augment the impact of exposure to traumatic events. The behavioural response of approaching, the exact opposite of the avoidance usually shown by patients, and the enhancement of divergent thinking are the most prominent hypothesized mechanisms of action. This can contribute to the strengthening of personal efficacy and self-reflection generated by high emotional engagement, as well as a sense of accomplishment and enhanced recovery, as illustrated by a clinical case example. We argue that innovations with personalized virtual reality and motion need to be further investigated and implemented in current therapy settings. PMID:29805777
Computer-Based Technologies in Dentistry: Types and Applications
Albuha Al-Mussawi, Raja’a M.; Farid, Farzaneh
2016-01-01
During dental education, dental students learn how to examine patients, make diagnoses, plan treatment, and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies, including virtual reality (VR) simulators, augmented reality (AR), and computer-aided design/computer-aided manufacturing (CAD/CAM) systems, has resulted in new modalities for instruction and practice in dentistry. Virtual reality dental simulators enable repeated, objective, and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to visualize the surgical site and simultaneously superimpose informative 3D images of invisible regions on it to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their application in dentistry, and their potentials and limitations in promoting dental education, training, and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. PMID:28392819
A Visual Servoing-Based Method for ProCam Systems Calibration
Berry, Francois; Aider, Omar Ait; Mosnier, Jeremie
2013-01-01
Projector-camera systems are currently used in a wide field of applications, such as 3D reconstruction and augmented reality, and can provide accurate measurements, depending on the configuration and calibration. Frequently, the calibration task is divided into two steps: camera calibration followed by projector calibration. The latter still poses certain problems that are not easy to solve, such as the difficulty in obtaining a set of 2D–3D points to compute the projection matrix between the projector and the world. Existing methods are either not sufficiently accurate or not flexible. We propose an easy and automatic method to calibrate such systems that consists in projecting a calibration pattern and superimposing it automatically on a known printed pattern. The projected pattern is provided by a virtual camera observing a virtual pattern in an OpenGL model. The projector displays what the virtual camera visualizes. Thus, the projected pattern can be controlled and superimposed on the printed one with the aid of visual servoing. Our experimental results compare favorably with those of other methods considering both usability and accuracy. PMID:24084121
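The abstract above notes the difficulty of obtaining 2D-3D correspondences to compute the projection matrix between projector and world. Once such correspondences exist, the matrix is classically recovered with the Direct Linear Transform (DLT); the sketch below is a generic DLT estimator for context, not the authors' visual-servoing method, and its names are illustrative:

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Estimate a 3x4 projection matrix P (x ~ P X, up to scale) from
    n >= 6 world points X (n,3) and their 2D projections x (n,2) via
    the Direct Linear Transform. Plain DLT without Hartley
    normalization, which suffices for well-scaled synthetic data."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        # Each correspondence contributes two linear constraints on P.
        A.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -u*Xw, -u*Yw, -u*Zw, -u])
        A.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -v*Xw, -v*Yw, -v*Zw, -v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```

The recovered matrix is defined only up to scale, so it is validated by reprojecting the world points and comparing with the measured image points.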
Oral and maxillofacial surgery with computer-assisted navigation system.
Kawachi, Homare; Kawachi, Yasuyuki; Ikeda, Chihaya; Takagi, Ryo; Katakura, Akira; Shibahara, Takahiko
2010-01-01
Intraoperative computer-assisted navigation has gained acceptance in maxillofacial surgery, with applications in an increasing number of indications. We adapted a commercially available wireless passive-marker system that allows calibration and tracking of virtually every instrument in maxillofacial surgery. Virtual computer-generated anatomical structures are displayed intraoperatively in a semi-immersive head-up display. Continuous observation of the operating field, facilitated by computer assistance, enables surgical navigation in accordance with the surgeon's preoperative plans. This case report documents the potential of augmented visualization concepts in the surgical resection of tumors in the oral and maxillofacial region. We report a case of T3N2bM0 carcinoma of the maxillary gingiva that was surgically resected with the assistance of the Stryker Navigation Cart System. The system was found useful in assisting preoperative planning and intraoperative monitoring.
De Luca, Rosaria; Torrisi, Michele; Piccolo, Adriana; Bonfiglio, Giovanni; Tomasello, Provvidenza; Naro, Antonino; Calabrò, Rocco Salvatore
2017-10-11
Cognitive impairment, as well as mood and anxiety disorders, occurs frequently in patients following stroke. The aim of this study was to evaluate the effects of a combined rehabilitative treatment using conventional relaxation and respiratory techniques in a specific rehabilitative virtual environment (using Bts-Nirvana). A 58-year-old woman affected by hemorrhagic stroke underwent two different rehabilitation trainings, involving either standard relaxation techniques alone in a common clinical setting or the same psychological approach in a semi-immersive virtual environment with augmented sensory (audio-video) and motor feedback (sensorimotor interaction). We evaluated the patient's cognitive and psychological profile before and after the two trainings, using a specific psychometric battery aimed at assessing cognitive status and attention processes and at estimating the presence of mood alterations, anxiety, and coping strategies. Only at the end of the combined approach did we observe a significant improvement in attention and memory functions, with nearly complete relief of anxiety symptoms and an improvement in coping strategies. Relaxation and respiratory techniques in a semi-immersive virtual reality environment, using Bts-Nirvana, may be a promising tool for improving attention processes, coping strategies, and anxiety in individuals with neurological disorders, including stroke.
Virtual Reality As a Training Tool to Treat Physical Inactivity in Children.
Kiefer, Adam W; Pincus, David; Richardson, Michael J; Myer, Gregory D
2017-01-01
Lack of adequate physical activity in children is an epidemic that can result in obesity and other poor health outcomes across the lifespan. Physical activity interventions focused on motor skill competence continue to be developed, but some interventions, such as neuromuscular training (NMT), may be limited in how early they can be implemented due to dependence on the child's level of cognitive and perceptual-motor development. Early implementation of motor-rich activities that support motor skill development in children is critical for the development of healthy levels of physical activity that carry through into adulthood. Virtual reality (VR) training may be beneficial in this regard. VR training, when grounded in an information-based theory of perceptual-motor behavior that modifies the visual information in the virtual world, can promote early development of motor skills in youth akin to more natural, real-world development, as opposed to strictly formalized training. This approach can be tailored to the individual child, and training scenarios can increase in complexity as the child develops. Ultimately, training in VR may help serve as a precursor to "real-world" NMT and, once the child reaches the appropriate training age, can also augment more complex NMT regimens performed outside of the virtual environment. PMID:29376045
Z-depth integration: a new technique for manipulating z-depth properties in composited scenes
NASA Astrophysics Data System (ADS)
Steckel, Kayla; Whittinghill, David
2014-02-01
This paper presents a new technique in the production pipeline of asset creation for virtual environments called Z-Depth Integration (ZeDI). ZeDI is intended to reduce the time required to place elements at the appropriate z-depth within a scene. Though ZeDI is intended for use primarily in two-dimensional scene composition, depth-dependent "flat" animated objects are often critical elements of augmented and virtual reality (AR/VR) applications. ZeDI is derived from "deep image compositing", a capability implemented within the OpenEXR file format. In order to trick the human eye into perceiving overlapping scene elements as being in front of or behind one another, the developer must manually manipulate which pixels of an element are visible in relation to other objects embedded within the environment's image sequence. ZeDI improves on this process by providing a means for interacting with procedurally extracted z-depth data from a virtual environment scene. By streamlining the process of defining objects' depth characteristics, it is expected that the time and energy required for developers to create compelling AR/VR scenes will be reduced. In the proof of concept presented in this manuscript, ZeDI is implemented for pre-rendered virtual scene construction via an Adobe After Effects plug-in.
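The per-pixel visibility decision that ZeDI automates can be illustrated with a minimal sketch (not the authors' implementation): given two layers with per-pixel z-depth maps, the nearer pixel wins, which is the core rule deep image compositing generalizes.

```python
import numpy as np

def composite_by_zdepth(color_a, z_a, color_b, z_b):
    """Composite two RGB layers by comparing per-pixel z-depth.

    At each pixel the layer with the smaller z value (closer to the
    camera) is visible, mimicking the occlusion decision otherwise
    made by hand. color_* are (H, W, 3) arrays; z_* are (H, W).
    """
    nearer = (z_a <= z_b)[..., np.newaxis]  # True where layer A is in front
    return np.where(nearer, color_a, color_b)

# Two 2x2 test layers: A is solid red, B is solid blue.
color_a = np.tile(np.array([1.0, 0.0, 0.0]), (2, 2, 1))
color_b = np.tile(np.array([0.0, 0.0, 1.0]), (2, 2, 1))
z_a = np.array([[1.0, 5.0], [1.0, 5.0]])  # A is nearer in the left column
z_b = np.full((2, 2), 3.0)

out = composite_by_zdepth(color_a, z_a, color_b, z_b)
# Left column shows layer A (red); right column shows layer B (blue).
```

A full deep-compositing pipeline stores multiple depth samples per pixel; this binary front/back test is only the simplest case of that idea.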
The Virtual Dollhouse: Body Image and Weight Stigma in Second Life
NASA Astrophysics Data System (ADS)
Linares, R.; Bailenson, J.; Bailey, J.; Stevenson Won, A.
2012-12-01
Second Life is a virtual world where fantasy and reality collide as users customize their digital representation, or avatar. The act of wanting to ignore or avoid the real world's physical limitations can be called "avatar escapism" (Ducheneaut, Wen, Yee, Wadley, 2009). In the media, the increasingly thin standard of beauty (Berel, Irving, 1998) has augmented negative stereotypes of overweight people to the point of making it acceptable to ridicule others' body image (Wang, Brownell, Wadden, 2004). In the real world, these concepts hurt people who are unable or unwilling to achieve an "acceptable" body size, often leading them to be ridiculed. In the virtual world, a person may portray their desired body, potentially escaping judgment from others. Can this more liberated form of bodily expression lead people to expect and need that perfection to the point where they abandon the real world in order to live in that perfection? With this knowledge, we looked at the implications of the real-world idolization of the perfect body and how this is transferred into the virtual space. In addition, we investigated how the reactions and behaviors that people have when others rebel against the "Barbie doll" appearance (Ducheneaut, Wen, Yee, Wadley, 2009) affect us in the real world.
NASA Astrophysics Data System (ADS)
Bruder, Friedrich-Karl; Fäcke, Thomas; Grote, Fabian; Hagen, Rainer; Hönel, Dennis; Koch, Eberhard; Rewitz, Christian; Walze, Günther; Wewer, Brita
2017-03-01
Volume Holographic Optical Elements (vHOEs) have gained wide attention as optical combiners for use in augmented and virtual reality (AR and VR, respectively) consumer electronics and automotive head-up display applications. The unique characteristics of these diffractive grating structures, being lightweight, thin, and flat, make them perfectly suitable for use in integrated optical components like spectacle lenses and car windshields. While transparent in the off-Bragg condition, they provide full color capability and adjustable diffraction efficiency. The instant-developing photopolymer Bayfol® HX film provides an ideal technology platform for optimizing the performance of vHOEs in a wide range of applications. Important for any commercialization are simple and robust mass production schemes. In this paper, we present an efficient and easy-to-control one-beam recording scheme to copy a so-called master vHOE in a step-and-repeat process. In this contact-copy scheme, Bayfol® HX film is laminated to a master stack before being exposed by a scanning laser line. Subsequently, the film is delaminated in a controlled fashion and bleached. We explain the working principles of the one-beam copy concept and discuss the mechanical construction of the installed vHOE replication line. Moreover, we treat aspects like master design, effects of vibration, and suppression of noise gratings. Furthermore, digital vHOEs are introduced as master holograms. They enable new ways of optical design and paths to large-scale vHOEs.
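The on/off-Bragg selectivity that makes vHOEs transparent away from their design angle follows from the Bragg condition for a volume grating. A small illustrative calculation (generic textbook relation, not Bayfol® HX data; the wavelength, fringe spacing, and refractive index below are assumed values):

```python
import math

def bragg_angle_deg(wavelength_nm, fringe_spacing_nm, n_medium=1.5):
    """Bragg angle (degrees, measured inside the medium) for an
    unslanted volume grating: 2 * n * Lambda * sin(theta_B) = lambda.
    """
    s = wavelength_nm / (2.0 * n_medium * fringe_spacing_nm)
    if not 0.0 < s <= 1.0:
        raise ValueError("No real Bragg solution for these parameters")
    return math.degrees(math.asin(s))

# Illustrative numbers: a 532 nm recording wavelength and a 180 nm
# fringe spacing in a photopolymer with refractive index ~1.5.
theta_b = bragg_angle_deg(532.0, 180.0)
```

Light arriving away from `theta_b` (or at other wavelengths) falls off-Bragg and passes through largely undiffracted, which is why the element can double as a transparent combiner.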
Lin, Chien-Yu; Chang, Yu-Ming
2015-02-01
This study uses a body motion interactive game developed in Scratch 2.0 to enhance the body strength of children with disabilities. Scratch 2.0, using an augmented-reality function on a program platform, creates real-world and virtual reality displays at the same time. This study uses a webcam integration that tracks movements and allows participants to interact physically with the project, to enhance the motivation of children with developmental disabilities to perform physical activities. This study follows a single-case research design using an ABAB structure, in which A is the baseline and B is the intervention. The experimental period was 2 months. The experimental results demonstrated that the scores for 3 children with developmental disabilities increased considerably during the intervention phases. The developmental applications of these results are also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
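The ABAB structure alternates baseline (A) and intervention (B) phases so that each child serves as their own control. A minimal sketch of the phase-level comparison such a design supports (the session scores below are hypothetical, for illustration only):

```python
from statistics import mean

def phase_means(scores, phases):
    """Mean score per phase label for an ABAB single-case design.

    `phases` tags each session as baseline ('A') or intervention ('B').
    Returns a dict mapping each phase label to its mean score.
    """
    return {
        label: mean(s for s, p in zip(scores, phases) if p == label)
        for label in sorted(set(phases))
    }

# Hypothetical activity scores across an A-B-A-B session sequence.
scores = [3, 4, 3, 8, 9, 9, 4, 3, 9, 10]
phases = ['A', 'A', 'A', 'B', 'B', 'B', 'A', 'A', 'B', 'B']
means = phase_means(scores, phases)
# Intervention (B) sessions score well above baseline (A) sessions,
# and the return to baseline in the second A phase strengthens the
# case that the intervention drives the change.
```

Published single-case analyses typically supplement such means with visual inspection of level, trend, and overlap across phases.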
Jedi training: playful evaluation of head-mounted augmented reality display systems
NASA Astrophysics Data System (ADS)
Ozbek, Christopher S.; Giesler, Bjorn; Dillmann, Ruediger
2004-05-01
A fundamental decision in building augmented reality (AR) systems is how to accomplish the combining of the real and virtual worlds. Nowadays this key question boils down to two alternatives: video see-through (VST) vs. optical see-through (OST). Both systems have advantages and disadvantages in areas like production simplicity, resolution, flexibility in composition strategies, field of view, etc. To provide additional decision criteria for high-dexterity, high-accuracy tasks and subjective user acceptance, a gaming environment inspired by the Star Wars movies was programmed that allowed good evaluation of hand-eye coordination. During an experimentation session with more than thirty participants, a preference was found for optical see-through glasses in conjunction with infrared tracking. In particular, the high computational demand of video capture and processing, and the resulting drop in frame rate, emerged as a key weakness of the VST system.
Simulation and augmented reality in endovascular neurosurgery: lessons from aviation.
Mitha, Alim P; Almekhlafi, Mohammed A; Janjua, Major Jameel J; Albuquerque, Felipe C; McDougall, Cameron G
2013-01-01
Endovascular neurosurgery is a discipline strongly dependent on imaging. Therefore, technology that improves how much useful information we can garner from a single image has the potential to dramatically assist decision making during endovascular procedures. Furthermore, education in an image-enhanced environment, especially with the incorporation of simulation, can improve the safety of the procedures and give interventionalists and trainees the opportunity to study or perform simulated procedures before the intervention, much like what is practiced in the field of aviation. Here, we examine the use of simulators in the training of fighter pilots and discuss how similar benefits can compensate for current deficiencies in endovascular training. We describe the types of simulation used for endovascular procedures, including virtual reality, and discuss the relevant data on its utility in training. Finally, the benefit of augmented reality during endovascular procedures is discussed, along with future computerized image enhancement techniques.
Towards multi-platform software architecture for Collaborative Teleoperation
NASA Astrophysics Data System (ADS)
Domingues, Christophe; Otmane, Samir; Davesne, Frederic; Mallem, Malik
2009-03-01
Augmented Reality (AR) can provide a Human Operator (HO) with real help in achieving complex tasks, such as remote control of robots and cooperative teleassistance. Using appropriate augmentations, the HO can interact faster, more safely, and more easily with the remote real world. In this paper, we present an extension of an existing distributed software and network architecture for collaborative teleoperation based on networked human-scaled mixed reality and a mobile platform. The first teleoperation system was composed of a VR application and a Web application. However, the two systems could not be used together, making it impossible to control a distant robot simultaneously. Our goal is to update the teleoperation system to enable heterogeneous collaborative teleoperation between the two platforms. An important feature of this interface is the use of different virtual reality platforms and different mobile platforms to control one or many robots.