Sample records for realistic haptic rendering

  1. Collision detection and modeling of rigid and deformable objects in laparoscopic simulator

    NASA Astrophysics Data System (ADS)

    Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru

    2015-03-01

    Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated into virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz, while realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at a minimum of 30 fps. Our current laparoscopic simulator detects collisions between a point on the tool tip and the organ surfaces; haptic devices are attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is deformed using a mass-spring model or finite element method-based models. In this paper, we investigated multi-point collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation response. We discuss our proposal for an efficient method to compute multiple simultaneous collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
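    The two update rates mentioned in this abstract (about 1 kHz for force feedback versus roughly 30 fps for collision detection and deformation) are typically decoupled. A minimal single-thread sketch of such a multi-rate loop, with hypothetical update callbacks standing in for the simulator's actual routines, might look like:

```python
# Illustrative multi-rate loop: haptics at 1 kHz, visuals at 30 Hz.
# update_haptics / update_visuals are hypothetical stand-ins, not the
# simulator described in the abstract.

HAPTIC_DT = 1.0 / 1000.0   # 1 kHz force update
VISUAL_DT = 1.0 / 30.0     # ~30 fps collision/deformation update

def run(duration, update_haptics, update_visuals):
    t = 0.0
    next_visual = 0.0
    haptic_steps = visual_steps = 0
    while t < duration:
        update_haptics(t)          # fast loop: force rendering
        haptic_steps += 1
        if t >= next_visual:       # slow loop: collision + deformation
            update_visuals(t)
            visual_steps += 1
            next_visual += VISUAL_DT
        t += HAPTIC_DT
    return haptic_steps, visual_steps
```

    For a one-second run this fires 1,000 haptic updates and about 30 visual updates.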

  2. Importance of Matching Physical Friction, Hardness, and Texture in Creating Realistic Haptic Virtual Surfaces.

    PubMed

    Culbertson, Heather; Kuchenbecker, Katherine J

    2017-01-01

    Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.

  3. Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device

    NASA Astrophysics Data System (ADS)

    Färber, Matthias; Heller, Julika; Handels, Heinz

    2007-03-01

    The lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medication or to extract cerebrospinal fluid. Training of this procedure is usually done on patients under the guidance of experienced supervisors. A virtual reality lumbar puncture simulator has been developed in order to minimize training costs and patient risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures such as skin, bone, muscles, or fat, and of original CT data that contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region. 2D visualizations of orthogonal slices enable a detailed impression of the anatomical context. The input data, consisting of CT and label data and surface models of relevant structures, is defined in an XML file together with haptic rendering and visualization parameters. In a first evaluation, the Visible Human male dataset was used to generate a virtual training body. Several users with different levels of medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement together with rotation constraints enabled by the 6DOF device facilitates a realistic puncture simulation.
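    As a rough caricature of the label-based force computation described above (not the authors' algorithm), one can imagine each segmented tissue label mapping to a resistance coefficient that opposes needle motion; the labels and coefficients below are invented for illustration:

```python
# Toy label-based needle resistance: each tissue label maps to a
# resistance coefficient; the force opposes the insertion velocity.
# Labels and coefficients are invented for illustration.

RESISTANCE = {"air": 0.0, "skin": 40.0, "fat": 10.0,
              "muscle": 25.0, "bone": 500.0}

def needle_force(label, velocity):
    """Opposing force (N) for insertion velocity (m/s) in a tissue."""
    return -RESISTANCE.get(label, 0.0) * velocity
```

    A real haptic volume rendering scheme would additionally blend in forces derived from the unsegmented CT intensities, as the abstract describes.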

  4. Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.

    PubMed

    Kim, K; Lee, S

    2015-05-01

    Diagnosis of skin conditions depends on the assessment of skin surface properties that are represented more by tactile properties, such as stiffness, roughness, and friction, than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch. Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. The conversion from a single 2D image to a 3D surface was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties found in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) only; no visual cue was provided. The results indicate that our system renders discernibly different tactile surfaces for different skin images. Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for the purposes of skin diagnosis, simulation, or training. Our system can also be used for other applications such as virtual reality and cosmetic applications. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Haptic interface of the KAIST-Ewha colonoscopy simulator II.

    PubMed

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2008-11-01

    This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology-Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be bent freely while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force profiles exerted by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tact sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection and suction, as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the results show that the developed haptic interface provides realistic haptic feedback for colonoscopy simulation.

  6. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback.

    PubMed

    Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T

    2007-07-01

    Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. The convergence of political, economic, and social forces has limited neurosurgical residents' operative exposure. There is a need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient-safety constraints. Computer-based virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic three-dimensional stereoscopic visualization and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof of concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders a changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.

  7. A kinesthetic washout filter for force-feedback rendering.

    PubMed

    Danieau, Fabien; Lecuyer, Anatole; Guillotel, Philippe; Fleureau, Julien; Mollet, Nicolas; Christie, Marc

    2015-01-01

    Today, haptic feedback can be designed and associated with audiovisual content (haptic-audiovisuals, or HAV). Although there are multiple means of creating individual haptic effects, the issue of how to properly adapt such effects to force-feedback devices has not been addressed and remains mostly a manual endeavor. We propose a new approach to the haptic rendering of HAV, based on a washout filter for force-feedback devices. A body model and an inverse kinematics algorithm simulate the user's kinesthetic perception. The haptic rendering is then adapted to handle transitions between haptic effects and to optimize the amplitude of effects with respect to the device's capabilities. Results of a user study show that this new haptic rendering can successfully improve the HAV experience.
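    For readers unfamiliar with the term, a washout filter is essentially a high-pass filter: transient forces pass through, while sustained offsets decay so the device drifts back toward neutral. A minimal discrete first-order sketch (not the paper's body-model-based design) is:

```python
# First-order high-pass "washout": transients pass through, while a
# constant force decays toward zero so the device returns to neutral.

def washout(forces, dt, tau):
    """Apply y[n] = a*(y[n-1] + x[n] - x[n-1]), with a = tau/(tau+dt)."""
    a = tau / (tau + dt)
    out, y, x_prev = [], 0.0, 0.0
    for x in forces:
        y = a * (y + x - x_prev)
        x_prev = x
        out.append(y)
    return out
```

    For a step input, the first output sample passes through almost fully and the response then washes out exponentially with time constant tau.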

  8. Algorithms for Haptic Rendering of 3D Objects

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao; Srinivasan, Mandayam

    2003-01-01

    Algorithms have been developed to provide haptic rendering of three-dimensional (3D) objects in virtual (that is, computationally simulated) environments. The goal of haptic rendering is to generate tactual displays of the shapes, hardnesses, surface textures, and frictional properties of 3D objects in real time. Haptic rendering is a major element of the emerging field of computer haptics, which invites comparison with computer graphics. We have already seen various applications of computer haptics in the areas of medicine (surgical simulation, telemedicine, haptic user interfaces for blind people, and rehabilitation of patients with neurological disorders), entertainment (3D painting, character animation, morphing, and sculpting), mechanical design (path planning and assembly sequencing), and scientific visualization (geophysical data analysis and molecular manipulation).

  9. Characteristic analysis and simulation for polysilicon comb micro-accelerometer

    NASA Astrophysics Data System (ADS)

    Liu, Fengli; Hao, Yongping

    2008-10-01

    A high force update rate is a key factor in achieving high-performance haptic rendering, which imposes a stringent real-time requirement on the execution environment of the haptic system. This requirement confines the haptic system to simplified environments in order to reduce the computational cost of haptic rendering algorithms. In this paper, we present a novel "hyper-threading" architecture consisting of several threads for haptic rendering. A high force update rate is achieved with a relatively large computation time interval for each haptic loop. The proposed method was tested and proved effective in experiments on a virtual-wall prototype haptic system using the Delta Haptic Device.

  10. Neurosurgery simulation using non-linear finite element modeling and haptic interaction

    NASA Astrophysics Data System (ADS)

    Lee, Huai-Ping; Audette, Michel; Joldes, Grand R.; Enquobahrie, Andinet

    2012-02-01

    Real-time surgical simulation is becoming an important component of surgical training. To meet the real-time requirement, however, the accuracy of the biomechanical modeling of soft tissue is often compromised due to computing resource constraints. Furthermore, haptic integration presents an additional challenge with its requirement for a high update rate. As a result, most real-time surgical simulation systems employ a linear elasticity model, simplified numerical methods such as the boundary element method or spring-particle systems, and coarse volumetric meshes, and are therefore not clinically realistic. We present here ongoing work aimed at developing an efficient and physically realistic neurosurgery simulator using a non-linear finite element method (FEM) with haptic interaction. Real-time finite element analysis is achieved by utilizing the total Lagrangian explicit dynamics (TLED) formulation and GPU acceleration of per-node and per-element operations. We employ a virtual coupling method to separate deformable-body simulation and collision detection from haptic rendering, which must be updated at a much higher rate than the visual simulation. The system provides accurate biomechanical modeling of soft tissue while retaining real-time performance with haptic interaction. However, our experiments showed that the stability of the simulator depends heavily on the material properties of the tissue and the speed of colliding objects. Hence, additional efforts, including dynamic relaxation, are required to improve the stability of the system.

  11. Real-time mandibular angle reduction surgical simulation with haptic rendering.

    PubMed

    Wang, Qiong; Chen, Hui; Wu, Wen; Jin, Hai-Yang; Heng, Pheng-Ann

    2012-11-01

    Mandibular angle reduction is a popular and efficient procedure widely used to alter the facial contour. The primary surgical instruments employed in the surgery, the reciprocating saw and the round burr, share a common feature: they operate at high speed. Inexperienced surgeons generally need long practice to learn how to minimize the risks caused by uncontrolled contact and cutting motions when manipulating instruments with high-speed reciprocation or rotation. In this paper, a virtual reality-based surgical simulator for mandibular angle reduction was designed and implemented on a CUDA-based platform. High-fidelity visual and haptic feedback is provided to enhance perception in a realistic virtual surgical environment. Impulse-based haptic models were employed to simulate the contact forces and torques on the instruments, providing convincing haptic sensation as surgeons control the instruments at different reciprocation or rotation velocities. Real-time methods for bone removal and reconstruction during surgical procedures are proposed to support realistic visual feedback. The simulated contact forces were verified by comparison against actual force data measured on a purpose-built mechanical platform. An empirical study based on patient-specific data was conducted to evaluate the ability of the proposed system to train surgeons with varying levels of experience. The results confirm the validity of our simulator.

  12. Haptic interfaces: Hardware, software and human performance

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandayam A.

    1995-01-01

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.

  13. Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models.

    PubMed

    Fortmeier, Dirk; Wilms, Matthias; Mastmeyer, Andre; Handels, Heinz

    2015-01-01

    This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed based on patient-specific 4D CT image data sequences. Virtual patient models are visualized in real-time by ray casting based rendering of a reference CT image warped by a time-variant displacement field, which is computed using the motion models at run-time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements computed at high rendering rates to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and the haptic simulation of insertion of a virtual bendable needle. To this aim, different motion models that are applicable in real-time are presented and the methods are integrated into a needle puncture training simulation framework, which can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. It is shown that the presented methods achieve mean update rates around 2,000 Hz for haptic simulation and interactive frame rates for volume rendering and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
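    The central trick described above, using the motion model's displacement to pull the haptic interaction point back into the static reference image, can be caricatured in one dimension. The sinusoidal breathing model below is invented for illustration and is not the patient-specific model from the paper:

```python
import math

# Toy 1-D respiratory model: tissue at reference coordinate x_ref
# appears at x_ref + d(t) in world space. To sample the static
# reference CT at the haptic device position, invert the warp.

def displacement(t, amplitude=0.01, period=4.0):
    """Invented sinusoidal breathing displacement in metres."""
    return amplitude * math.sin(2.0 * math.pi * t / period)

def device_to_reference(x_device, t):
    """Approximate inverse warp: subtract the current displacement."""
    return x_device - displacement(t)
```

    The real system evaluates a dense, time-variant 3D displacement field at high rates rather than a closed-form 1D function, but the coordinate bookkeeping is the same.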

  14. What you can't feel won't hurt you: Evaluating haptic hardware using a haptic contrast sensitivity function.

    PubMed

    Salisbury, C M; Gillespie, R B; Tan, H Z; Barbagli, F; Salisbury, J K

    2011-01-01

    In this paper, we extend the concept of the contrast sensitivity function - used to evaluate video projectors - to the evaluation of haptic devices. We propose using human observers to determine if vibrations rendered using a given haptic device are accompanied by artifacts detectable to humans. This determination produces a performance measure that carries particular relevance to applications involving texture rendering. For cases in which a device produces detectable artifacts, we have developed a protocol that localizes deficiencies in device design and/or hardware implementation. In this paper, we present results from human vibration detection experiments carried out using three commercial haptic devices and one high performance voice coil motor. We found that all three commercial devices produced perceptible artifacts when rendering vibrations near human detection thresholds. Our protocol allowed us to pinpoint the deficiencies, however, and we were able to show that minor modifications to the haptic hardware were sufficient to make these devices well suited for rendering vibrations, and by extension, the vibratory components of textures. We generalize our findings to provide quantitative design guidelines that ensure the ability of haptic devices to proficiently render the vibratory components of textures.

  15. Perception of force and stiffness in the presence of low-frequency haptic noise

    PubMed Central

    Gurari, Netta; Okamura, Allison M.; Kuchenbecker, Katherine J.

    2017-01-01

    Objective: This work lays the foundation for future research on quantitative modeling of human stiffness perception. Our goal was to develop a method by which a human’s ability to perceive suprathreshold haptic force stimuli and haptic stiffness stimuli can be affected by adding haptic noise. Methods: Five human participants performed a same-different task with a one-degree-of-freedom force-feedback device. Participants used the right index finger to actively interact with variations of force (∼5 and ∼8 N) and stiffness (∼290 N/m) stimuli that included one of four scaled amounts of haptically rendered noise (None, Low, Medium, High). The haptic noise was zero-mean Gaussian white noise that was low-pass filtered with a 2 Hz cut-off frequency; the resulting low-frequency signal was added to the force rendered while the participant interacted with the force and stiffness stimuli. Results: We found that the precision with which participants could identify the magnitude of both the force and stiffness stimuli was affected by the magnitude of the low-frequency haptically rendered noise added to the haptic stimulus, as well as the magnitude of the haptic stimulus itself. The Weber fraction strongly correlated with the standard deviation of the low-frequency haptic noise, with a Pearson product-moment correlation coefficient of ρ > 0.83. The mean standard deviation of the low-frequency haptic noise in the haptic stimuli ranged from 0.184 N to 1.111 N across the four haptically rendered noise levels, and the corresponding mean Weber fractions spanned between 0.042 and 0.101. Conclusions: The human ability to perceive both suprathreshold haptic force and stiffness stimuli degrades in the presence of added low-frequency haptic noise. Future work can use the reported methods to investigate how force perception and stiffness perception may relate, with possible applications in haptic watermarking and in the assessment of the functionality of peripheral pathways in individuals with haptic impairments. PMID:28575068
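    The noise stimulus described in this abstract combines standard pieces: zero-mean Gaussian white noise, a low-pass filter with a 2 Hz cutoff, and rescaling to a target standard deviation before addition to the rendered force. A sketch using a simple one-pole filter (the abstract does not specify the filter order) could be:

```python
import math, random

def low_freq_noise(n, fs=1000.0, cutoff=2.0, target_std=0.5, seed=0):
    """Zero-mean Gaussian white noise, one-pole low-passed at `cutoff`
    Hz, then rescaled so the sample std matches `target_std` (N)."""
    rng = random.Random(seed)
    w = 2.0 * math.pi * cutoff / fs           # normalized cutoff
    alpha = w / (w + 1.0)                     # one-pole smoothing factor
    y, out = 0.0, []
    for _ in range(n):
        y += alpha * (rng.gauss(0.0, 1.0) - y)  # low-pass the white noise
        out.append(y)
    mean = sum(out) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in out) / n)
    return [(v - mean) * target_std / std for v in out]  # recenter, rescale
```

    Each sample of such a signal would then be added to the commanded force inside the haptic loop.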

  16. Evaluation of haptic interfaces for simulation of drill vibration in virtual temporal bone surgery.

    PubMed

    Ghasemloonia, Ahmad; Baxandall, Shalese; Zareinia, Kourosh; Lui, Justin T; Dort, Joseph C; Sutherland, Garnette R; Chan, Sonny

    2016-11-01

    Surgical training is evolving from an observership model towards a new paradigm that includes virtual-reality (VR) simulation. In otolaryngology, temporal bone dissection has become intimately linked with VR simulation, as the complexity of the anatomy demands a high level of surgeon aptitude and confidence. While adequate 3D visualization of the surgical site is available in current simulators, the force feedback rendered during haptic interaction does not convey vibrations. This lack of vibration rendering limits the simulation fidelity of a surgical drill such as that used in temporal bone dissection. In order to develop an immersive simulation platform capable of haptic force and vibration feedback, the efficacy of hand controllers in rendering vibration under different drilling circumstances needs to be investigated. In this study, the vibration rendering ability of four different haptic hand controllers was analyzed and compared to find the best commercial haptic hand controller. A test rig was developed to record vibrations encountered during temporal bone dissection, and software was written to render the recorded signals without adding hardware to the system. An accelerometer mounted on the end-effector of each device recorded the rendered vibration signals. The newly recorded vibration signal was compared with the input signal in both the time and frequency domains by coherence and cross-correlation analyses to quantitatively measure the fidelity of these devices in rendering vibrotactile drilling feedback under different drilling conditions. This method can be used to assess the vibration rendering ability of VR simulation systems and to select suitable haptic devices. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Realistic soft tissue deformation strategies for real time surgery simulation.

    PubMed

    Shen, Yunhe; Zhou, Xiangmin; Zhang, Nan; Tamma, Kumar; Sweet, Robert

    2008-01-01

    A volume-preserving deformation method (VPDM) is developed to complement the mass-spring method (MSM) and improve the deformation quality of MSM-based soft-tissue models in surgical simulation. The method can also be implemented as a stand-alone model. The proposed VPDM satisfies Newton's laws of motion by obtaining the resultant vectors from an equilibrium condition. The proposed method has been tested in virtual surgery systems with haptic rendering demands.
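    For context on the mass-spring baseline that the VPDM complements, the per-spring force in an MSM is typically a Hooke term along the spring axis plus axial damping. A generic sketch (standard MSM, not the authors' VPDM) is:

```python
import math

def spring_force(xa, xb, rest_len, k, c, va, vb):
    """Force on particle a from a damped spring a-b (3-vectors as tuples)."""
    d = [xb[i] - xa[i] for i in range(3)]
    length = math.sqrt(sum(di * di for di in d))
    n = [di / length for di in d]              # unit vector a -> b
    stretch = length - rest_len                # positive when stretched
    rel_v = sum((vb[i] - va[i]) * n[i] for i in range(3))
    mag = k * stretch + c * rel_v              # Hooke + axial damping
    return tuple(mag * ni for ni in n)
```

    A stretched spring pulls particle a toward b; MSM's well-known volume loss under large deformation is what the VPDM's equilibrium-based correction targets.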

  18. A design of hardware haptic interface for gastrointestinal endoscopy simulation.

    PubMed

    Gu, Yunjin; Lee, Doo Yong

    2011-01-01

    Gastrointestinal endoscopy simulations have been developed to train endoscopic procedures, which require hundreds of practice sessions to achieve competence. Even though realistic haptic feedback is important for providing a realistic sensation to the user, most previous simulations, including commercial ones, have focused mainly on providing realistic visual feedback. In this paper, we propose a novel design for a portable haptic interface that provides 2-DOF force feedback for gastrointestinal endoscopy simulation. The haptic interface consists of completely decoupled translational and rotational force-feedback mechanisms, and a gripping mechanism that controls the connection between the endoscope and the force-feedback mechanism.

  19. Using the PhysX engine for physics-based virtual surgery with force feedback.

    PubMed

    Maciel, Anderson; Halic, Tansel; Lu, Zhonghua; Nedel, Luciana P; De, Suvranu

    2009-09-01

    The development of modern surgical simulators is highly challenging, as they must support complex simulation environments. The demand for higher realism in such simulators has driven researchers to adopt physics-based models, which are computationally very demanding. This poses a major problem, since real-time interaction requires graphical updates at 30 Hz and a much higher rate of 1 kHz for force feedback (haptics). Recently, several physics engines have been developed which offer multi-physics simulation capabilities, including rigid and deformable bodies, cloth, and fluids. While such physics engines provide unique opportunities for the development of surgical simulators, their latencies, higher than what real-time graphics and haptics demand, pose significant barriers to their use in interactive simulation environments. In this work, we propose solutions to this problem and demonstrate how a multimodal surgical simulation environment may be developed based on NVIDIA's PhysX physics library. Models undergoing relatively low-frequency updates in PhysX can thus exist in an environment that demands much higher-frequency updates for haptics. We use a collision handling layer to interface between the physical response provided by PhysX and the haptic rendering device, providing both real-time tissue response and force feedback. Our simulator integrates a bimanual haptic interface for force feedback and per-pixel shaders for graphical realism in real time. To demonstrate the effectiveness of our approach, we present the simulation of the laparoscopic adjustable gastric banding (LAGB) procedure as a case study. Developing complex and realistic surgical trainers, with realistic organ geometries and tissue properties, demands stable physics-based deformation methods that are not always compatible with the interaction rates such trainers require. We have shown that combining different modelling strategies for behaviour, collision, and graphics is both possible and desirable; such multimodal environments enable suitable rates for simulating the major steps of the LAGB procedure.
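    The collision handling layer described above is often realized as a virtual coupling: the slower physics engine owns a proxy object, and at haptic rates the device force is computed from a spring-damper between the device pose and the most recently published proxy pose. A one-dimensional sketch with invented gains:

```python
# Virtual coupling: at 1 kHz, compute the force from a spring-damper
# between the device position and the most recent proxy position
# published by the (slower) physics engine. Gains are illustrative,
# not values from the paper.

K_C = 300.0   # coupling stiffness, N/m
B_C = 2.0     # coupling damping, N*s/m

def coupling_force(x_device, v_device, x_proxy):
    """Force applied to the user's hand by the haptic device."""
    return -K_C * (x_device - x_proxy) - B_C * v_device
```

    Because the force depends only on the latest proxy state, the haptic loop stays responsive even when the physics engine updates infrequently.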

  20. Mixed reality temporal bone surgical dissector: mechanical design.

    PubMed

    Hochman, Jordan Brent; Sepehri, Nariman; Rampersad, Vivek; Kraut, Jay; Khazraee, Milad; Pisa, Justyn; Unger, Bertram

    2014-08-08

    An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces, and while 3D-printed models convincingly represent the vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a novel mixed reality (MR) model in which the effective elements of both simulations are combined: haptic rendering of soft tissue directly interacts with a printed bone model. The paper addresses one aspect of a series of challenges, specifically the mechanical merger of a haptic device with an otic drill, which further necessitates gravity cancellation of the work assembly's gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill, and the virtual contact forces must be repositioned from the mid-wand to the drill tip. Previous publications detail the generation of both the requisite printed and haptic simulations. Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting to hold the otic drill was developed, and its weight was offset using the haptic device. The robustness of the system to disturbances and its stability during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free space. Within the free space, a linear virtual force model is applied to simulate drill contact with soft tissue. Testing illustrated the effectiveness of the gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill's passage between the real and virtual components of the model. No issues with registration at model boundaries were encountered. These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator.

  1. Absence of modulatory action on haptic height perception with musical pitch

    PubMed Central

    Geronazzo, Michele; Avanzini, Federico; Grassi, Massimo

    2015-01-01

    Although acoustic frequency is not a spatial property of physical objects, in common language pitch, i.e., the psychological correlate of frequency, is often labeled spatially (“high in pitch” or “low in pitch”). Pitch-height is known to modulate (and interact with) participants' responses when they are asked to judge spatial properties of non-auditory stimuli (e.g., visual) in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch-height extends to the haptic estimation of the height of a virtual step. We implemented a HW/SW setup able to render virtual 3D objects (stair-steps) haptically through a PHANTOM device, and to provide real-time continuous auditory feedback depending on the user's interaction with the object. The haptic exploration was associated with a sinusoidal tone whose pitch varied as a function of the interaction point's height within (i) a narrower and (ii) a wider pitch range, or with (iii) a random pitch variation acting as a control audio condition. Explorations were also performed with no sound (haptic only). Participants were instructed to explore the virtual step freely and to communicate their height estimate by opening thumb and index finger to mimic the step riser height, or verbally by reporting the height of the step riser in centimeters. We analyzed the role of musical expertise by dividing participants into non-musicians and musicians. Results showed no effect of musical pitch on the highly realistic haptic feedback. Overall, there was no difference between the two groups in the proposed multimodal conditions. Additionally, we observed a different distribution of haptic responses between musicians and non-musicians when estimates in the auditory conditions were matched with estimates in the no-sound condition. PMID:26441745
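The height-to-pitch mapping described above can be sketched as a simple function. The linear-in-log-frequency mapping and the frequency endpoints below are illustrative assumptions, not the study's actual parameters:

```python
def height_to_pitch(h, h_min=0.0, h_max=0.05,
                    f_low=262.0, f_high=1046.0):
    """Map a contact height h (metres) to a tone frequency (Hz).

    Hypothetical mapping, linear in log-frequency so that equal
    height steps sound like equal musical intervals.  The pitch
    range endpoints (f_low, f_high) are placeholder values; a
    narrower or wider range is obtained by changing them.
    """
    # Normalized height, clamped to [0, 1]
    t = min(max((h - h_min) / (h_max - h_min), 0.0), 1.0)
    # Geometric interpolation between the two frequency endpoints
    return f_low * (f_high / f_low) ** t
```

With these endpoints, the bottom of the step maps to 262 Hz, the top to 1046 Hz, and the midpoint to their geometric mean.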

  2. A review of haptic simulator for oral and maxillofacial surgery based on virtual reality.

    PubMed

    Chen, Xiaojun; Hu, Junlei

    2018-06-01

    Traditional medical training in oral and maxillofacial surgery (OMFS) may be limited by its low efficiency and high cost owing to the shortage of cadaver resources. By combining visual rendering with force feedback, surgery simulators are becoming increasingly popular in hospitals and medical schools as an alternative to traditional training. Areas covered: The major goal of this review is to provide a comprehensive reference source on current and future developments of haptic OMFS simulators based on virtual reality (VR) for relevant researchers. Expert commentary: Visual rendering, haptic rendering, tissue deformation, and evaluation are the key components of a VR-based haptic surgery simulator. Compared with traditional medical training, the fusion of visual and tactile cues in the simulator's virtual environment yields a considerably more vivid sensation, and operators have more opportunities to practice surgical skills and receive objective evaluation as reference.

  3. A predictive bone drilling force model for haptic rendering with experimental validation using fresh cadaveric bone.

    PubMed

    Lin, Yanping; Chen, Huajiang; Yu, Dedong; Zhang, Ying; Yuan, Wen

    2017-01-01

    Bone drilling simulators with visual and haptic feedback provide a safe, cost-effective and repeatable alternative to traditional surgical training methods. To develop such a simulator, accurate haptic rendering based on a force model is required to feed back bone drilling forces in response to user input. Current predictive bone drilling force models, based on bovine bones under various drilling conditions and parameters, are not representative of the bone drilling process in bone surgery. The objective of this study was to provide a bone drilling force model for haptic rendering based on calibration and validation experiments on fresh cadaveric bones with different bone densities. Using a drill bit geometry (2 mm diameter), feed rates (20-60 mm/min) and spindle speeds (4000-6000 rpm) common in orthognathic surgeries, the bone drilling forces of specimens from two groups were measured and the calibration coefficients of the specific normal and frictional pressures were determined. Comparison of the predicted forces with the forces measured in validation experiments over a large range of feed rates and spindle speeds demonstrates that the proposed model predicts the trends and average forces well. The presented bone drilling force model can be used for haptic rendering in surgical simulators.
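A generic mechanistic form of such a predictive model can be sketched as follows. The proportionality to feed per revolution and the specific-pressure value `k_n` are illustrative assumptions, not the calibrated coefficients reported in the paper:

```python
def drilling_thrust_force(feed_rate_mm_min, spindle_rpm,
                          diameter_mm=2.0, k_n=600.0):
    """Estimate quasi-static drilling thrust force (N).

    Generic mechanistic sketch: thrust is taken proportional to the
    material-removal cross-section (drill radius x feed per
    revolution), scaled by a specific normal pressure k_n (N/mm^2)
    that would come from calibration.  k_n here is a made-up
    placeholder, not the coefficient fitted in the paper.
    """
    feed_per_rev = feed_rate_mm_min / spindle_rpm      # mm/rev
    chip_area = (diameter_mm / 2.0) * feed_per_rev     # mm^2
    return k_n * chip_area
```

The model captures the qualitative trend that force rises with feed rate and falls with spindle speed, which is what a haptic renderer needs to feed back at each servo tick.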

  4. Parametric model of the scala tympani for haptic-rendered cochlear implantation.

    PubMed

    Todd, Catherine; Naghdy, Fazel

    2005-01-01

    A parametric model of the human scala tympani has been designed for use in a haptic-rendered computer simulation of cochlear implant surgery. It will be the first surgical simulator of this kind. A geometric model of the scala tympani has been derived from measured data for this purpose. The model is compared with two existing descriptions of the cochlear spiral. A first approximation of the basilar membrane is also produced. The structures are imported into a force-rendering software application for system development.

  5. The visible ear simulator: a public PC application for GPU-accelerated haptic 3D simulation of ear surgery based on the visible ear data.

    PubMed

    Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter

    2009-06-01

    Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm³), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented, and real-time volume rendered as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm³) and true color, 2D, or optional anaglyph stereoscopic 3D was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm³) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomographic-/magnetic resonance imaging-based systems.

  6. 6-DoF Haptic Rendering Using Continuous Collision Detection between Points and Signed Distance Fields.

    PubMed

    Xu, Hongyi; Barbic, Jernej

    2017-01-01

    We present an algorithm for fast continuous collision detection between points and signed distance fields, and demonstrate how to robustly use it for 6-DoF haptic rendering of contact between objects with complex geometry. Continuous collision detection is often needed in computer animation, haptics, and virtual reality applications, but has so far only been investigated for polygon (triangular) geometry representations. We demonstrate how to robustly and continuously detect intersections between points and level sets of the signed distance field. We suggest using an octree subdivision of the distance field for fast traversal of distance field cells. We also give a method to resolve continuous collisions between point clouds organized into a tree hierarchy and a signed distance field, enabling rendering of contact between rigid objects with complex geometry. We investigate and compare two 6-DoF haptic rendering methods now applicable to point-versus-distance field contact for the first time: continuous integration of penalty forces, and a constraint-based method. An experimental comparison to discrete collision detection demonstrates that the continuous method is more robust and can correctly resolve collisions even under high velocities and during complex contact.
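The core idea of continuously detecting the first zero-crossing of the signed distance along a moving point's path can be sketched as follows. This is conservative sampling plus bisection refinement, a simplification of the paper's octree-accelerated traversal of distance-field cells; the sphere SDF at the end is only a stand-in for a general distance field:

```python
import math

def first_crossing(sdf, p0, p1, steps=64, iters=30):
    """Earliest t in [0, 1] where the signed distance along the
    segment p0 -> p1 crosses zero, or None if there is no contact.

    Sketch only: sample the SDF at `steps` points, then refine the
    first sign change by bisection.  The paper's method instead
    traverses distance-field cells via an octree.
    """
    def lerp(t):
        return tuple(a + t * (b - a) for a, b in zip(p0, p1))

    if sdf(lerp(0.0)) <= 0.0:          # already in contact
        return 0.0
    t_prev = 0.0
    for i in range(1, steps + 1):
        t = i / steps
        if sdf(lerp(t)) <= 0.0:        # sign change in (t_prev, t]
            lo, hi = t_prev, t
            for _ in range(iters):     # bisection refinement
                mid = 0.5 * (lo + hi)
                if sdf(lerp(mid)) > 0.0:
                    lo = mid
                else:
                    hi = mid
            return hi
        t_prev = t
    return None

# Unit-sphere SDF used purely for illustration
sphere = lambda p: math.sqrt(sum(c * c for c in p)) - 1.0
```

A point sweeping from (-2, 0, 0) to (2, 0, 0) first touches the unit sphere at x = -1, i.e. at t = 0.25 along the segment.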

  7. Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.

    PubMed

    Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk

    2013-08-01

    Virtual reality simulation is a promising alternative for training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality simulation has three main components: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.

  8. Mixed reality temporal bone surgical dissector: mechanical design

    PubMed Central

    2014-01-01

    Objective The Development of a Novel Mixed Reality (MR) Simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces and while 3D printed models convincingly represent vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model, where the effective elements of both simulations are combined; haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill. This further necessitates gravity cancelation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill and the virtual contact forces need to be repositioned to the drill tip from the mid wand. Previous publications detail generation of both the requisite printed and haptic simulations. Method Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting, to hold the otic drill, was developed and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free-space. Within the free-space, a linear virtual force model is applied to simulate drill contact with soft tissue. Results Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill’s passage between real and virtual components of the model. No issues with registration at model boundaries were encountered. 
Conclusion These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator. PMID:25927300
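The gravity-cancellation step described above amounts to commanding a constant upward force equal to the weight of the attached drill assembly, so the user feels only the rendered contact forces. A minimal sketch, assuming a z-up frame and a placeholder tool mass:

```python
G = 9.81  # gravitational acceleration, m/s^2

def gravity_cancel(rendered_force, tool_mass_kg):
    """Offset the weight of the drill-and-fitting assembly.

    rendered_force is an (fx, fy, fz) tuple in newtons with +z up;
    the returned command adds a constant upward force m*g.  The
    tool mass passed in is a placeholder, not the paper's measured
    gripper/drill mass.
    """
    fx, fy, fz = rendered_force
    return (fx, fy, fz + tool_mass_kg * G)
```

With a 0.2 kg assembly, an otherwise zero force command becomes a steady 1.962 N upward offset.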

  9. Development of a Haptic Interface for Natural Orifice Translumenal Endoscopic Surgery Simulation

    PubMed Central

    Dargar, Saurabh; Sankaranarayanan, Ganesh

    2016-01-01

    Natural orifice translumenal endoscopic surgery (NOTES) is a minimally invasive procedure which utilizes the body’s natural orifices to gain access to the peritoneal cavity. The NOTES procedure is designed to minimize external scarring and patient trauma; however, flexible-endoscopy-based pure NOTES procedures require critical scope-handling skills. The delicate nature of the NOTES procedure requires extensive training; thus, to improve access to training while reducing risk to patients, we have designed and developed the VTEST©, a virtual reality NOTES simulator. As part of the simulator, a novel decoupled 2-DOF haptic device was developed to provide realistic force feedback to the user in training. A series of experiments was performed to determine the behavioral characteristics of the device. The device was found capable of rendering up to 5.62 N and 0.190 N·m of continuous force and torque in the translational and rotational DOF, respectively. The device possesses 18.1 Hz and 5.7 Hz of force bandwidth in the translational and rotational DOF, respectively. A feedforward friction compensator was also successfully implemented to minimize the negative impact of friction during interaction with the device. In this work we have presented the detailed development and evaluation of the haptic device for the VTEST©. PMID:27008674
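A feedforward friction compensator of the kind mentioned above is commonly built from a Coulomb-plus-viscous friction model: the controller adds an estimate of the friction force to the motor command so the user does not feel it. A minimal sketch with illustrative coefficients, not the values identified for the VTEST device:

```python
def friction_feedforward(velocity, f_coulomb=0.3, b_viscous=0.05):
    """Feedforward friction-compensation force (N).

    Classic Coulomb + viscous model: a velocity-sign-dependent
    constant term plus a term linear in velocity.  Both
    coefficients are placeholders for identified device values.
    """
    if velocity > 0.0:
        sign = 1.0
    elif velocity < 0.0:
        sign = -1.0
    else:
        sign = 0.0                      # no compensation at rest
    return f_coulomb * sign + b_viscous * velocity
```

In practice the coefficients are identified from constant-velocity runs of the device, and the compensation is added on top of the rendered feedback force.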

  10. Portable haptic interface with omni-directional movement and force capability.

    PubMed

    Avizzano, Carlo Alberto; Satler, Massimo; Ruffaldi, Emanuele

    2014-01-01

    We describe the design of a new mobile haptic interface that employs wheels for force rendering. The interface, consisting of an omni-directional Killough type platform, provides 2DOF force feedback with different control modalities. The system autonomously performs sensor fusion for localization and force rendering. This paper explains the relevant choices concerning the functional aspects, the control design, the mechanical and electronic solution. Experimental results for force feedback characterization are reported.
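Force rendering through omni-directional wheels reduces to mapping a desired planar force and yaw torque onto tangential wheel forces. A sketch for a symmetric three-wheel Killough-type layout, with the inverse of the 3x3 mapping written in closed form; the wheel angles and radius are illustrative, not the device's actual geometry:

```python
import math

def wheel_forces(fx, fy, torque, radius=0.1,
                 angles=(0.0, 2 * math.pi / 3, 4 * math.pi / 3)):
    """Tangential wheel forces realizing a body force (fx, fy) and
    yaw torque on a three-omniwheel Killough-type platform.

    Forward map: fx = -sum(sin(a_i) * u_i), fy = sum(cos(a_i) * u_i),
    torque = radius * sum(u_i).  For symmetric 120-degree spacing the
    inverse has the closed form below.  Geometry is illustrative.
    """
    n = len(angles)
    return [(2.0 / n) * (-math.sin(a) * fx + math.cos(a) * fy)
            + torque / (n * radius)
            for a in angles]
```

Applying the forward map to the returned wheel forces reproduces the commanded force and torque, which is easy to verify numerically.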

  11. Characterization of a smartphone size haptic rendering system based on thin-film AlN actuators on glass substrates

    NASA Astrophysics Data System (ADS)

    Bernard, F.; Casset, F.; Danel, J. S.; Chappaz, C.; Basrour, S.

    2016-08-01

    This paper presents for the first time the characterization of a smartphone-size haptic rendering system based on the friction-modulation effect. According to previous work and finite element modeling, homogeneous flexural modes are needed to obtain the haptic feedback effect. The device studied consists of thin-film AlN transducers deposited on a 110 × 65 mm² glass substrate. The placement of the transducers on the glass plate leaves a transparent central area of 90 × 49 mm². Electrical and mechanical parameters of the system are extracted from measurement. From this extraction, electrical impedance matching reduced the applied voltage to 17.5 V AC and the power consumption to 1.53 W at the resonance frequency of the vibrating system while reaching the haptic rendering specification. Transient characterization of the actuation reveals a delay below the threshold of dynamic tactile detection. The characterization of the AlN transducers used as sensors, including noise rejection, delay and output charge amplitude, allows high-accuracy detection of any variation due to external influences. These specifications are the first step toward a low-power-consumption closed-loop system.

  12. A one degree of freedom haptic system to investigate issues in human perception with particular application to probing tissue.

    PubMed

    Dibble, Edward; Zivanovic, Aleksandar; Davies, Brian

    2004-01-01

    This paper presents the results of several early studies relating to human haptic perception sensitivity when probing a virtual object. A 1-degree-of-freedom (DoF) rotary haptic system that was designed and built for this purpose is also presented. The experiments assessed the maximum forces applied in a minimally invasive surgery (MIS) procedure, quantified the compliance sensitivity threshold when probing virtual tissue, and identified the haptic system loop rate necessary for haptic feedback to feel realistic.

  13. Experimental Study on the Perception Characteristics of Haptic Texture by Multidimensional Scaling.

    PubMed

    Wu, Juan; Li, Na; Liu, Wei; Song, Guangming; Zhang, Jun

    2015-01-01

    Recent works regarding real texture perception demonstrate that physical factors such as stiffness and spatial period play a fundamental role in texture perception. This research used a multidimensional scaling (MDS) analysis to further characterize and quantify the effects of the simulation parameters on haptic texture rendering and perception. In a pilot experiment, 12 haptic texture samples were generated by using a 3-degrees-of-freedom (3-DOF) force-feedback device with varying spatial period, height, and stiffness coefficient parameter values. The subjects' perceptions of the virtual textures indicate that roughness, denseness, flatness and hardness are distinguishing characteristics of texture. In the main experiment, 19 participants rated the dissimilarities of the textures and estimated the magnitudes of their characteristics. The MDS method was used to recover the underlying perceptual space and reveal the significance of the space from the recorded data. The physical parameters and their combinations have significant effects on the perceptual characteristics. A regression model was used to quantitatively analyze the parameters and their effects on the perceptual characteristics. This paper is to illustrate that haptic texture perception based on force feedback can be modeled in two- or three-dimensional space and provide suggestions on improving perception-based haptic texture rendering.
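The perceptual-space recovery step can be illustrated with classical (Torgerson) MDS applied to a dissimilarity matrix: double-center the squared dissimilarities and take the leading eigenvectors. This is a generic sketch of the technique, not the authors' exact analysis pipeline:

```python
import numpy as np

def classical_mds(D, dims=2):
    """Embed items in `dims` dimensions from a symmetric
    dissimilarity matrix D (classical / Torgerson MDS).

    Double-centering turns squared dissimilarities into a Gram
    matrix B; the coordinates are the top eigenvectors of B scaled
    by the square roots of their eigenvalues.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram
    w, v = np.linalg.eigh(B)                   # ascending eigenvalues
    idx = np.argsort(w)[::-1][:dims]           # keep the largest ones
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

For dissimilarities that are exact distances of points on a line, a one-dimensional embedding reproduces the original distances up to sign and translation.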

  14. Perceptualization of geometry using intelligent haptic and visual sensing

    NASA Astrophysics Data System (ADS)

    Weng, Jianguang; Zhang, Hui

    2013-01-01

    We present a set of paradigms for investigating geometric structures using haptic and visual sensing. Our principal test cases include smoothly embedded geometric shapes such as knotted curves embedded in 3D and knotted surfaces in 4D, which contain massive self-intersections when projected one dimension lower. One can exploit a touch-responsive 3D interactive probe to haptically override this conflicting evidence in the rendered images, forcing continuity in the haptic representation to emphasize the true topology. In our work, we exploited predictive haptic guidance, a "computer-simulated hand" with supplementary force suggestion, to support intelligent exploration of geometric shapes, smoothing the exploration and maximizing the probability of recognition. The cognitive load can be reduced further by enabling attention-driven visual sensing during the haptic exploration. Our methods combine to reveal the full richness of the haptic exploration of geometric structures, and to overcome the limitations of traditional 4D visualization.

  15. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    PubMed

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  16. Roughness based perceptual analysis towards digital skin imaging system with haptic feedback.

    PubMed

    Kim, K

    2016-08-01

    To examine psoriasis or atopic eczema, analyzing skin roughness by palpation is essential for precise diagnosis of skin diseases. However, optical-sensor-based skin imaging systems do not allow dermatologists to touch skin images. To solve this problem, a new haptic rendering technology that can accurately display skin roughness must be developed. In addition, the rendering algorithm must be able to filter spatial noise created during 2D-to-3D image conversion without losing the original roughness of the skin image. In this study, a perceptual approach to designing a noise filter that removes spatial noise while recovering maximal roughness is introduced, based on an understanding of human sensitivity to surface roughness. A visuohaptic rendering system that lets a user see and touch digital skin surface roughness has been developed, including a geometric roughness estimation method for a meshed surface. Following this, a psychophysical experiment was designed and conducted with 12 human subjects to measure human perception of surface roughness through the developed visual and haptic interfaces. The experiment found that touch is more sensitive at lower surface roughness, and vice versa. Human perception with both senses, vision and touch, becomes less sensitive to surface distortions as roughness increases. When interacting through both channels, visual and haptic, the ability to detect abnormalities in roughness is greatly improved by sensory integration with the developed visuohaptic rendering system. The result can be used as a guideline for designing a noise filter that perceptually removes spatial noise while recovering maximal roughness values from a digital skin image obtained by optical sensors. In addition, the result confirms that the developed visuohaptic rendering system can help dermatologists or skin care professionals examine skin conditions using vision and touch at the same time.
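A standard way to quantify roughness from surface heights is via the Ra/Rq profile statistics, measured about the mean line. A minimal 1-D sketch; the paper's estimator works geometrically on a 3-D meshed surface rather than a profile:

```python
def roughness_ra_rq(heights):
    """Arithmetic-mean (Ra) and root-mean-square (Rq) roughness of
    a sampled height profile.

    Both are computed from deviations about the profile's mean
    line: Ra is the mean absolute deviation, Rq the RMS deviation.
    """
    n = len(heights)
    mean = sum(heights) / n
    dev = [h - mean for h in heights]
    ra = sum(abs(d) for d in dev) / n
    rq = (sum(d * d for d in dev) / n) ** 0.5
    return ra, rq
```

For a square-wave profile the two measures coincide; for irregular profiles Rq weights large deviations more heavily than Ra.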
© 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Improved haptic interface for colonoscopy simulation.

    PubMed

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2007-01-01

    This paper presents an improved haptic interface for the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be bent freely while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computation to render accurate graphic images corresponding to the angle-knob rotation. Tact switches are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction and the corresponding deformation of the colon.

  18. Real-time, haptics-enabled simulator for probing ex vivo liver tissue.

    PubMed

    Lister, Kevin; Gao, Zhan; Desai, Jaydev P

    2009-01-01

    The advent of complex surgical procedures has driven the need for realistic surgical training simulators. Comprehensive simulators that provide realistic visual and haptic feedback during surgical tasks are required to familiarize surgeons with the procedures they are to perform. Complex organ geometry inherent to biological tissues and intricate material properties drive the need for finite element methods to assure accurate tissue displacement and force calculations. Advances in real-time finite element methods have not reached the state where they are applicable to soft tissue surgical simulation. Therefore, a real-time, haptics-enabled simulator for probing of soft tissue has been developed that utilizes preprocessed finite element data (derived from an accurate constitutive model of the soft tissue, obtained from carefully collected experimental data) to accurately replicate the probing task in real time.

  19. Patient adaptive control of end-effector based gait rehabilitation devices using a haptic control framework.

    PubMed

    Hussein, Sami; Kruger, Jörg

    2011-01-01

    Robot-assisted training has proven beneficial as an extension of conventional therapy to improve rehabilitation outcomes. Further facilitation of this positive impact is expected from the application of cooperative control algorithms that increase the patient's contribution to the training effort according to his or her level of ability. This paper presents an approach to cooperative training for end-effector-based gait rehabilitation devices, thereby providing the basis for establishing sophisticated cooperative control methods in this class of devices. It uses a haptic control framework to synthesize and render complex, task-specific training environments composed of polygonal primitives. Training assistance is integrated into the haptic control framework as part of the environment: a compliant window is moved along a nominal training trajectory, compliantly guiding and supporting the foot motion. The level of assistance is adjusted via the stiffness of the moving window, and an iterative learning algorithm is used to automatically adjust this assistance level. Stable haptic rendering of the dynamic training environments and adaptive movement assistance were evaluated in two example training scenarios: treadmill walking and stair climbing. Data from preliminary trials with one healthy subject are provided in this paper. © 2011 IEEE
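The compliant moving window can be sketched as a saturated spring pulling the foot toward the nominal trajectory point, with the stiffness acting as the assistance level (zero stiffness means free movement, high stiffness strict guidance). The values below are illustrative, not the paper's controller gains:

```python
def window_assist_force(foot_pos, window_center, stiffness,
                        max_force=50.0):
    """1-D assistance force toward the moving nominal window.

    A spring of adjustable stiffness pulls the foot toward the
    window center; the output is saturated for safety.  The
    saturation limit is a made-up placeholder value.
    """
    f = stiffness * (window_center - foot_pos)
    return max(-max_force, min(max_force, f))
```

At each control tick the window center advances along the nominal trajectory, and an outer learning loop would adapt the stiffness to the patient's ability.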

  20. Haptic interface of web-based training system for interventional radiology procedures

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Lu, Yiping; Loe, KiaFock; Nowinski, Wieslaw L.

    2004-05-01

    Existing web-based medical training systems and surgical simulators can provide an affordable and accessible medical training curriculum, but they seldom offer the trainee realistic and affordable haptic feedback, and therefore cannot offer a suitable practicing environment. In this paper, a haptic solution for interventional radiology (IR) procedures is proposed. The system architecture of a web-based training system for IR procedures is briefly presented first. Then the mechanical structure, working principle and application of a haptic device are discussed in detail. The haptic device works as an interface between the training environment and the trainees and is placed at the end-user side. With the system, the user can be trained on interventional radiology procedures (navigating catheters, inflating balloons, deploying coils and placing stents) over the web and receive surgical haptic feedback in real time.

  1. Haptic feedback in OP:Sense - augmented reality in telemanipulated robotic surgery.

    PubMed

    Beyl, T; Nicolai, P; Mönnich, H; Raczkowksy, J; Wörn, H

    2012-01-01

    In current research, haptic feedback in robot assisted interventions plays an important role. However most approaches to haptic feedback only regard the mapping of the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated in our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from pre-operative generated medical images or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest as well as helps to prevent the risk of damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS-systems.
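A virtual fixture guarding a critical structure is typically rendered as a penalty force that activates only when the tool penetrates the virtual wall. A 1-D sketch along the wall normal, with an illustrative stiffness rather than a value from the OP:Sense system:

```python
def virtual_wall_force(tool_pos, wall_pos, stiffness=500.0):
    """Penalty force (N) of a planar virtual fixture.

    tool_pos and wall_pos are coordinates along the wall normal,
    with the safe side at larger values.  Inside the safe region
    the fixture is transparent; on penetration it pushes back
    proportionally to the penetration depth.  The stiffness is a
    placeholder value.
    """
    penetration = wall_pos - tool_pos
    return stiffness * penetration if penetration > 0.0 else 0.0
```

In a telemanipulated setup this fixture force is summed with the measured instrument forces before being rendered at the haptic input device.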

  2. A Virtual Reality System for PTCD Simulation Using Direct Visuo-Haptic Rendering of Partially Segmented Image Data.

    PubMed

    Fortmeier, Dirk; Mastmeyer, Andre; Schröder, Julian; Handels, Heinz

    2016-01-01

    This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.

  3. A 3-RSR Haptic Wearable Device for Rendering Fingertip Contact Forces.

    PubMed

    Leonardis, Daniele; Solazzi, Massimiliano; Bortone, Ilaria; Frisoli, Antonio

    2017-01-01

    A novel wearable haptic device for modulating contact forces at the fingertip is presented. Rendering of forces by skin deformation in three degrees of freedom (DoF), with contact/no-contact capability, was implemented through rigid parallel kinematics. The novel asymmetrical three-revolute-spherical-revolute (3-RSR) configuration allows compact dimensions with minimal encumbrance of the hand workspace. The device was designed to render constant to low-frequency deformation of the fingerpad in three DoF, combining light weight with relatively high output forces. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time for controlling the device. The first experiment evaluated discrimination of different fingerpad stretch directions in a group of five subjects. The second experiment, enrolling 19 subjects, evaluated cutaneous feedback in a virtual pick-and-place manipulation task. The stiffness of the fingerpad plus device was measured and used to calibrate the physics of the virtual environment. The third experiment, with 10 subjects, evaluated interaction forces in a virtual lift-and-hold task. Although performance differed between the two manipulation experiments, overall results show that participants controlled interaction forces better when the cutaneous feedback was active, with significant differences between the visual and visuo-haptic experimental conditions.

  4. Graphic and haptic simulation system for virtual laparoscopic rectum surgery.

    PubMed

    Pan, Jun J; Chang, Jian; Yang, Xiaosong; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas

    2011-09-01

    Medical simulators with vision and haptic feedback techniques offer a cost-effective and efficient alternative to traditional medical training. They have been used to train doctors in many specialties of medicine, allowing tasks to be practised in a safe and repetitive manner. This paper describes a virtual-reality (VR) system designed to improve surgeons' learning curves in the technically challenging field of laparoscopic surgery of the rectum. Data from MRI of the rectum and videos of real operations are used to construct the virtual models. A haptic force filter based on radial basis functions is designed to offer realistic and smooth force feedback. To handle collision detection efficiently, a hybrid model is presented to compute the deformation of intestines. Finally, a real-time mesh-based cutting technique is employed to represent the incision operation. Despite numerous research efforts, fast and realistic simulation of soft tissues with large deformation, such as intestines, remains extremely challenging. This paper introduces our latest contribution to this endeavour. With this system, the user can haptically operate on the virtual rectum while watching the soft tissue deform. Our system has been tested by colorectal surgeons who believe that the simulated tactile and visual feedback is realistic. It could replace the traditional training process and effectively transfer surgical skills to novices. Copyright © 2011 John Wiley & Sons, Ltd.
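    As a hedged illustration of the radial-basis-function idea behind the abstract's force filter (our construction, not the paper's filter), the sketch below fits a Gaussian-RBF interpolant to sampled force magnitudes so that intermediate forces can be evaluated smoothly.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting, for small dense systems."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            fac = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= fac * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def rbf_fit(t, f, eps=2.0, reg=1e-6):
    """Weights of a Gaussian-RBF interpolant through samples (t[i], f[i])."""
    phi = lambda r: math.exp(-(eps * r) ** 2)
    A = [[phi(ti - tj) for tj in t] for ti in t]
    for i in range(len(t)):
        A[i][i] += reg                 # slight regularisation damps noise
    return solve(A, f)

def rbf_eval(t, w, x, eps=2.0):
    return sum(wi * math.exp(-(eps * (x - ti)) ** 2) for ti, wi in zip(t, w))

samples_t = [0.0, 0.25, 0.5, 0.75, 1.0]   # sample times (s), illustrative
samples_f = [0.0, 0.8, 1.0, 0.7, 0.1]     # sampled force magnitudes (N)
w = rbf_fit(samples_t, samples_f)
```

    Increasing `reg` turns the interpolant into a smoother, which is the filtering behaviour a haptic loop needs when the raw force samples are noisy.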

  5. Haptic fMRI: combining functional neuroimaging with haptics for studying the brain's motor control representation.

    PubMed

    Menon, Samir; Brantner, Gerald; Aholt, Chris; Kay, Kendrick; Khatib, Oussama

    2013-01-01

    A challenging problem in motor control neuroimaging studies is the inability to perform complex human motor tasks given the Magnetic Resonance Imaging (MRI) scanner's disruptive magnetic fields and confined workspace. In this paper, we propose a novel experimental platform that combines Functional MRI (fMRI) neuroimaging, haptic virtual simulation environments, and an fMRI-compatible haptic device for real-time haptic interaction across the scanner workspace (~0.65×0.40×0.20 m³ above the torso). We implement this Haptic fMRI platform with a novel haptic device, the Haptic fMRI Interface (HFI), and demonstrate its suitability for motor neuroimaging studies. HFI has three degrees-of-freedom (DOF), uses electromagnetic motors to enable high-fidelity haptic rendering (>350 Hz), integrates radio frequency (RF) shields to prevent electromagnetic interference with fMRI (temporal SNR >100), and is kinematically designed to minimize currents induced by the MRI scanner's magnetic field during motor displacement (<2 cm). HFI possesses uniform inertial and force transmission properties across the workspace, and has low friction (0.05-0.30 N). HFI's RF noise levels, in addition, are within a 3 Tesla fMRI scanner's baseline noise variation (~0.85±0.1%). Finally, HFI is haptically transparent and does not interfere with human motor tasks (tested for 0.4 m reaches). By allowing fMRI experiments involving complex three-dimensional manipulation with haptic interaction, Haptic fMRI enables, for the first time, non-invasive neuroscience experiments involving interactive motor tasks, object manipulation, tactile perception, and visuo-motor integration.

  6. Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.

    PubMed

    Park, Chung Hyuk; Ryu, Eun-Seok; Howard, Ayanna M

    2015-01-01

    This paper presents a haptic telepresence system that enables visually impaired users to explore locations with rich visual observation such as art galleries and museums by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of this data as a tangible haptic experience has not been sufficiently explored, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience where they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.

  7. Development of a virtual reality haptic Veress needle insertion simulator for surgical skills training.

    PubMed

    Okrainec, A; Farcas, M; Henao, O; Choy, I; Green, J; Fotoohi, M; Leslie, R; Wight, D; Karam, P; Gonzalez, N; Apkarian, J

    2009-01-01

    The Veress needle is the most commonly used technique for creating the pneumoperitoneum at the start of a laparoscopic surgical procedure. Inserting the Veress needle correctly is crucial since errors can cause significant harm to patients. Unfortunately, this technique can be difficult to teach since surgeons rely heavily on tactile feedback while advancing the needle through the various layers of the abdominal wall. This critical step in laparoscopy, therefore, can be challenging for novice trainees to learn without adequate opportunities to practice in a safe environment with no risk of injury to patients. To address this issue, we have successfully developed a prototype of a virtual reality haptic needle insertion simulator using the tactile feedback of 22 surgeons to set realistic haptic parameters. A survey of these surgeons concluded that our device appeared and felt realistic, and could potentially be a useful tool for teaching the proper technique of Veress needle insertion.
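    Haptic rendering of needle insertion through layered tissue is often modeled as a piecewise force-depth profile in which the force ramps up within each layer and drops at each breakthrough, the "pop" that surgeons feel. The sketch below uses hypothetical layer parameters, not the calibrated values elicited from the 22 surgeons.

```python
def needle_force(depth_mm, layers):
    """Piecewise force profile: within a layer the force grows with the
    layer's stiffness k; past a layer boundary (breakthrough) that layer
    no longer resists, so the force drops and the next layer takes over."""
    top = 0.0
    for thickness, k in layers:
        bottom = top + thickness
        if depth_mm < bottom:
            return k * (depth_mm - top)   # still deforming this layer
        top = bottom
    return 0.0                            # past the last layer: free space

# hypothetical abdominal-wall stack: (thickness mm, stiffness N/mm)
layers = [(10.0, 0.30), (15.0, 0.12), (5.0, 0.50)]
```

    A training simulator would tune the per-layer stiffnesses (and add friction terms) until the rendered pops match what experienced surgeons report feeling.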

  8. Learning Ultrasound-Guided Needle Insertion Skills through an Edutainment Game

    NASA Astrophysics Data System (ADS)

    Chan, Wing-Yin; Ni, Dong; Pang, Wai-Man; Qin, Jing; Chui, Yim-Pan; Yu, Simon Chun-Ho; Heng, Pheng-Ann

    Ultrasound-guided needle insertion is essential in many minimally invasive surgeries and procedures, such as biopsy, drug delivery and spinal anaesthesia. Accurate and safe needle insertion is a difficult task because of the hand-eye coordination skills it demands. Many proposed virtual reality (VR) based training systems put their emphasis on realistic simulation rather than pedagogical efficiency, and the lack of structured training scenarios leads to boredom from repetitive operations. To address this, we present a novel training system that integrates game elements in order to retain trainees' enthusiasm. Task-oriented scenarios, time-attack scenarios and performance evaluation are introduced. In addition, several state-of-the-art techniques are presented, including ultrasound simulation, needle haptic rendering and a mass-spring-based needle-tissue interaction simulation. These components are shown to be effective in keeping trainees engaged in learning.
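    The mass-spring-based needle-tissue interaction mentioned above can be reduced to a minimal sketch: a 1D chain of masses and springs, anchored at one end and displaced by the needle tip at the other, integrated with semi-implicit Euler for stability at haptic rates (our illustration; the parameters are arbitrary).

```python
def simulate_chain(n=5, k=40.0, c=1.2, m=0.05, dt=0.001, steps=4000, tip=0.01):
    """1D mass-spring chain: node 0 is anchored tissue, the last node is
    held at the needle-tip displacement. Semi-implicit (symplectic) Euler
    updates velocity first, then position, which keeps 1 kHz stepping stable."""
    x = [0.0] * n            # node displacements from rest (m)
    v = [0.0] * n
    x[-1] = tip              # boundary condition: static tip displacement
    for _ in range(steps):
        for i in range(1, n - 1):
            f = k * (x[i-1] - 2*x[i] + x[i+1]) - c * v[i]   # spring + damping
            v[i] += dt * f / m
        for i in range(1, n - 1):
            x[i] += dt * v[i]
    return x

x = simulate_chain()         # interior nodes settle to a linear profile
```

    At equilibrium the discrete spring forces balance, so the interior displacements interpolate linearly between the anchored end and the tip, which is a quick sanity check for any such chain model.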

  9. Haptic Foot Pedal: Influence of Shoe Type, Age, and Gender on Subjective Pulse Perception.

    PubMed

    Geitner, Claudia; Birrell, Stewart; Krehl, Claudia; Jennings, Paul

    2018-06-01

    This study investigates the influence of shoe type (sneakers and safety boots), age, and gender on the perception of haptic pulse feedback provided by a prototype accelerator pedal in a running stationary vehicle. Haptic feedback can be a less distracting alternative to traditional visual and auditory in-vehicle feedback. However, to be effective, the device delivering the haptic feedback needs to be in contact with the person. Factors such as shoe type vary naturally over the seasons and could render feedback that is clearly perceived in one situation unnoticeable in another. In this study, we evaluate factors that can influence the subjective perception of haptic feedback in a stationary but running car: shoe type, age, and gender. Thirty-six drivers in three age groups (≤39, 40-59, and ≥60) took part. For each haptic pulse, participants rated intensity, urgency, and comfort via a questionnaire. The perception of the haptic feedback is significantly influenced by the interaction between the pulse's duration and force amplitude and the participant's age and gender, but not by shoe type. The results indicate that it is important to consider different age groups and genders in the evaluation of haptic feedback. Future research might also look into approaches for adapting haptic feedback to the individual driver's preferences. Findings from this study can be applied to the design of an accelerator pedal in a car, for example for a nonvisual in-vehicle warning, but also to the planning of user studies with a haptic pedal in general.

  10. Real-time dual-band haptic music player for mobile devices.

    PubMed

    Hwang, Inwook; Lee, Hyeseon; Choi, Seungmoon

    2013-01-01

    We introduce a novel dual-band haptic music player for real-time simultaneous vibrotactile playback with music in mobile devices. Our haptic music player features a new miniature dual-mode actuator that can produce vibrations consisting of two principal frequencies and a real-time vibration generation algorithm that can extract vibration commands from a music file for dual-band playback (bass and treble). The algorithm uses a "haptic equalizer" and provides plausible sound-to-touch modality conversion based on human perceptual data. In addition, we present a user study carried out to evaluate the subjective performance (precision, harmony, fun, and preference) of the haptic music player, in comparison with the current practice of bass-band-only vibrotactile playback via a single-frequency voice-coil actuator. The evaluation results indicated that the new dual-band playback outperforms the bass-only rendering, also providing several insights for further improvements. The developed system and experimental findings have implications for improving the multimedia experience with mobile devices.
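    A minimal sketch of the dual-band idea (ours, not the paper's algorithm): split the audio with a one-pole low-pass filter, treat the residual as the treble band, and take per-frame amplitude envelopes as vibration commands for the two actuator modes.

```python
import math

def one_pole_lp(signal, fc, fs):
    """First-order low-pass; the residual (signal - low) acts as the high band."""
    a = math.exp(-2.0 * math.pi * fc / fs)
    y, out = 0.0, []
    for s in signal:
        y = (1.0 - a) * s + a * y
        out.append(y)
    return out

def band_envelopes(signal, fs, split_hz=300.0, frame=256):
    """Per-frame peak amplitudes of the bass and treble bands."""
    low = one_pole_lp(signal, split_hz, fs)
    high = [s - l for s, l in zip(signal, low)]
    env = lambda xs: [max(abs(v) for v in xs[i:i + frame])
                      for i in range(0, len(xs), frame)]
    return env(low), env(high)          # bass and treble vibration commands

fs = 8000
t = [i / fs for i in range(4096)]
bass_note   = [math.sin(2 * math.pi * 80 * x) for x in t]     # 80 Hz tone
treble_note = [math.sin(2 * math.pi * 2000 * x) for x in t]   # 2 kHz tone
b1, t1 = band_envelopes(bass_note, fs)
b2, t2 = band_envelopes(treble_note, fs)
```

    A real implementation would map these envelopes through perceptual intensity curves (the paper's "haptic equalizer") before driving the two resonance modes of the actuator.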

  11. Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.

    PubMed

    Kim, Yeongmi; Harders, Matthias; Gassert, Roger

    2015-01-01

    Delivering distance information of nearby obstacles from sensors embedded in a white cane-in addition to the intrinsic mechanical feedback from the cane-can aid the visually impaired in ambulating independently. Haptics is a common modality for conveying such information to cane users, typically in the form of vibrotactile signals. In this context, we investigated the effect of tactile rendering methods, tactile feedback configurations and directions of tactile flow on the identification of obstacle distance. Three tactile rendering methods with temporal variation only, spatio-temporal variation and spatial/temporal/intensity variation were investigated for two vibration feedback configurations. Results showed a significant interaction between tactile rendering method and feedback configuration. Spatio-temporal variation generally resulted in high correct identification rates for both feedback configurations. In the case of the four-finger vibration, tactile rendering with spatial/temporal/intensity variation also resulted in high distance identification rate. Further, participants expressed their preference for the four-finger vibration over the single-finger vibration in a survey. Both preferred rendering methods with spatio-temporal variation and spatial/temporal/intensity variation for the four-finger vibration could convey obstacle distance information with low workload. Overall, the presented findings provide valuable insights and guidance for the design of haptic displays for electronic travel aids for the visually impaired.

  12. Palpation simulator with stable haptic feedback.

    PubMed

    Kim, Sang-Youn; Ryu, Jee-Hwan; Lee, WooJeong

    2015-01-01

    The main difficulty in constructing palpation simulators is to compute and to generate stable and realistic haptic feedback without vibration. When a user haptically interacts with highly non-homogeneous soft tissues through a palpation simulator, a sudden change of stiffness in target tissues causes unstable interaction with the object. We propose a model consisting of a virtual adjustable damper and an energy measuring element. The energy measuring element gauges energy which is stored in a palpation simulator and the virtual adjustable damper dissipates the energy to achieve stable haptic interaction. To investigate the haptic behavior of the proposed method, impulse and continuous inputs are provided to target tissues. If a haptic interface point meets with the hardest portion in the target tissues modeled with a conventional method, we observe unstable motion and feedback force. However, when the target tissues are modeled with the proposed method, a palpation simulator provides stable interaction without vibration. The proposed method overcomes a problem in conventional haptic palpation simulators where unstable force or vibration can be generated if there is a big discrepancy in material property between an element and its neighboring elements in target tissues.
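    The energy-measuring element plus virtual adjustable damper described above closely resembles a time-domain passivity observer/controller, which can be sketched in a few lines (our simplified illustration, with made-up sample data).

```python
def passivity_control(forces, velocities, dt):
    """Observer: accumulate the energy delivered at the haptic port.
    Controller: whenever the stored energy would go negative (active,
    potentially unstable behaviour), engage a virtual damper that
    dissipates exactly the surplus."""
    E = 0.0
    out = []
    for f, v in zip(forces, velocities):
        E += f * v * dt                   # energy observer
        b = 0.0
        if E < 0.0 and abs(v) > 1e-9:     # adjustable damper
            b = -E / (v * v * dt)         # damping that dissipates -E
            E += b * v * v * dt           # observer sees the damper (E -> 0)
        out.append(f + b * v)
    return out

dt = 0.001
vel = [0.1] * 10                          # constant probe velocity (m/s)
frc = [1.0] * 3 + [-2.0] * 7              # interaction turns active mid-stream
safe = passivity_control(frc, vel, dt)
```

    Because the damper only activates when the observer detects generated energy, the felt force is unmodified during passive interaction, which is why this scheme avoids the over-damped feel of a fixed virtual damper.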

  13. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    PubMed

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.

  14. Multi-arm multilateral haptics-based immersive tele-robotic system (HITS) for improvised explosive device disposal

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lai, Gilbert; Haddadi, Amir

    2014-06-01

    This paper presents the latest advancements of the Haptics-based Immersive Tele-robotic System (HITS) project, a next generation Improvised Explosive Device (IED) disposal (IEDD) robotic interface containing an immersive telepresence environment for a remotely-controlled three-articulated-robotic-arm system. While the haptic feedback enhances the operator's perception of the remote environment, a third teleoperated dexterous arm, equipped with multiple vision sensors and cameras, provides stereo vision with proper visual cues, and a 3D photo-realistic model of the potential IED. This decentralized system combines various capabilities including stable and scaled motion, singularity avoidance, cross-coupled hybrid control, active collision detection and avoidance, compliance control and constrained motion to provide a safe and intuitive control environment for the operators. Experimental results and validation of the current system are presented through various essential IEDD tasks. This project demonstrates that a two-armed anthropomorphic Explosive Ordnance Disposal (EOD) robot interface can achieve complex neutralization techniques against realistic IEDs without the operator approaching at any time.

  15. A “virtually minimal” visuo-haptic training of attention in severe traumatic brain injury

    PubMed Central

    2013-01-01

    Background Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. Methods We designed a “virtually minimal” approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. Twenty-one inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. Results The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Conclusions Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing. PMID:23938101

  16. A "virtually minimal" visuo-haptic training of attention in severe traumatic brain injury.

    PubMed

    Dvorkin, Assaf Y; Ramaiya, Milan; Larson, Eric B; Zollman, Felise S; Hsu, Nancy; Pacini, Sonia; Shah, Amit; Patton, James L

    2013-08-09

    Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. We designed a "virtually minimal" approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. Twenty-one inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing.

  17. Heterogeneous Deformable Modeling of Bio-Tissues and Haptic Force Rendering for Bio-Object Modeling

    NASA Astrophysics Data System (ADS)

    Lin, Shiyong; Lee, Yuan-Shin; Narayan, Roger J.

    This paper presents a novel technique for modeling soft biological tissues as well as the development of an innovative interface for bio-manufacturing and medical applications. Heterogeneous deformable models may be used to represent the actual internal structures of deformable biological objects, which possess multiple components and nonuniform material properties. Both heterogeneous deformable object modeling and accurate haptic rendering can greatly enhance the realism and fidelity of virtual reality environments. In this paper, a tri-ray node snapping algorithm is proposed to generate a volumetric heterogeneous deformable model from a set of object interface surfaces between different materials. A constrained local static integration method is presented for simulating deformation and accurate force feedback based on the material properties of a heterogeneous structure. Biological soft tissue modeling is used as an example to demonstrate the proposed techniques. By integrating the heterogeneous deformable model into a virtual environment, users can both observe different materials inside a deformable object as well as interact with it by touching the deformable object using a haptic device. The presented techniques can be used for surgical simulation, bio-product design, bio-manufacturing, and medical applications.

  18. A Multi-Finger Interface with MR Actuators for Haptic Applications.

    PubMed

    Qin, Huanhuan; Song, Aiguo; Gao, Zhan; Liu, Yuqing; Jiang, Guohua

    2018-01-01

    Haptic devices with multi-finger input are highly desirable for providing realistic and natural feelings when interacting with a remote or virtual environment. Compared with conventional actuators, MR (magneto-rheological) actuators are preferable options in haptics because of their larger passive torque and torque-volume ratios. Most existing haptic MR actuators, however, are still bulky and heavy; if they were smaller and lighter, they would be more suitable for haptics. In this paper, a small-scale yet powerful MR actuator was designed to build a multi-finger interface for a 6-DOF haptic device. A compact structure was achieved by adopting a multi-disc configuration. Based on this configuration, the MR actuator can generate a maximum torque of 480 N·mm within dimensions of only 36 mm diameter and 18 mm height. Performance evaluation showed that it exhibits a relatively high dynamic range and good response characteristics compared with other haptic MR actuators. The multi-finger interface is equipped with three MR actuators and can provide up to 8 N of passive force to the thumb, index and middle fingers, respectively. An application example demonstrates the effectiveness and potential of this new MR-actuator-based interface.

  19. Haptic device for a ventricular shunt insertion simulator.

    PubMed

    Panchaphongsaphak, Bundit; Stutzer, Diego; Schwyter, Etienne; Bernays, René-Ludwig; Riener, Robert

    2006-01-01

    In this paper we propose a new one-degree-of-freedom haptic device that can be used to simulate ventricular shunt insertion procedures. The device is used together with the BRAINTRAIN training simulator developed for neuroscience education, neurological data visualization and surgical planning. The design of the haptic device is based on a push-pull cable concept. The rendered forces produced by a linear motor connected at one end of the cable are transferred to the user via a sliding mechanism at the end-effector located at the other end of the cable. The end-effector provides a range of movement of up to 12 cm. The force is controlled by an open-loop impedance algorithm and can reach up to 15 N.
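    An open-loop impedance algorithm of this kind renders force from position alone, with no force sensing in the loop. A minimal sketch follows (the function name and tissue parameters are ours, illustrative only), including a clip at the device's 15 N maximum.

```python
def virtual_tissue_force(position_cm, tissue_boundary_cm=4.0,
                         stiffness_n_per_cm=6.0, f_max=15.0):
    """Open-loop impedance law: force grows with penetration into a
    virtual tissue layer, saturated at the actuator's force limit."""
    penetration = position_cm - tissue_boundary_cm
    if penetration <= 0.0:
        return 0.0                               # free motion: no force
    return min(stiffness_n_per_cm * penetration, f_max)
```

    Being open-loop, the rendered stiffness depends entirely on the accuracy of the position measurement and the cable transmission, which is why the device's force calibration matters more than in closed-loop designs.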

  20. A novel shape-changing haptic table-top display

    NASA Astrophysics Data System (ADS)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are developed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit is proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit is capable of extracting texture data from a 2D picture for rendering a shape-changing surface and realizing 3D modeling. A preliminary evaluation demonstrated the feasibility of the proposed system.

  1. Realistic Real-Time Outdoor Rendering in Augmented Reality

    PubMed Central

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, as the sizeable number of publications in computer graphics attests. Realistic virtual objects in outdoor AR systems require sophisticated effects such as shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, but most of them are restricted to non-real-time rendering, so the problem remains, especially in outdoor rendering. This paper proposes a technique for achieving realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480
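    Generating sky colour with respect to the position of the sun for a given location, date and time requires the solar elevation. A common low-cost approximation (ours, not necessarily the paper's formula) combines the solar declination with the hour angle.

```python
import math

def solar_elevation_deg(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle in degrees.
    Uses the standard ~23.44 deg declination approximation and the hour
    angle (15 deg per hour from solar noon); accurate to about a degree,
    which is plenty for sky-colour and shadow-direction rendering."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    s = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(s))
```

    The elevation (and the matching azimuth) then drives both the sky-colour gradient and the shadow direction for the specified location, date and time.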

  2. Realistic real-time outdoor rendering in augmented reality.

    PubMed

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, as the sizeable number of publications in computer graphics attests. Realistic virtual objects in outdoor AR systems require sophisticated effects such as shadows, daylight and interactions between sky colours and virtual as well as real objects. A few realistic rendering techniques have been designed to overcome this obstacle, but most of them are restricted to non-real-time rendering, so the problem remains, especially in outdoor rendering. This paper proposes a technique for achieving realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows through their effects on virtual objects in the AR system is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems.

  3. Design of a 4-DOF MR haptic master for application to robot surgery: virtual environment work

    NASA Astrophysics Data System (ADS)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-09-01

    This paper presents the design and control performance of a novel type of 4-degrees-of-freedom (4-DOF) haptic master in cyberspace for a robot-assisted minimally invasive surgery (RMIS) application. By using a controllable magnetorheological (MR) fluid, the proposed haptic master can have a feedback function for a surgical robot. Due to the difficulty of utilizing real human organs in the experiment, a cyberspace that features the virtual object is constructed to evaluate the performance of the haptic master. In order to realize the cyberspace, a volumetric deformable object is represented by a shape-retaining chain-linked (S-chain) model, which is a fast volumetric model suitable for real-time applications. In the haptic architecture for an RMIS application, the desired torque and position induced from the virtual object of the cyberspace and the haptic master of real space are transferred to each other. In order to validate the superiority of the proposed master and volumetric model, a tracking control experiment is implemented with a nonhomogeneous volumetric cubic object to demonstrate that the proposed model can be utilized in a real-time haptic rendering architecture. A proportional-integral-derivative (PID) controller is then designed and empirically implemented to accomplish the desired torque trajectories. It has been verified from the experiment that the tracking control performance for torque trajectories from a virtual slave can be successfully achieved.
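    A discrete PID loop of the kind used here to track desired torque trajectories can be sketched against a generic first-order plant (an illustrative stand-in, not a model of the MR haptic master).

```python
def pid_track(setpoint, kp, ki, kd, dt, steps, tau=0.05, gain=1.0):
    """Discrete PID driving a first-order plant tau*y' + y = gain*u
    toward a constant setpoint; returns the output history."""
    y, integ, prev_err = 0.0, 0.0, setpoint
    history = []
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        prev_err = err
        y += dt * (gain * u - y) / tau           # plant update (explicit Euler)
        history.append(y)
    return history

# track a 1.0 (normalized torque) setpoint for 3 s at a 1 kHz control rate
y = pid_track(setpoint=1.0, kp=2.0, ki=10.0, kd=0.0, dt=0.001, steps=3000)
```

    The integral term removes the steady-state error that a proportional-only controller would leave against the plant's own restoring dynamics.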

  4. Modeling the forces of cutting with scissors.

    PubMed

    Mahvash, Mohsen; Voo, Liming M; Kim, Diana; Jeung, Kristin; Wainer, Joshua; Okamura, Allison M

    2008-03-01

    Modeling forces applied to scissors during cutting of biological materials is useful for surgical simulation. Previous approaches to haptic display of scissor cutting are based on recording and replaying measured data. This paper presents an analytical model based on the concepts of contact mechanics and fracture mechanics to calculate forces applied to scissors during cutting of a slab of material. The model considers the process of cutting as a sequence of deformation and fracture phases. During deformation phases, forces applied to the scissors are calculated from a torque-angle response model synthesized from measurement data multiplied by a ratio that depends on the position of the cutting crack edge and the curve of the blades. Using the principle of conservation of energy, the forces of fracture are related to the fracture toughness of the material and the geometry of the blades of the scissors. The forces applied to scissors generally include high-frequency fluctuations. We show that the analytical model accurately predicts the average applied force. The cutting model is computationally efficient, so it can be used for real-time computations such as haptic rendering. Experimental results from cutting samples of paper, plastic, cloth, and chicken skin confirm the model, and the model is rendered in a haptic virtual environment.
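    The energy-balance argument above relates fracture torque to fracture toughness and blade geometry: over a small blade closing dθ, the fracture work τ·dθ equals J_c·t·da, where J_c is the toughness, t the sample thickness, and da the crack advance. A sketch with a toy crack-position curve follows (ours, not the paper's measured geometry).

```python
import math

def fracture_torque(a_of_theta, theta, j_c, thickness, d_theta=1e-4):
    """Conservation of energy: tau * d(theta) = J_c * t * d(a), so the
    fracture torque is toughness x thickness x crack-advance rate,
    estimated here by a finite difference as the blades close."""
    da = a_of_theta(theta) - a_of_theta(theta + d_theta)   # advance on closing
    return j_c * thickness * da / d_theta

# toy crack-position curve: as the opening angle shrinks, the cut point
# moves out toward the blade tips (units: m vs rad)
a = lambda th: 0.08 * math.cos(th / 2.0)
tau = fracture_torque(a, theta=0.6, j_c=1200.0, thickness=0.0005)
```

    With measured blade-edge curves in place of the toy `a(θ)`, the same balance yields the fracture component of the torque-angle profile, to which the deformation-phase torque is added for haptic playback.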

  5. Haptic augmented skin surface generation toward telepalpation from a mobile skin image.

    PubMed

    Kim, K

    2018-05-01

    Little is known about how to integrate palpation into existing mobile tele-skin imaging, which delivers only low-quality tactile (roughness) information, and no study has yet been reported regarding telehaptic palpation using mobile phone images for teledermatology or teleconsultations in skin care. This study therefore introduces a new algorithm that accurately reconstructs a haptically augmented skin surface for telehaptic palpation using a low-cost clip-on microscope simply attached to a mobile phone. Multiple algorithms, including gradient-based image enhancement, roughness-adaptive tactile mask generation, roughness-enhanced 3D tactile map building, and visual and haptic rendering with a three-degrees-of-freedom (DOF) haptic device, were developed and integrated into one system. Evaluation experiments were conducted to test the performance of 3D roughness reconstruction with and without the tactile mask. The results confirm that the reconstructed haptic roughness with the tactile mask is superior to that without it. Additional experiments demonstrate that the proposed algorithm is robust against varying lighting conditions and blurring. Finally, a user study was conducted to assess the effect of adding the haptic modality to the existing visual-only interface; the results attest that haptic skin palpation significantly improves skin-exam performance. Mobile image-based telehaptic palpation technology was proposed, and an initial version was developed. The developed technology was tested with several skin images, and the experimental results showed the superiority of the proposed scheme in terms of haptic augmentation of real skin images. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. OzBot and haptics: remote surveillance to physical presence

    NASA Astrophysics Data System (ADS)

    Mullins, James; Fielding, Mick; Nahavandi, Saeid

    2009-05-01

This paper reports on robotic and haptic technologies and capabilities developed for the law enforcement and defence community within Australia by the Centre for Intelligent Systems Research (CISR). The OzBot series of small and medium surveillance robots have been designed in Australia and evaluated by law enforcement and defence personnel to determine suitability and ruggedness in a variety of environments. Using custom-developed digital electronics and featuring expandable data buses including RS485, I2C, RS232, video and Ethernet, the robots can be directly connected to many off-the-shelf payloads such as gas sensors, x-ray sources and camera systems including thermal and night vision. Differentiating the OzBot platform from its peers is its ability to be integrated directly with haptic technology, or the 'haptic bubble', developed by CISR. Haptic interfaces allow an operator to physically 'feel' remote environments through position-force control and experience realistic force feedback. By adding the capability to remotely grasp an object and feel its weight, texture and other physical properties in real-time from the remote ground control unit, an operator's situational awareness is greatly improved through haptic augmentation in an environment where remote-system feedback is often limited.

  7. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    PubMed Central

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680

  9. H-Man: a planar, H-shape cabled differential robotic manipulandum for experiments on human motor control.

    PubMed

    Campolo, Domenico; Tommasino, Paolo; Gamage, Kumudu; Klein, Julius; Hughes, Charmayne M L; Masia, Lorenzo

    2014-09-30

In recent decades, robotic manipulanda have increasingly been employed to investigate the effect of haptic environments on motor learning and rehabilitation. However, implementing complex haptic renderings can be challenging from technological and control perspectives. We propose a novel robot (H-Man) characterized by a mechanical design based on cabled differential transmission providing advantages over current robotic technology. The H-Man transmission translates to extremely simplified kinematics and homogeneous dynamic properties, offering the possibility to generate haptic channels by passively blocking the mechanics, and eliminating stability concerns. We report results of experiments characterizing the performance of the device (haptic bandwidth, Z-width, and perceived impedance). We also present the results of a study investigating the influence of haptic channel compliance on motor learning in healthy individuals, which highlights the effects of channel compliance in enhancing proprioceptive information. The generation of haptic channels to study motor redundancy is not easy for existing robots because of the need for powerful actuation and complex real-time control implementation. The mechanical design of H-Man affords the possibility to promptly create haptic channels by mechanical stoppers (on one of the motors) without compromising the superior backdriveability and high isotropic manipulability. This paper presents a novel robotic device for motor control studies and robotic rehabilitation. The hardware was designed with specific emphasis on the mechanics, resulting in a system that is easy to control, homogeneous, and intrinsically safe to use. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Data-Driven Haptic Modeling and Rendering of Viscoelastic and Frictional Responses of Deformable Objects.

    PubMed

    Yim, Sunghoon; Jeon, Seokhee; Choi, Seungmoon

    2016-01-01

    In this paper, we present an extended data-driven haptic rendering method capable of reproducing force responses during pushing and sliding interaction on a large surface area. The main part of the approach is a novel input variable set for the training of an interpolation model, which incorporates the position of a proxy - an imaginary contact point on the undeformed surface. This allows us to estimate friction in both sliding and sticking states in a unified framework. Estimating the proxy position is done in real-time based on simulation using a sliding yield surface - a surface defining a border between the sliding and sticking regions in the external force space. During modeling, the sliding yield surface is first identified via an automated palpation procedure. Then, through manual palpation on a target surface, input data and resultant force data are acquired. The data are used to build a radial basis interpolation model. During rendering, this input-output mapping interpolation model is used to estimate force responses in real-time in accordance with the interaction input. Physical performance evaluation demonstrates that our approach achieves reasonably high estimation accuracy. A user study also shows plausible perceptual realism under diverse and extensive exploration.
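The radial basis interpolation at the core of this record can be sketched with a Gaussian kernel. This is a toy version with 2-D inputs and a scalar force output; the paper's input set additionally includes the proxy position, and the model is trained on recorded palpation data rather than the made-up samples below:

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    # Gaussian RBF interpolation: solve Phi @ w = y at the training inputs.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return np.linalg.solve(np.exp(-(eps * d) ** 2), y)

def rbf_eval(X, w, Xq, eps=1.0):
    # Evaluate the interpolant at query inputs Xq.
    d = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ w

# Toy data: scalar "force" samples over four 2-D interaction inputs.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 2.0])
w = rbf_fit(X, y)
```

By construction the interpolant reproduces the training forces exactly, and smoothly blends between them at intermediate interaction inputs.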

  11. Perception and Haptic Rendering of Friction Moments.

    PubMed

    Kawasaki, H; Ohtuka, Y; Koide, S; Mouri, T

    2011-01-01

    This paper considers moments due to friction forces on the human fingertip. A computational technique called the friction moment arc method is presented. The method computes the static and/or dynamic friction moment independent of a friction force calculation. In addition, a new finger holder to display friction moment is presented. This device incorporates a small brushless motor and disk, and connects the human's finger to an interface finger of the five-fingered haptic interface robot HIRO II. Subjects' perception of friction moment while wearing the finger holder, as well as perceptions during object manipulation in a virtual reality environment, were evaluated experimentally.

  12. Roughness Perception of Haptically Displayed Fractal Surfaces

    NASA Technical Reports Server (NTRS)

    Costa, Michael A.; Cutkosky, Mark R.; Lau, Sonie (Technical Monitor)

    2000-01-01

Surface profiles were generated by a fractal algorithm and haptically rendered on a force feedback joystick. Subjects were asked to use the joystick to explore pairs of surfaces and report to the experimenter which of the surfaces they felt was rougher. Surfaces were characterized by their root mean square (RMS) amplitude and their fractal dimension. The most important factor affecting the perceived roughness of the fractal surfaces was the RMS amplitude of the surface. When comparing surfaces of fractal dimension 1.2-1.35, it was found that the fractal dimension was negatively correlated with perceived roughness.
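A 1-D profile parameterized by exactly these two quantities, RMS amplitude and fractal dimension, can be synthesized spectrally. This is a sketch under stated assumptions: the study does not specify its generator, and the `H = 2 - D` relation assumes an fBm-like profile:

```python
import numpy as np

def fractal_profile(n, fractal_dim, rms, seed=0):
    # Spectral synthesis: for an fBm-like profile, D = 2 - H and the power
    # spectrum falls off as f^-(2H+1), i.e. amplitude ~ f^-(H + 0.5).
    rng = np.random.default_rng(seed)
    H = 2.0 - fractal_dim
    f = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(f)
    amp[1:] = f[1:] ** -(H + 0.5)            # leave DC at zero
    phase = rng.uniform(0.0, 2.0 * np.pi, f.size)
    z = np.fft.irfft(amp * np.exp(1j * phase), n)
    return z * rms / np.sqrt(np.mean(z ** 2))  # scale to the target RMS amplitude

profile = fractal_profile(1024, fractal_dim=1.3, rms=0.5)
```

Raising `fractal_dim` shifts energy toward high spatial frequencies at fixed RMS, which matches the study's finding that the two parameters affect perceived roughness independently.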

  13. A prototype percutaneous transhepatic cholangiography training simulator with real-time breathing motion.

    PubMed

    Villard, P F; Vidal, F P; Hunt, C; Bello, F; John, N W; Johnson, S; Gould, D A

    2009-11-01

We present here a simulator for interventional radiology focusing on percutaneous transhepatic cholangiography (PTC). This procedure consists of inserting a needle into the biliary tree using fluoroscopy for guidance. The requirements of the simulator have been driven by a task analysis. Three main components have been identified: the respiration, the real-time X-ray display (fluoroscopy) and the haptic rendering (sense of touch). The framework for modelling the respiratory motion is based on kinematics laws and on the Chainmail algorithm. The fluoroscopic simulation is performed on the graphics card and makes use of the Beer-Lambert law to compute the X-ray attenuation. Finally, the haptic rendering is integrated into the virtual environment and accounts for soft-tissue reaction force feedback and maintenance of the initial direction of the needle during insertion. Five training scenarios have been created using patient-specific data. Each of these provides the user with variable breathing behaviour, fluoroscopic display tuneable to any device parameters and needle force feedback. A detailed task analysis has been used to design and build the PTC simulator described in this paper. The simulator includes real-time respiratory motion with two independent parameters (rib kinematics and diaphragm action), on-line fluoroscopy implemented on the Graphics Processing Unit and haptic feedback to feel the soft-tissue behaviour of the organs during the needle insertion.
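The Beer-Lambert attenuation used for the fluoroscopy can be illustrated per ray. This is a minimal CPU sketch of the law itself, not the GPU implementation, and the attenuation coefficients below are rough illustrative values rather than the simulator's:

```python
import math

def xray_intensity(i0, mu_per_cm, path_cm):
    # Beer-Lambert law: I = I0 * exp(-sum(mu_i * d_i)) along one ray,
    # where mu_i is the attenuation coefficient of segment i of length d_i.
    return i0 * math.exp(-sum(m * d for m, d in zip(mu_per_cm, path_cm)))

# A ray crossing ~2 cm of soft tissue (mu ~0.2/cm) and ~1 cm of bone (mu ~0.5/cm).
intensity = xray_intensity(1.0, [0.2, 0.5], [2.0, 1.0])
```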

  14. Virtual reality haptic dissection.

    PubMed

    Erolin, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-12-01

This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist, and investigate cross-discipline collaborations in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills, before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively and qualitatively.

  15. Human eye haptics-based multimedia.

    PubMed

    Velandia, David; Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron

    2014-01-01

Immersive and interactive multimedia applications offer complementary study tools in anatomy, as users can explore 3D models while obtaining information about the organ, tissue or part being explored. Haptics increases the sense of interaction with virtual objects, improving user experience in a more realistic manner. Common tools for studying the eye are books, illustrations and assembly models; more recently these are being complemented with mobile apps whose 3D capabilities, computing power and user base are increasing. The goal of this project is to develop a complementary eye anatomy and pathology study tool using deformable models within a multimedia application, offering students the opportunity to explore the eye from up close and within, with relevant information. Validation of the tool provided feedback on the potential of the development, along with suggestions on improving haptic feedback and navigation.

  16. Effects of kinesthetic haptic feedback on standing stability of young healthy subjects and stroke patients.

    PubMed

    Afzal, Muhammad Raheel; Byun, Ha-Young; Oh, Min-Kyun; Yoon, Jungwon

    2015-03-13

Haptic control is a useful therapeutic option in rehabilitation featuring virtual reality interaction. As with visual and vibrotactile biofeedback, kinesthetic haptic feedback may assist in postural control, and can achieve balance control. Kinesthetic haptic feedback in terms of body sway can be delivered via a commercially available haptic device and can enhance the balance stability of both young healthy subjects and stroke patients. Our system features a waist-attached smartphone, software running on a computer (PC), and a dedicated Phantom Omni® device. Young healthy participants performed balance tasks after assuming each of four distinct postures for 30 s (one foot on the ground; the Tandem Romberg stance; one foot on foam; and the Tandem Romberg stance on foam) with eyes closed. Patients kept their eyes open and assumed only the Romberg stance during a balance task 25 s in duration. An Android application running continuously on the smartphone sent mediolateral (ML) and anteroposterior (AP) tilt angles to a PC, which generated kinesthetic haptic feedback via the Phantom Omni®. A total of 16 subjects participated in the study, 8 of whom were young and healthy and 8 of whom had suffered stroke. Post-experiment data analysis was performed using MATLAB®. Mean Velocity Displacement (MVD), Planar Deviation (PD), Mediolateral Trajectory (MLT) and Anteroposterior Trajectory (APT) parameters were analyzed to measure reduction in body sway. Our kinesthetic haptic feedback system effectively reduced postural sway in young healthy subjects regardless of posture and substrate condition, and improved MVD and PD in stroke patients who assumed the Romberg stance. Analysis of Variance (ANOVA) revealed that kinesthetic haptic feedback significantly reduced body sway in both categories of subjects. Kinesthetic haptic feedback can be implemented using a commercial haptic device and a smartphone. Intuitive balance cues were created using the handle of a haptic device, rendering the approach very simple yet efficient in practice. This novel form of biofeedback will be a useful rehabilitation tool for improving the balance of stroke patients.
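The mapping from measured tilt to a corrective kinesthetic cue could look like the following. This is an assumed proportional law with saturation; the study does not publish its exact gains or force limits, so `gain` and `f_max` are illustrative:

```python
def feedback_force(ml_deg, ap_deg, gain=0.05, f_max=1.5):
    """Render a force (N) opposing body sway, proportional to the smartphone's
    mediolateral (ML) and anteroposterior (AP) tilt angles and clamped to an
    assumed device force limit."""
    clamp = lambda v: max(-f_max, min(f_max, v))
    return clamp(-gain * ml_deg), clamp(-gain * ap_deg)

fx, fy = feedback_force(10.0, -40.0)   # lean 10 deg right, 40 deg backward
```

The clamp matters in practice: a desktop haptic device like the Omni saturates at a few newtons, so large sway angles must not command forces beyond the hardware limit.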

  17. A perspective on the role and utility of haptic feedback in laparoscopic skills training.

    PubMed

    Singapogu, Ravikiran; Burg, Timothy; Burg, Karen J L; Smith, Dane E; Eckenrode, Amanda H

    2014-01-01

    Laparoscopic surgery is a minimally invasive surgical technique with significant potential benefits to the patient, including shorter recovery time, less scarring, and decreased costs. There is a growing need to teach surgical trainees this emerging surgical technique. Simulators, ranging from simple "box" trainers to complex virtual reality (VR) trainers, have emerged as the most promising method for teaching basic laparoscopic surgical skills. Current box trainers require oversight from an expert surgeon for both training and assessing skills. VR trainers decrease the dependence on expert teachers during training by providing objective, real-time feedback and automatic skills evaluation. However, current VR trainers generally have limited credibility as a means to prepare new surgeons and have often fallen short of educators' expectations. Several researchers have speculated that the missing component in modern VR trainers is haptic feedback, which refers to the range of touch sensations encountered during surgery. These force types and ranges need to be adequately rendered by simulators for a more complete training experience. This article presents a perspective of the role and utility of haptic feedback during laparoscopic surgery and laparoscopic skills training by detailing the ranges and types of haptic sensations felt by the operating surgeon, along with quantitative studies of how this feedback is used. Further, a number of research studies that have documented human performance effects as a result of the presence of haptic feedback are critically reviewed. Finally, key research directions in using haptic feedback for laparoscopy training simulators are identified.

  18. Input and output for surgical simulation: devices to measure tissue properties in vivo and a haptic interface for laparoscopy simulators.

    PubMed

    Ottensmeyer, M P; Ben-Ur, E; Salisbury, J K

    2000-01-01

    Current efforts in surgical simulation very often focus on creating realistic graphical feedback, but neglect some or all tactile and force (haptic) feedback that a surgeon would normally receive. Simulations that do include haptic feedback do not typically use real tissue compliance properties, favoring estimates and user feedback to determine realism. When tissue compliance data are used, there are virtually no in vivo property measurements to draw upon. Together with the Center for Innovative Minimally Invasive Therapy at the Massachusetts General Hospital, the Haptics Group is developing tools to introduce more comprehensive haptic feedback in laparoscopy simulators and to provide biological tissue material property data for our software simulation. The platform for providing haptic feedback is a PHANToM Haptic Interface, produced by SensAble Technologies, Inc. Our devices supplement the PHANToM to provide for grasping and optionally, for the roll axis of the tool. Together with feedback from the PHANToM, which provides the pitch, yaw and thrust axes of a typical laparoscopy tool, we can recreate all of the haptic sensations experienced during laparoscopy. The devices integrate real laparoscopy toolhandles and a compliant torso model to complete the set of visual and tactile sensations. Biological tissues are known to exhibit non-linear mechanical properties, and change their properties dramatically when removed from a living organism. To measure the properties in vivo, two devices are being developed. The first is a small displacement, 1-D indenter. It will measure the linear tissue compliance (stiffness and damping) over a wide range of frequencies. These data will be used as inputs to a finite element or other model. The second device will be able to deflect tissues in 3-D over a larger range, so that the non-linearities due to changes in the tissue geometry will be measured. 
This will allow us to validate the performance of the model on large tissue deformations. Both devices are designed to pass through standard 12 mm laparoscopy trocars, and will be suitable for use during open or minimally invasive procedures. We plan to acquire data from pigs used by surgeons for training purposes, but conceivably, the tools could be refined for use on humans undergoing surgery. Our work will provide the necessary data input for surgical simulations to accurately model the force interactions that a surgeon would have with tissue, and will provide the force output to create a truly realistic simulation of minimally invasive surgery.

  19. Real-time simulation of the nonlinear visco-elastic deformations of soft tissues.

    PubMed

    Basafa, Ehsan; Farahmand, Farzam

    2011-05-01

Mass-spring-damper (MSD) models are often used for real-time surgery simulation due to their fast response and fairly realistic deformation replication. An improved real-time simulation model of soft tissue deformation due to a laparoscopic surgical indenter was developed and tested. The mechanical realization of conventional MSD models was improved using nonlinear springs and nodal dampers, while their high computational efficiency was maintained using an adapted implicit integration algorithm. New practical algorithms for model parameter tuning, collision detection, and simulation were incorporated. The model was able to replicate complex biological soft tissue mechanical properties under large deformations, i.e., the nonlinear and viscoelastic behaviors. The simulated response of the model, after tuning its parameters to the experimental data of a deer liver sample, closely tracked the reference data with high correlation and maximum relative differences of less than 5 and 10% for the tuning and testing data sets, respectively. Finally, implementation of the proposed model and algorithms in a graphical environment resulted in a real-time simulation with update rates of 150 Hz for interactive deformation and haptic manipulation, and 30 Hz for visual rendering. The proposed real-time simulation model of soft tissue deformation due to a laparoscopic surgical indenter was efficient, realistic, and accurate in ex vivo testing. This model is a suitable candidate for testing in vivo during laparoscopic surgery.
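The nonlinear spring with a nodal damper can be sketched for a single node with a cheap semi-implicit step. This is a toy 1-D version, not the paper's adapted implicit integrator; the cubic stiffening term and all parameter values are assumptions for illustration:

```python
def msd_step(x, v, k1, k2, c, m, dt):
    """One semi-implicit Euler step for a nonlinear spring-damper node:
    f = -k1*x - k2*x**3 - c*v  (cubic stiffening as a stand-in for the
    tissue's nonlinear spring law). Updating v before x keeps the step stable
    at stiffness levels where explicit Euler would diverge."""
    f = -k1 * x - k2 * x ** 3 - c * v
    v = v + dt * f / m
    x = x + dt * v
    return x, v

# Release a node displaced by 1 cm and let it settle over 1 s at a 1 kHz rate.
x, v = 0.01, 0.0
for _ in range(1000):
    x, v = msd_step(x, v, k1=50.0, k2=1e4, c=0.5, m=0.01, dt=1e-3)
```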

  20. ProMIS augmented reality training of laparoscopic procedures face validity.

    PubMed

    Botden, Sanne M B I; Buzink, Sonja N; Schijven, Marlies P; Jakimowicz, Jack J

    2008-01-01

Conventional video trainers lack the ability to assess the trainee objectively, but offer modalities that are often missing in virtual reality simulation, such as realistic haptic feedback. The ProMIS augmented reality laparoscopic simulator retains the benefit of a traditional box trainer, by using original laparoscopic instruments and tactile tasks, but additionally generates objective measures of performance. Fifty-five participants performed a "basic skills" and "suturing and knot-tying" task on ProMIS, after which they filled out a questionnaire regarding realism, haptics, and didactic value of the simulator, on a 5-point Likert scale. The participants were allocated to two experience groups: "experienced" (>50 procedures and >5 sutures; N = 27), and "moderately experienced" (<50 procedures and <5 sutures; N = 28). General consensus among all participants, particularly the experienced, was that ProMIS is a useful tool for training (mean: 4.67, SD: 0.48). It was considered very realistic (mean: 4.44, SD: 0.66), with good haptics (mean: 4.10, SD: 0.97) and didactic value (mean: 4.10, SD: 0.65). This study established the face validity of the ProMIS augmented reality simulator for "basic skills" and "suturing and knot-tying" tasks. ProMIS was considered a good tool for training in laparoscopic skills for surgical residents and surgeons.

  1. Experimental evaluation of a miniature MR device for a wide range of human perceivable haptic sensations

    NASA Astrophysics Data System (ADS)

    Yang, Tae-Heon; Koo, Jeong-Hoi

    2017-12-01

Humans experience realistic and vivid haptic sensations through the sense of touch. For a fully immersive haptic experience, both kinaesthetic and vibrotactile information must be presented to human users. To date, little research has been performed on small haptic actuators that can convey both vibrotactile feedback, at vibration frequencies up to the human-perceivable limit, and multiple levels of kinaesthetic feedback rapidly. This study therefore designs a miniature haptic device based on MR fluid and experimentally evaluates its ability to convey vibrotactile feedback up to 300 Hz along with kinaesthetic feedback. After constructing a prototype device, a series of tests was performed to evaluate its performance using an experimental setup consisting of a precision dynamic mechanical analyzer and an accelerometer. The kinaesthetic testing results show that the prototype device can provide a force rate of up to 89% at 5 V (360 mA), which can be discretized into multiple levels of ‘just noticeable difference’ force rate, indicating that the device can convey a wide range of kinaesthetic sensations. To evaluate the high-frequency vibrotactile performance of the device, its acceleration responses were measured and processed using FFT analysis. The results indicate that the device can convey high-frequency vibrotactile sensations up to 300 Hz with acceleration intensities sufficiently large for humans to feel.
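The FFT check on the accelerometer data can be reproduced in outline. The sketch below uses a synthetic 300 Hz signal and an assumed sampling rate; the study's measured traces and instrumentation details are not reproduced here:

```python
import numpy as np

def dominant_frequency(accel, fs):
    # Peak of the FFT magnitude spectrum, with the mean (DC) removed first
    # so a sensor offset cannot win the argmax.
    spec = np.abs(np.fft.rfft(accel - np.mean(accel)))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[np.argmax(spec)]

fs = 2000.0                              # assumed sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
accel = np.sin(2.0 * np.pi * 300.0 * t)  # synthetic 300 Hz vibrotactile tone
peak = dominant_frequency(accel, fs)
```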

  2. Virtual reality haptic human dissection.

    PubMed

    Needham, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-01-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist and investigate the cross-discipline collaborations required in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  3. Introduction to haptics for neurosurgeons.

    PubMed

    L'Orsa, Rachael; Macnab, Chris J B; Tavakoli, Mahdi

    2013-01-01

Robots are becoming increasingly relevant to neurosurgeons, extending a neurosurgeon's physical capabilities, improving navigation within the surgical landscape when combined with advanced imaging, and propelling the movement toward minimally invasive surgery. Most surgical robots, however, isolate surgeons from the full range of human senses during a procedure. This forces surgeons to rely on vision alone for guidance through the surgical corridor, which limits the capabilities of the system, requires significant operator training, and increases the surgeon's workload. Incorporating haptics into these systems, i.e., enabling the surgeon to "feel" forces experienced by the tool tip of the robot, could render these limitations obsolete by making the robot feel more like an extension of the surgeon's own body. Although the use of haptics in neurosurgical robots is still mostly the domain of research, neurosurgeons who keep abreast of this emerging field will be more prepared to take advantage of it as it becomes more prevalent in operating theaters. Thus, this article serves as an introduction to the field of haptics for neurosurgeons. We not only outline the current and future benefits of haptics but also introduce concepts in the fields of robotic technology and computer control. This knowledge will allow readers to be better aware of limitations in the technology that can affect performance and surgical outcomes, and "knowing the right questions to ask" will be invaluable for surgeons who have purchasing power within their departments.

  4. Random forest classification of large volume structures for visuo-haptic rendering in CT images

    NASA Astrophysics Data System (ADS)

    Mastmeyer, Andre; Fortmeier, Dirk; Handels, Heinz

    2016-03-01

For patient-specific voxel-based visuo-haptic rendering of CT scans of the liver area, the fully automatic segmentation of large volume structures such as skin, soft tissue, lungs and intestine (risk structures) is important. Using a machine learning based approach, several existing segmentations from 10 segmented gold-standard patients are learned by random decision forests individually and collectively. The core of this paper is feature selection and the application of the learned classifiers to a new patient data set. In a leave-some-out cross-validation, the obtained full volume segmentations are compared to the gold-standard segmentations of the untrained patients. The proposed classifiers use a multi-dimensional feature space to estimate the hidden truth, instead of relying on clinical standard threshold and connectivity based methods. The results of our efficient whole-body section classification are multi-label maps of the considered tissues. For visuo-haptic simulation, other small volume structures would have to be segmented additionally. We also take a look at these structures (liver vessels). In an experimental leave-some-out study of 10 patients, the proposed method performs much more efficiently than state-of-the-art methods. In two variants of leave-some-out experiments we obtain best mean DICE ratios of 0.79, 0.97, 0.63 and 0.83 for skin, soft tissue, hard bone and risk structures. Liver structures are segmented with DICE 0.93 for the liver, 0.43 for blood vessels and 0.39 for bile vessels.
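The DICE ratio used for evaluation is straightforward to compute from two binary masks; the toy masks below are purely illustrative:

```python
import numpy as np

def dice(a, b):
    # DICE overlap of two binary label masks: 2|A ∩ B| / (|A| + |B|).
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy masks agreeing on 2 of their 3 + 3 labeled voxels -> DICE = 2/3.
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 0, 0], [0, 1, 1]])
score = dice(a, b)
```

For the multi-label maps in this record, the same measure is applied one tissue label at a time against the gold standard.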

  5. GPU-based real-time soft tissue deformation with cutting and haptic feedback.

    PubMed

    Courtecuisse, Hadrien; Jung, Hoeryong; Allard, Jérémie; Duriez, Christian; Lee, Doo Yong; Cotin, Stéphane

    2010-12-01

    This article describes a series of contributions in the field of real-time simulation of soft tissue biomechanics. These contributions address various requirements for interactive simulation of complex surgical procedures. In particular, this article presents results in the areas of soft tissue deformation, contact modelling, simulation of cutting, and haptic rendering, which are all relevant to a variety of medical interventions. The contributions described in this article share a common underlying model of deformation and rely on GPU implementations to significantly improve computation times. This consistency in the modelling technique and computational approach ensures coherent results as well as efficient, robust and flexible solutions. Copyright © 2010 Elsevier Ltd. All rights reserved.

  6. Development and validation of an artificial wetlab training system for the lumbar discectomy.

    PubMed

    Adermann, Jens; Geissler, Norman; Bernal, Luis E; Kotzsch, Susanne; Korb, Werner

    2014-09-01

Initial research indicated that realistic haptic simulators with an adapted training concept are needed to enhance training for spinal surgery. A cognitive task analysis (CTA) was performed to define a realistic and helpful scenario-based simulation. Based on the results, a simulator for lumbar discectomy was developed. Additionally, a realistic training operating room was built for a pilot workshop. The results were validated. The CTA showed a need for realistic scenario-based training in spine surgery. The developed simulator consists of synthetic bone structures, synthetic soft tissue and an advanced bleeding system. Owing to close interdisciplinary cooperation between surgeons, engineers and psychologists, the iterative multicentre validation showed that the simulator is visually and haptically realistic. The simulator offers integrated sensors for the evaluation of the traction used and the compression applied during surgery. The participating surgeons in the pilot workshop rated the simulator and the training concept as very useful for the improvement of their surgical skills. In the context of the present work, a precise definition of the simulator and training concept was developed. The additional implementation of sensors allows objective evaluation of the surgical training by the trainer. Compared to other training simulators and concepts, the high degree of objectivity strengthens the acceptance of the feedback. The measured data of the nerve root tension and the compression of the dura can be used for intraoperative control and a detailed postoperative evaluation.

  7. Using postural synergies to animate a low-dimensional hand avatar in haptic simulation.

    PubMed

    Mulatto, Sara; Formaglio, Alessandro; Malvezzi, Monica; Prattichizzo, Domenico

    2013-01-01

    A technique to animate a realistic hand avatar with 20 DoFs, based on the biomechanics of the human hand, is presented. The animation does not use any sensor glove or advanced tracker with markers. The proposed approach is based on a set of kinematic constraints on the hand model, referred to as postural synergies, which allow the hand posture to be represented with fewer variables than the number of joints in the hand model. This low-dimensional set of parameters is estimated from direct measurement of the motion of the thumb and index finger, tracked using two haptic devices. A kinematic inversion algorithm has been developed that takes synergies into account and estimates the kinematic configuration of the whole hand, i.e., including the fingers whose fingertips are not directly tracked by the two haptic devices. The hand skin is deformable, and its deformation is computed using a linear vertex blending technique. The proposed synergy-based animation of the hand avatar involves only algebraic computations and is suitable for the real-time implementation required in haptics.
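
    The synergy-based inversion described above amounts to a small linear least-squares problem. A sketch in Python, with a random synergy basis standing in for one learned from grasp data, and all dimensions and values hypothetical:

```python
import numpy as np

# Linear postural-synergy model (hypothetical numbers): the full joint
# configuration q (20 DoFs) is approximated as q = q_mean + S @ z, where z
# holds far fewer synergy coefficients than there are joints.
rng = np.random.default_rng(0)
n_joints, n_synergies = 20, 3
S = rng.standard_normal((n_joints, n_synergies))  # synergy basis (learned from data in practice)
q_mean = np.zeros(n_joints)

# Suppose only 6 joint angles (thumb + index) are observable via the two
# haptic devices; generate a posture that lies in the synergy subspace.
observed = [0, 1, 2, 3, 4, 5]
z_true = np.array([0.5, -0.2, 0.1])
q_true = q_mean + S @ z_true
y = q_true[observed]

# Kinematic inversion as least squares: estimate z from the observed joints,
# then reconstruct the untracked fingers through the synergy basis.
z_hat, *_ = np.linalg.lstsq(S[observed], y - q_mean[observed], rcond=None)
q_hat = q_mean + S @ z_hat
print(np.max(np.abs(q_hat - q_true)))  # ~0: posture recovered from 6 of 20 joints
```

    Because the inversion is purely algebraic (one small least-squares solve per frame), it is compatible with the real-time rates haptics requires.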

  8. A pervasive visual-haptic framework for virtual delivery training.

    PubMed

    Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V

    2010-03-01

    Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by two hand-based haptic devices able to reproduce force feedback on the fingers and arms, enabling much more realistic manipulation than stylus-based solutions. The interactive simulation is not driven solely by an approximated model of complex forces and physical constraints; instead, the whole labor and the assistance/intervention procedures are formally modelled by a timed automata network applied to a parametric 3-D model of the anatomy, able to mimic a wide range of configurations. This novel methodology represents not only the sequence of the main events in either a spontaneous or an operative childbirth, but also helps validate the manual intervention, as the actions performed by the user during the simulation are evaluated against established medical guidelines. A discussion of the first results, as well as of the challenges still unaddressed, is included.

  9. Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.

    PubMed

    Li, Min; Sareh, Sina; Xu, Guanghua; Ridzuan, Maisarah Binti; Luo, Shan; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar

    2016-01-01

    This paper proposes a pseudo-haptic feedback method that conveys the stiffness of simulated soft surfaces through a visual interface. The method combines two feedback techniques, visual feedback of soft surface deformation and control of the indenter avatar speed, to convey stiffness information about a simulated soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules embedded in the simulated soft bodies. To further improve the interactive experience, the approach was extended to create a multi-point pseudo-haptic feedback system. Its performance was compared to that of a tablet computer with vibration feedback in hard nodule detection experiments, using (a) nodule detection sensitivity and (b) elapsed time as performance indicators. The multi-point pseudo-haptic interaction proved more time-efficient than the single-point interaction, and performed comparably to the vibration-based feedback method on both measures. This indicates that the proposed method can convey detailed haptic information, even for subtle tasks in virtual environments, using either a computer mouse or a pressure-sensitive device as input. The method thus enables low-cost simulation of objects with soft surfaces and hard inclusions, as occur, for example, in ever more realistic video games with increasing emphasis on interaction with the physical environment, and in minimally invasive surgery on soft tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, on either desktop computers or portable devices, conveying stiffness perception to the user with reasonably high fidelity.
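
    A minimal sketch of the second technique, control of the indenter avatar speed; the mapping below is illustrative, not the authors' actual formula:

```python
# Illustrative pseudo-haptic stiffness rendering (not the paper's exact
# mapping): scale the indenter avatar's on-screen displacement by a
# control/display ratio that shrinks as the simulated surface gets stiffer.
def avatar_displacement(input_disp_mm: float, stiffness_n_per_mm: float,
                        k_ref: float = 1.0) -> float:
    """Map the physical input displacement to the visual avatar displacement.

    A soft surface (stiffness << k_ref) lets the avatar sink in almost as far
    as the hand moves; a stiff one visually resists, suggesting hardness.
    """
    ratio = k_ref / (k_ref + stiffness_n_per_mm)
    return input_disp_mm * ratio

soft = avatar_displacement(10.0, 0.1)   # soft tissue: avatar sinks deeply
hard = avatar_displacement(10.0, 10.0)  # hard nodule: avatar barely indents
print(round(soft, 2), round(hard, 2))
```

    A hard nodule under the soft surface would locally raise the stiffness, visibly slowing the avatar and signalling the inclusion without any force actuator.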

  10. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics.

    PubMed

    Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L

    2017-02-01

    To assess the construct and face validity of ArthroS, a passive haptic VR simulator. A secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Among its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.
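
    Construct-validity checks like the one above ask whether a metric separates experience groups. The study's actual statistics are not reproduced here; the sketch below uses synthetic data and a hand-rolled Kruskal-Wallis H test, which is just one reasonable choice of test:

```python
import numpy as np

# Generic construct-validity sketch: does a simulator metric (here, synthetic
# task times) separate novice / intermediate / expert groups?
rng = np.random.default_rng(1)
groups = {
    "novice":       rng.normal(300.0, 40.0, 10),   # task time in s, synthetic
    "intermediate": rng.normal(220.0, 40.0, 10),
    "expert":       rng.normal(150.0, 40.0, 10),
}

pooled = np.concatenate(list(groups.values()))
ranks = np.empty_like(pooled)
ranks[np.argsort(pooled)] = np.arange(1, pooled.size + 1)  # ranks (no ties)

# Kruskal-Wallis H: 12/(N(N+1)) * sum_i n_i * (mean rank_i - (N+1)/2)^2
N, offset, H = pooled.size, 0, 0.0
for g in groups.values():
    r_mean = ranks[offset:offset + g.size].mean()
    H += g.size * (r_mean - (N + 1) / 2.0) ** 2
    offset += g.size
H *= 12.0 / (N * (N + 1))

# Chi-squared critical value for df=2 at alpha=0.05 is 5.99: H above it means
# the metric distinguishes the experience levels.
print(f"H = {H:.2f} (metric separates groups: {H > 5.99})")
```

    A metric that fails such a test, like two-thirds of the simulator's novel metrics, provides no evidence of construct validity.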

  11. Co-located haptic and 3D graphic interface for medical simulations.

    PubMed

    Berkelman, Peter; Miyasaka, Muneaki; Bozlee, Sebastian

    2013-01-01

    We describe a system which provides high-fidelity haptic feedback in the same physical location as a 3D graphical display, in order to enable realistic physical interaction with virtual anatomical tissue during modelled procedures such as needle driving, palpation, and other interventions performed using handheld instruments. The haptic feedback is produced by the interaction between an array of coils located behind a thin flat LCD screen, and permanent magnets embedded in the instrument held by the user. The coil and magnet configuration permits arbitrary forces and torques to be generated on the instrument in real time according to the dynamics of the simulated tissue by activating the coils in combination. A rigid-body motion tracker provides position and orientation feedback of the handheld instrument to the computer simulation, and the 3D display is produced using LCD shutter glasses and a head-tracking system for the user.

  12. Radiofrequency ablation of hepatic tumors: simulation, planning, and contribution of virtual reality and haptics.

    PubMed

    Villard, Caroline; Soler, Luc; Gangi, Afshin

    2005-08-01

    For radiofrequency ablation (RFA) of liver tumors, evaluating the vascular architecture, predicting post-RFA necrosis, and choosing a suitable needle placement strategy using conventional radiological techniques remain difficult. In an attempt to enhance the safety of RFA, a 3D simulator, treatment planning, and training tool has been developed that simulates insertion of the needle and necrosis of the treated area, and proposes an optimal needle placement. The 3D scenes are automatically reconstructed from enhanced spiral CT scans. The simulator takes into account the cooling effect of local vessels greater than 3 mm in diameter, making necrosis shapes more realistic. Optimal needle positioning can be generated automatically by the software to produce complete destruction of the tumor while maximally sparing the healthy liver and all major structures to be avoided. We also studied how virtual reality and haptic devices can make simulation and training realistic and effective.

  13. Designing Media for Visually-Impaired Users of Refreshable Touch Displays: Possibilities and Pitfalls.

    PubMed

    O'Modhrain, Sile; Giudice, Nicholas A; Gardner, John A; Legge, Gordon E

    2015-01-01

    This paper discusses issues of importance to designers of media for visually impaired users. The paper considers the influence of human factors on the effectiveness of presentation as well as the strengths and weaknesses of tactile, vibrotactile, haptic, and multimodal methods of rendering maps, graphs, and models. The authors, all of whom are visually impaired researchers in this domain, present findings from their own work and work of many others who have contributed to the current understanding of how to prepare and render images for both hard-copy and technology-mediated presentation of Braille and tangible graphics.

  14. In vivo biomechanical measurement and haptic simulation of portal placement procedure in shoulder arthroscopic surgery

    PubMed Central

    Chae, Sanghoon; Jung, Sung-Weon

    2018-01-01

    A survey of 67 experienced orthopedic surgeons indicated that precise portal placement is the most important skill in arthroscopic surgery. However, none of the currently available virtual reality simulators includes simulation or training of portal placement, including haptic feedback of the necessary puncture force. This study aimed to: (1) measure the in vivo force and stiffness during a portal placement procedure in an actual operating room and (2) implement active haptic simulation of a portal placement procedure using the measured in vivo data. We measured the force required for portal placement and the stiffness of the joint capsule during portal placement procedures performed by an experienced arthroscopic surgeon. Based on the acquired mechanical property values, we developed a cable-driven active haptic simulator designed to train the portal placement skill and evaluated the validity of the simulated haptics. Ten patients diagnosed with rotator cuff tears were enrolled in this experiment. The maximum peak force and joint capsule stiffness during posterior portal placement were 66.46 ± 10.76 N and 2560.82 ± 252.92 N/m, respectively. We then designed an active haptic simulator using the acquired data. Our cable-driven mechanism had a friction force of 3.763 ± 0.341 N, less than 6% of the mean puncture force. Simulator performance was evaluated by comparing the target stiffness and force with the stiffness and force reproduced by the device. R-squared values were 0.998 for puncture force replication and 0.902 for stiffness replication, indicating that the in vivo data can be used to implement a realistic haptic simulator. PMID:29494691
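
    The replication check reported above, an R-squared comparison of target and reproduced force, can be sketched as follows, with synthetic profiles standing in for the measured data:

```python
import numpy as np

# Coefficient of determination between a target force profile and the force
# actually reproduced by the haptic device. Profiles below are synthetic.
def r_squared(target: np.ndarray, reproduced: np.ndarray) -> float:
    ss_res = np.sum((target - reproduced) ** 2)
    ss_tot = np.sum((target - target.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

t = np.linspace(0.0, 1.0, 200)
target = 66.46 * np.sin(np.pi * t) ** 2  # ramp to a ~66 N peak puncture force
reproduced = target + np.random.default_rng(2).normal(0.0, 1.0, t.size)

r2 = r_squared(target, reproduced)
print(round(r2, 3))  # close to 1 for a faithful rendering
```

    An R-squared near 1, as the study reports for force replication, means the device reproduces nearly all the variation in the target profile.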

  15. Design and Evaluation of a Cable-Driven fMRI-Compatible Haptic Interface to Investigate Precision Grip Control

    PubMed Central

    Vigaru, Bogdan; Sulzer, James; Gassert, Roger

    2016-01-01

    Our hands and fingers are involved in almost all activities of daily living and, as such, have a disproportionately large neural representation. Functional magnetic resonance imaging investigations into the neural control of the hand have yielded great advances, but the harsh MRI environment has proven to be a challenge for devices capable of delivering the large variety of stimuli necessary for well-controlled studies. This paper presents an fMRI-compatible haptic interface to investigate the neural mechanisms underlying precision grasp control. The interface, located at the scanner bore, is controlled remotely via a shielded electromagnetic actuation system positioned at the end of the scanner bed and a high-stiffness, low-inertia cable transmission. We present the system design, taking into account requirements defined by the biomechanics and dynamics of the human hand, as well as the fMRI environment. Performance evaluation revealed a structural stiffness of 3.3 N/mm, renderable forces up to 94 N, and a position control bandwidth of at least 19 Hz. MRI-compatibility tests showed no degradation in the operation of the haptic interface or the image quality. A preliminary fMRI experiment during a pilot study validated the usability of the haptic interface, illustrating the possibilities offered by this device. PMID:26441454

  16. Haptic-Based Neurorehabilitation in Poststroke Patients: A Feasibility Prospective Multicentre Trial for Robotics Hand Rehabilitation

    PubMed Central

    Daud Albasini, Omar A.; Oboe, Roberto; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Piron, Lamberto

    2013-01-01

    Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented in a real-time MATLAB/Simulink environment controlling a five-bar linkage able to provide forces up to 20 N at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole peg test) and kinematic (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, the fMRI results suggest that the proposed approach may promote better cortical activation in the brain. PMID:24319496

  17. Haptic-based neurorehabilitation in poststroke patients: a feasibility prospective multicentre trial for robotics hand rehabilitation.

    PubMed

    Turolla, Andrea; Daud Albasini, Omar A; Oboe, Roberto; Agostini, Michela; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Venneri, Annalena; Piron, Lamberto

    2013-01-01

    Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented in a real-time MATLAB/Simulink environment controlling a five-bar linkage able to provide forces up to 20 N at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole peg test) and kinematic (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, the fMRI results suggest that the proposed approach may promote better cortical activation in the brain.

  18. Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework

    PubMed Central

    Kroes, Thomas; Post, Frits H.; Botha, Charl P.

    2012-01-01

    The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, the more realistic lighting in particular has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that also applies MCRT techniques to DVR. With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made our framework available under a permissive open source license. PMID:22768292
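
    The core of MCRT is estimating light-transport integrals by random sampling. A minimal, self-contained example (not Exposure Render's actual code) estimates a ray's transmittance through a homogeneous absorbing volume:

```python
import math
import random

# Monte Carlo estimate of the transmittance T = exp(-mu * L) of a ray through
# a homogeneous absorbing medium, by sampling exponential free-flight
# distances: the fraction of samples that exit the slab estimates T.
def mc_transmittance(mu: float, L: float, n: int = 200_000,
                     seed: int = 7) -> float:
    rng = random.Random(seed)
    survived = 0
    for _ in range(n):
        # Distance to the next absorption event, from p(t) = mu * exp(-mu*t).
        t = -math.log(1.0 - rng.random()) / mu
        if t > L:  # the photon leaves the volume unabsorbed
            survived += 1
    return survived / n

mu, L = 2.0, 1.0
est = mc_transmittance(mu, L)
print(f"MC estimate: {est:.4f}  analytic: {math.exp(-mu * L):.4f}")
```

    Heterogeneous volumes, scattering, and camera effects such as depth of field are handled in the same spirit, by drawing more random samples per ray rather than by adding effect-specific code paths.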

  19. Control of repulsive force in a virtual environment using an electrorheological haptic master for a surgical robot application

    NASA Astrophysics Data System (ADS)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-01-01

    This paper presents the control performance of a new type of four-degrees-of-freedom (4-DOF) haptic master that can be used for robot-assisted minimally invasive surgery (RMIS). By adopting a controllable electrorheological (ER) fluid, the proposed master provides haptic feedback as well as remote manipulation. In order to verify the efficacy of the proposed master and method, an experiment is conducted with deformable objects representing human organs. Since using real human organs in control experiments is difficult due to high cost and ethical concerns, a virtual reality environment is used as an excellent alternative in this work. In order to embody a human organ in the virtual space, the experiment adopts a volumetric deformable object represented by a shape-retaining chain linked (S-chain) model, which has salient properties such as fast and realistic deformation of elastic objects. In the haptic architecture for RMIS, the desired torque/force and desired position originating from the object of the virtual slave and the operator of the haptic master are transferred to each other. In order to achieve the desired torque/force trajectories, a sliding mode controller (SMC), known to be robust to uncertainties, is designed and empirically implemented. Tracking control performance for various torque/force trajectories from the virtual slave is evaluated and presented in the time domain.
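
    A sliding mode controller of the kind used above can be sketched as follows; the plant (a unit mass) and all gains are illustrative, not the paper's identified ER-fluid model:

```python
import math

# Minimal sliding-mode-control (SMC) sketch for trajectory tracking. The
# sliding surface is s = e_dot + lam*e; tanh smooths the discontinuous
# switching term (boundary layer phi) to limit chattering.
def smc(e, e_dot, x_ref_ddot, lam=5.0, K=2.0, phi=0.05, m=1.0):
    s = e_dot + lam * e
    return m * (x_ref_ddot - lam * e_dot) - K * math.tanh(s / phi)

# Track a sinusoidal reference with a unit-mass plant x_ddot = u.
dt, x, x_dot = 1e-3, 0.0, 0.0
errs = []
for k in range(4000):
    t = k * dt
    x_ref, x_ref_dot, x_ref_ddot = math.sin(t), math.cos(t), -math.sin(t)
    e, e_dot = x - x_ref, x_dot - x_ref_dot
    u = smc(e, e_dot, x_ref_ddot)
    x_dot += dt * u          # semi-implicit Euler integration
    x += dt * x_dot
    errs.append(abs(e))
tail = sum(errs[-1000:]) / 1000
print(f"mean |tracking error| over last second: {tail:.5f}")
```

    On the surface s = 0 the error decays exponentially regardless of bounded model uncertainty, which is what makes SMC attractive for an ER fluid whose properties are hard to model exactly.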

  20. Integration of soft tissue model and open haptic device for medical training simulator

    NASA Astrophysics Data System (ADS)

    Akasum, G. F.; Ramdhania, L. N.; Suprijanto; Widyotriatmo, A.

    2016-03-01

    Minimally invasive surgery (MIS) is now widely used for many surgical procedures, and has been applied in a number of cases in Indonesia. Needle insertion is a simple MIS procedure that can serve several purposes. Before the needle insertion technique is used in a real situation, it is essential to train medical students in this skill. This research developed an open platform needle insertion simulator with haptic feedback that gives the medical student a realistic feel of the forces encountered during the actual procedure. Building the training simulator involved three main steps: configuring the hardware system, developing a program to create the soft tissue model, and integrating the hardware and software. To evaluate its performance, the haptic simulator was tested by 24 volunteers on a soft tissue model scenario. Each volunteer had to insert the needle on the simulator until reaching the target point, guided by visual feedback shown on the monitor. From the results it can be concluded that the soft tissue model conveys the sensation of touch through the force feedback perceived on the haptic actuator, with different forces corresponding to the different stiffness of each layer.
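
    The layered-stiffness behavior described in the conclusion can be sketched with a simple piecewise force model; the layer depths and stiffness values below are assumptions for illustration, not the study's data:

```python
# Piecewise-stiffness model of needle insertion through layered soft tissue:
# the rendered axial force grows with the stiffness of each layer the needle
# tip has traversed. Depths and stiffnesses are hypothetical.
LAYERS = [  # (layer end depth in mm, stiffness in N/mm)
    (4.0, 0.30),   # skin: stiff thin layer
    (12.0, 0.08),  # subcutaneous fat: soft
    (25.0, 0.18),  # muscle
]

def insertion_force(depth_mm: float) -> float:
    """Force rendered at the haptic actuator for a given tip depth."""
    force, start = 0.0, 0.0
    for end, k in LAYERS:
        if depth_mm <= start:
            break
        seg = min(depth_mm, end) - start  # portion of this layer traversed
        force += k * seg
        start = end
    return force

for d in (2.0, 8.0, 20.0):
    print(f"depth {d:4.1f} mm -> force {insertion_force(d):.2f} N")
```

    The changing slope of the force-depth curve at each layer boundary is what lets a trainee feel the transition from skin to fat to muscle.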

  1. Shadow-driven 4D haptic visualization.

    PubMed

    Zhang, Hui; Hanson, Andrew

    2007-01-01

    Just as we can work with two-dimensional floor plans to communicate 3D architectural design, we can exploit reduced-dimension shadows to manipulate the higher-dimensional objects generating the shadows. In particular, by taking advantage of physically reactive 3D shadow-space controllers, we can transform the task of interacting with 4D objects to a new level of physical reality. We begin with a teaching tool that uses 2D knot diagrams to manipulate the geometry of 3D mathematical knots via their projections; our unique 2D haptic interface allows the user to become familiar with sketching, editing, exploration, and manipulation of 3D knots rendered as projected images on a 2D shadow space. By combining graphics and collision-sensing haptics, we can enhance the 2D shadow-driven editing protocol to successfully leverage 2D pen-and-paper or blackboard skills. Building on the reduced-dimension 2D editing tool for manipulating 3D shapes, we develop the natural analogy to produce a reduced-dimension 3D tool for manipulating 4D shapes. By physically modeling the correct properties of 4D surfaces, their bending forces, and their collisions in the 3D haptic controller interface, we can support full-featured physical exploration of 4D mathematical objects in a manner that is otherwise far beyond the experience accessible to human beings. As far as we are aware, this paper reports the first interactive system with force-feedback that provides "4D haptic visualization" permitting the user to model and interact with 4D cloth-like objects.
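
    The dimensional analogy at the heart of the paper, touching 4D objects through their 3D shadows, rests on projection. A minimal sketch, with arbitrary projection parameters:

```python
import numpy as np

# Just as a 3D point casts a 2D shadow, a 4D point can be perspective-
# projected into a 3D "shadow space" where a haptic controller can touch it.
def project_4d_to_3d(p: np.ndarray, d: float = 4.0) -> np.ndarray:
    """Project a 4D point p = (x, y, z, w) from a viewpoint on the w-axis at
    distance d onto the w = 0 hyperplane."""
    x, y, z, w = p
    s = d / (d - w)  # points closer to the viewpoint loom larger
    return np.array([x, y, z]) * s

# Two vertices of a 4D cube and their 3D shadows:
near = project_4d_to_3d(np.array([1.0, 1.0, 1.0, 1.0]))   # scaled by 4/3
far = project_4d_to_3d(np.array([1.0, 1.0, 1.0, -1.0]))   # scaled by 4/5
print(near, far)
```

    Editing the 3D shadow and propagating the change back through the projection is the 4D analogue of manipulating a 3D knot through its 2D diagram.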

  2. Three Dimensional Projection Environment for Molecular Design and Surgical Simulation

    DTIC Science & Technology

    2011-08-01

    bypasses the cumbersome meshing process. The deformation model is only comprised of mass nodes, which are generated by sampling the object volume before...force should minimize the penetration volume, the haptic feedback force is derived directly. Additionally, a post-processing technique is developed to...render distinct physical tissue properties across different interaction areas. The proposed approach does not require any pre-processing and is

  3. Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.

    PubMed

    Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T

    2015-03-01

    With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.

  4. Training Surgical Residents With a Haptic Robotic Central Venous Catheterization Simulator.

    PubMed

    Pepley, David F; Gordon, Adam B; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z

    Ultrasound-guided central venous catheterization (CVC) is a common surgical procedure with complication rates ranging from 5 to 21 percent. Training is typically performed using manikins that do not simulate anatomical variations such as obesity and abnormal vessel positioning. The goal of this study was to develop and validate the effectiveness of a new virtual reality and force haptic based simulation platform for CVC of the right internal jugular vein. A CVC simulation platform was developed using a haptic robotic arm, 3D position tracker, and computer visualization. The haptic robotic arm simulated needle insertion force based on cadaver experiments. The 3D position tracker was used as a mock ultrasound device with realistic visualization on a computer screen. Upon completion of a practice simulation, performance feedback is given to the user through a graphical user interface, including scoring factors based on good CVC practice. The effectiveness of the system was evaluated by training 13 first-year surgical residents on the virtual reality haptic based training system over a 3-month period. The participants' performance increased from 52% to 96% on the baseline training scenario, approaching the average score of an expert surgeon: 98%. Training also improved positive CVC practices, including a 61% decrease in the distance between final needle tip position and vein center, a decrease in mean insertion attempts from 1.92 to 1.23, and a 12% increase in time spent aspirating the syringe throughout the procedure. A virtual reality haptic robotic simulator for CVC was successfully developed. Surgical residents training on the simulator improved to near-expert levels after three robotic training sessions, suggesting that this system could act as an effective training device for CVC.

  5. Customization, control, and characterization of a commercial haptic device for high-fidelity rendering of weak forces.

    PubMed

    Gurari, Netta; Baud-Bovy, Gabriel

    2014-09-30

    The emergence of commercial haptic devices offers new research opportunities to enhance our understanding of the human sensory-motor system. Yet, commercial device capabilities have limitations which need to be addressed. This paper describes the customization of a commercial force feedback device for displaying forces with a precision that exceeds the human force perception threshold. The device was outfitted with a multi-axis force sensor and closed-loop controlled to improve its transparency. Additionally, two force sensing resistors were attached to the device to measure grip force. Force errors were modeled in the frequency- and time-domain to identify contributions from the mass, viscous friction, and Coulomb friction during open- and closed-loop control. The effect of user interaction on system stability was assessed in the context of a user study which aimed to measure force perceptual thresholds. Findings based on 15 participants demonstrate that the system maintains stability when rendering forces ranging from 0-0.20 N, with an average maximum absolute force error of 0.041 ± 0.013 N. Modeling the force errors revealed that Coulomb friction and inertia were the main contributors to force distortions during respectively slow and fast motions. Existing commercial force feedback devices cannot render forces with the required precision for certain testing scenarios. Building on existing robotics work, this paper shows how a device can be customized to make it reliable for studying the perception of weak forces. The customized and closed-loop controlled device is suitable for measuring force perceptual thresholds.
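
    The time-domain error model described above, inertia plus viscous and Coulomb friction, can be identified by ordinary least squares. A sketch on synthetic motion data, with all parameter values assumed for illustration:

```python
import numpy as np

# Force-error model F_err = m*a + b*v + f_c*sign(v) (inertia, viscous and
# Coulomb friction), identified by least squares on synthetic motion data.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 2.0, 2000)
v = 0.3 * np.sin(2 * np.pi * 1.5 * t)          # velocity, m/s (synthetic)
a = np.gradient(v, t)                           # acceleration, m/s^2
m_true, b_true, fc_true = 0.08, 0.4, 0.035      # kg, N*s/m, N (assumed)
F_err = m_true * a + b_true * v + fc_true * np.sign(v)
F_err += rng.normal(0.0, 1e-3, t.size)          # sensor noise

# Regress the measured error force on [a, v, sign(v)] to recover the model.
X = np.column_stack([a, v, np.sign(v)])
coef, *_ = np.linalg.lstsq(X, F_err, rcond=None)
m_hat, b_hat, fc_hat = coef
print(f"m={m_hat:.3f} kg  b={b_hat:.3f} N*s/m  f_c={fc_hat:.4f} N")
```

    With such a model in hand, the dominant error sources (Coulomb friction at slow speeds, inertia at fast ones) can be compensated in the closed-loop controller.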

  6. Look and Feel: Haptic Interaction for Biomedicine

    DTIC Science & Technology

    1995-10-01

    algorithm that is evaluated within the topology of the model. During each time step, forces are summed for each mobile atom based on external forces...volumetric properties; (b) conserving computation power by rendering media local to the interaction point; and (c) evaluating the simulation within...alteration of the model topology. Simulation of the DSM state is accomplished by a multi-step algorithm that is evaluated within the topology of the

  7. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise-fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient-specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer-centered perspective, a sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, a fast haptic rendering algorithm, and a multi-threading architecture. The system replaces expensive and time-consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.

  8. Real-time haptic cutting of high-resolution soft tissues.

    PubMed

    Wu, Jun; Westermann, Rüdiger; Dick, Christian

    2014-01-01

    We present our systematic efforts in advancing the computational performance of physically accurate soft tissue cutting simulation, which is at the core of surgery simulators in general. We demonstrate a real-time performance of 15 simulation frames per second for haptic soft tissue cutting of a deformable body at an effective resolution of 170,000 finite elements. This is achieved by the following innovative components: (1) a linked octree discretization of the deformable body, which allows for fast and robust topological modifications of the simulation domain, (2) a composite finite element formulation, which substantially reduces the number of simulation degrees of freedom and thus enables a careful balance between simulation performance and accuracy, (3) a highly efficient geometric multigrid solver for solving the linear systems of equations arising from implicit time integration, (4) an efficient collision detection algorithm that effectively exploits the composition structure, and (5) a stable haptic rendering algorithm for computing the feedback forces. Considering that our method increases the finite element resolution for physically accurate real-time soft tissue cutting simulation by an order of magnitude, our technique has a high potential to significantly advance the realism of surgery simulators.
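
    The geometric multigrid solver named in component (3) can be illustrated in one dimension. The following is a hedged sketch of a V-cycle for a 1-D Poisson problem; the paper's solver operates on 3-D elasticity systems from implicit time integration, and all constants here are illustrative.

```python
import math

def smooth(u, f, h, iters=3, omega=2.0 / 3.0):
    # Weighted Jacobi relaxation for A u = f, A = tridiag(-1, 2, -1) / h^2.
    n = len(u)
    for _ in range(iters):
        new = u[:]
        for i in range(1, n - 1):
            new[i] = ((1 - omega) * u[i]
                      + omega * 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i]))
        u = new
    return u

def residual(u, f, h):
    n = len(u)
    r = [0.0] * n
    for i in range(1, n - 1):
        r[i] = f[i] - (2 * u[i] - u[i - 1] - u[i + 1]) / (h * h)
    return r

def restrict(r):
    # Full-weighting restriction onto a grid with half the points.
    m = len(r) // 2
    return [0.0] + [0.25 * r[2 * i - 1] + 0.5 * r[2 * i] + 0.25 * r[2 * i + 1]
                    for i in range(1, m)] + [0.0]

def prolong(e, n_fine):
    # Linear interpolation back to the fine grid.
    u = [0.0] * n_fine
    for i in range(1, len(e) - 1):
        u[2 * i] = e[i]
    for i in range(1, n_fine - 1, 2):
        u[i] = 0.5 * (u[i - 1] + u[i + 1])
    return u

def v_cycle(u, f, h):
    u = smooth(u, f, h)                       # pre-smoothing
    if len(u) > 3:
        r = residual(u, f, h)
        e = v_cycle([0.0] * (len(u) // 2 + 1), restrict(r), 2 * h)
        u = [ui + ei for ui, ei in zip(u, prolong(e, len(u)))]
        u = smooth(u, f, h)                   # post-smoothing
    return u

# Demo: solve -u'' = pi^2 sin(pi x), u(0) = u(1) = 0, exact u = sin(pi x).
n = 17
h = 1.0 / (n - 1)
f = [math.pi ** 2 * math.sin(math.pi * i * h) for i in range(n)]
u = [0.0] * n
for _ in range(10):
    u = v_cycle(u, f, h)
err = max(abs(ui - math.sin(math.pi * i * h)) for i, ui in enumerate(u))
```

The appeal for real-time simulation is that each V-cycle costs O(n) work, so the solve scales linearly with the number of finite elements.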

  9. Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback

    PubMed Central

    Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.

    2014-01-01

    Background With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200

  10. Automatic Perceptual Color Map Generation for Realistic Volume Visualization

    PubMed Central

    Silverstein, Jonathan C.; Parsad, Nigel M.; Tsirline, Victor

    2008-01-01

    Advances in computed tomography imaging technology and inexpensive high performance computer graphics hardware are making high-resolution, full color (24-bit) volume visualizations commonplace. However, many of the color maps used in volume rendering provide questionable value in knowledge representation and are non-perceptual, thus biasing data analysis or even obscuring information. These drawbacks, coupled with our need for realistic anatomical volume rendering for teaching and surgical planning, have motivated us to explore the auto-generation of color maps that combine natural colorization with the perceptual discriminating capacity of grayscale. As evidenced by the examples shown that have been created by the algorithm described, the merging of perceptually accurate and realistically colorized virtual anatomy appears to insightfully interpret and impartially enhance volume rendered patient data. PMID:18430609
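
    As a hedged illustration of the stated goal (natural colorization that keeps the perceptual discriminating capacity of grayscale), the sketch below colorizes a normalized intensity while rescaling each color so its displayed luminance still tracks the data value. This is our own minimal variant, not the authors' algorithm, and the "anatomical" endpoint colors are invented.

```python
def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def luminance(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 luma weights

def perceptual_color(t, c_low=(0.4, 0.1, 0.05), c_high=(1.0, 0.9, 0.7)):
    # Hypothetical endpoints: dark red tissue at low intensity, pale bone
    # at high intensity.
    base = lerp(c_low, c_high, t)
    lum = luminance(base)
    if lum == 0.0:
        return (t, t, t)                       # degenerate: fall back to gray
    # Rescale so the displayed luminance equals the data value t (clamped),
    # preserving the monotone brightness ramp of grayscale.
    return tuple(min(1.0, c * t / lum) for c in base)

ramp = [perceptual_color(i / 255.0) for i in range(256)]
lum_mid = luminance(perceptual_color(0.5))
```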

  11. Anthro-Centric Multisensory Interface for Sensory Augmentation of Tele-Surgery (ACMI-SATS)

    DTIC Science & Technology

    2010-09-01

    surgeon from perceiving useful kinesthetic feedback from direct interaction with the tissues present in traditional “open” procedures. Additionally... Kinesthetic and haptic signals in surgical applications are critical, and prior work with VEs has shown that errors increase without realistic...telepresence related kinesthetic sensory interactions while tactile will refer to more general or abstract tactual interactions. Figure 2: (left

  12. Integrated versus isolated training of the hemiparetic upper extremity in haptically rendered virtual environments.

    PubMed

    Qiu, Qinyin; Fluet, Gerard G; Saleh, Soha; Lafond, Ian; Merians, Alma S; Adamovich, Sergei V

    2010-01-01

    This paper describes the preliminary results of an ongoing study of the effects of two training approaches on motor function and learning in persons with hemiparesis due to cerebrovascular accidents. Eighteen subjects with chronic stroke performed eight three-hour sessions of sensorimotor training in haptically rendered environments. Eleven subjects performed training activities that integrated hand and arm movement while another seven subjects performed activities that trained the hand and arm separately. As a whole, the eighteen subjects made statistically significant improvements in motor function as evidenced by robust improvements in Wolf Motor Function Test times and corresponding improvements in Jebsen Test of Hand Function times. There were no significant between-group effects for these tests. However, the two training approaches elicited different patterns and magnitudes of performance improvement, suggesting that they may elicit different types of change in motor learning and/or control.

  13. Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?

    PubMed Central

    Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.

    2007-01-01

    Background Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects, with video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators scored on a 5-point Likert scale. The participants were allotted to three groups depending on their experience: experts, intermediates, and novices. Significant differences were calculated with the paired t-test. Results There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 resp. 2.18, P < 0.000) as well as the suturing task (mean 4.15 resp. 1.85, P < 0.000). The ProMIS is regarded as having better haptic feedback (mean 3.92 resp. 1.92, P < 0.000) and as being more useful for training surgical residents (mean 4.51 resp. 2.94, P < 0.000). Conclusions In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356

  14. Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block.

    PubMed

    Corrêa, Cléber Gimenez; Machado, Maria Aparecida de Andrade Moreira; Ranzini, Edith; Tori, Romero; Nunes, Fátima de Lourdes Santos

    2017-01-01

    This study shows the development and validation of a dental anesthesia-training simulator, specifically for the inferior alveolar nerve block (IANB). The system developed provides the tactile sensation of inserting a real needle in a human patient, using Virtual Reality (VR) techniques and a haptic device that can provide perceived force feedback in the needle insertion task during the anesthesia procedure. To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to a haptic device. The Volere method was used to elicit requirements from users in the Dentistry area; repeated-measures two-way ANOVA (analysis of variance), the Tukey post-hoc test, and averages were used to analyze the results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire covered profile, preferences (number of viewpoints, texture of the objects, and haptic device handler), as well as visual aspects (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). The visual aspect was considered appropriate, while the haptic feedback must be improved, which users can do by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, significant at p<0.01). The user preferences were the simulator with two viewpoints, objects with texture based on images, and the device with a syringe coupled to it. The simulation was considered thoroughly satisfactory for anesthesia training of the needle insertion task, which includes the correct insertion point and depth, as well as the perception of tissue resistance during insertion.

  15. Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.

    PubMed

    Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J

    2011-11-01

    To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Incorporating haptic effects into three-dimensional virtual environments to train the hemiparetic upper extremity

    PubMed Central

    Adamovich, Sergei; Fluet, Gerard G.; Merians, Alma S.; Mathai, Abraham; Qiu, Qinyin

    2010-01-01

    Current neuroscience has identified several constructs to increase the effectiveness of upper extremity rehabilitation. One is the use of progressive, skill acquisition-oriented training. Another approach emphasizes the use of bilateral activities. Building on these principles, this paper describes the design and feasibility testing of a robotic / virtual environment system designed to train the arm of persons who have had strokes. The system provides a variety of assistance modes, scalable workspaces and hand-robot interfaces allowing persons with strokes to train multiple joints in three dimensions. The simulations utilize assistance algorithms that adjust task difficulty both online and offline in relation to subject performance. Several distinctive haptic effects have been incorporated into the simulations. An adaptive master-slave relationship between the unimpaired and impaired arm encourages active movement of the subject's hemiparetic arm during a bimanual task. Adaptive anti-gravity support and damping stabilize the arm during virtual reaching and placement tasks. An adaptive virtual spring provides assistance to complete the movement if the subject is unable to complete the task in time. Finally, haptically rendered virtual objects help to shape the movement trajectory during a virtual placement task. A proof of concept study demonstrated this system to be safe, feasible and worthy of further study. PMID:19666345
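
    One of the haptic effects described above, the adaptive virtual spring that assists completion when the subject runs out of time, can be sketched in one dimension as follows. The parameter names and gains are invented for illustration, not taken from the authors' implementation.

```python
def assist_force(pos, target, t, t_limit, k_max=30.0, ramp=5.0):
    # No assistance until the allotted time t_limit has passed; afterwards
    # the spring stiffness ramps linearly from 0 to k_max over `ramp`
    # seconds, pulling the hand toward the target.
    if t <= t_limit:
        return 0.0
    k = k_max * min(1.0, (t - t_limit) / ramp)
    return k * (target - pos)

f_before = assist_force(0.0, 1.0, t=2.0, t_limit=5.0)    # still no help
f_after = assist_force(0.0, 1.0, t=12.0, t_limit=5.0)    # full spring force
```

The gradual ramp avoids a sudden force step at the deadline, which would be both perceptually jarring and potentially unsafe for a hemiparetic arm.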

  17. Adaptive space warping to enhance passive haptics in an arthroscopy surgical simulator.

    PubMed

    Spillmann, Jonas; Tuchschmid, Stefan; Harders, Matthias

    2013-04-01

    Passive haptics, also known as tactile augmentation, denotes the use of a physical counterpart to a virtual environment to provide tactile feedback. Employing passive haptics can result in more realistic touch sensations than those from active force feedback, especially for rigid contacts. However, changes in the virtual environment would necessitate modifications of the physical counterparts. In recent work, space warping has been proposed as one solution to overcome this limitation. In this technique, virtual space is distorted such that a variety of virtual models can be mapped onto one single physical object. In this paper, we propose as an extension adaptive space warping; we show how this technique can be employed in a mixed-reality surgical training simulator in order to map different virtual patients onto one physical anatomical model. We developed methods to warp different organ geometries onto one physical mock-up, to handle different mechanical behaviors of the virtual patients, and to allow interactive modifications of the virtual structures, while the physical counterparts remain unchanged. Various practical examples underline the wide applicability of our approach. To the best of our knowledge this is the first practical usage of such a technique in the specific context of interactive medical training.
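
    The core mapping of space warping can be sketched in a deliberately simplified radial form (our own illustration, not the paper's warp): virtual space is scaled about a shared center so that a tool touching the physical mock-up surface lands exactly on the virtual organ surface, whatever size that virtual organ happens to be.

```python
def warp(p, center, r_phys, r_virt):
    # Radially scale about a shared center: a tool position on the physical
    # surface (|p - center| = r_phys) maps onto the virtual surface at
    # radius r_virt, so touch events coincide in both spaces.
    s = r_virt / r_phys
    return [c + s * (x - c) for c, x in zip(center, p)]

# A tool touching a physical sphere of radius 1 maps onto a virtual
# organ surface of radius 2.
q = warp((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0, 2.0)
```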

  18. A haptic sensor-actor-system based on ultrasound elastography and electrorheological fluids for virtual reality applications in medicine.

    PubMed

    Khaled, W; Ermert, H; Bruhns, O; Boese, H; Baumann, M; Monkman, G J; Egersdoerfer, S; Meier, A; Klein, D; Freimuth, H

    2003-01-01

    Mechanical properties of biological tissue represent important diagnostic information and are of histological relevance (hard lesions, "nodes" in organs: tumors; calcifications in vessels: arteriosclerosis). The problem is that such information is usually obtained by digital palpation only, which is limited with respect to sensitivity; it requires intuitive assessment and does not allow quantitative documentation. A suitable sensor is required for quantitative detection of mechanical tissue properties. On the other hand, there is also a need for a realistic mechanical display of those tissue properties. Suitable actuator arrays with high spatial resolution and real-time capabilities are required, operating in a haptic sensor-actuator system with different applications. The sensor system uses real-time ultrasonic elastography, whereas the tactile actuator is based on electrorheological fluids. Due to their small size, the actuator array elements have to be manufactured by micro-mechanical production methods. In order to supply the actuator elements with individual high voltages, a sophisticated switching and control concept has been designed. This haptic system has the potential of inducing substantial forces in real time, using a compact lightweight mechanism that can be applied to numerous areas including intraoperative navigation, telemedicine, teaching, space, and telecommunication.

  19. Creation of anatomical models from CT data

    NASA Astrophysics Data System (ADS)

    Alaytsev, Innokentiy K.; Danilova, Tatyana V.; Manturov, Alexey O.; Mareev, Gleb O.; Mareev, Oleg V.

    2018-04-01

    Computed tomography is a great source of biomedical data because it allows a detailed exploration of complex anatomical structures. Some structures are not visible on CT scans, and some are hard to distinguish due to the partial volume effect. CT datasets require preprocessing before they can be used as anatomical models in a simulation system. This work describes segmentation and data-transformation methods for creating an anatomical model from CT data. The resulting models may be used for visual and haptic rendering and drilling simulation in a virtual surgery system.
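
    A hedged sketch of the kind of preprocessing described: thresholding CT intensities in Hounsfield units into a binary mask that a surface- or volume-mesher could then turn into a drillable model. The HU window below is a commonly used bone range, not a value taken from the paper.

```python
def segment_bone(slice_hu, lo=300, hi=3000):
    # Return a 0/1 mask for one CT slice given Hounsfield-unit values.
    return [[1 if lo <= v <= hi else 0 for v in row] for row in slice_hu]

ct_slice = [
    [-1000, -50, 400],   # air, soft tissue, bone
    [20, 1200, 80],      # fluid, dense bone, soft tissue
]
mask = segment_bone(ct_slice)
```

Pure thresholding is exactly where the partial volume effect bites: a voxel averaging bone and air can fall outside the window, which is why the paper needs dedicated segmentation methods rather than a single cutoff.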

  20. Intubation simulation with a cross-sectional visual guidance.

    PubMed

    Rhee, Chi-Hyoung; Kang, Chul Won; Lee, Chang Ha

    2013-01-01

    We present an intubation simulation with deformable objects and cross-sectional visual guidance using a general haptic device. Our method deforms the tube model when it collides with the human model. A mass-spring model with Euler integration is used for the tube deformation. To help trainees understand the intubation process more effectively, we provide a cross-sectional view of the oral cavity and the tube. Our system also applies stereoscopic rendering to improve the depth perception and the realism of the simulation.
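
    The tube deformation described above can be sketched as a chain of point masses connected by springs. The version below uses semi-implicit Euler integration for stability (the abstract states only "Euler integration"); the first node is pinned to stand in for the held tip, and all constants are illustrative.

```python
def step(positions, velocities, dt=0.001, k=100.0, rest=1.0,
         damping=5.0, mass=0.1):
    # One semi-implicit Euler step of a 1-D mass-spring chain.
    n = len(positions)
    forces = [0.0] * n
    for i in range(n - 1):                   # spring between nodes i, i+1
        stretch = (positions[i + 1] - positions[i]) - rest
        forces[i] += k * stretch
        forces[i + 1] -= k * stretch
    new_v = [v + dt * (f / mass - damping * v)
             for v, f in zip(velocities, forces)]
    new_p = [p + dt * v for p, v in zip(positions, new_v)]
    new_p[0], new_v[0] = positions[0], 0.0   # first node pinned (held tip)
    return new_p, new_v

# Demo: a two-node tube stretched to twice its rest length relaxes back.
pos, vel = [0.0, 2.0], [0.0, 0.0]
for _ in range(2000):
    pos, vel = step(pos, vel)
```

Semi-implicit (symplectic) Euler updates velocity first and then position with the new velocity, which tolerates much larger time steps than the fully explicit form at no extra cost.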

  1. Virtual Cerebral Aneurysm Clipping with Real-Time Haptic Force Feedback in Neurosurgical Education.

    PubMed

    Gmeiner, Matthias; Dirnberger, Johannes; Fenz, Wolfgang; Gollwitzer, Maria; Wurm, Gabriele; Trenkler, Johannes; Gruber, Andreas

    2018-04-01

    Realistic, safe, and efficient modalities for simulation-based training are highly warranted to enhance the quality of surgical education, and they should be incorporated in resident training. The aim of this study was to develop a patient-specific virtual cerebral aneurysm-clipping simulator with haptic force feedback and real-time deformation of the aneurysm and vessels. A prototype simulator was developed from 2012 to 2016. Evaluation of virtual clipping by blood flow simulation was integrated in this software, and the prototype was evaluated by 18 neurosurgeons. In 4 patients with different middle cerebral artery aneurysms, virtual clipping was performed after real-life surgery, and surgical results were compared regarding clip application, surgical trajectory, and blood flow. After head positioning and craniotomy, bimanual virtual aneurysm clipping with an original forceps was performed. Blood flow simulation demonstrated residual aneurysm filling or branch stenosis. The simulator improved anatomic understanding for 89% of neurosurgeons. Simulation of head positioning and craniotomy was considered realistic by 89% and 94% of users, respectively. Most participants agreed that this simulator should be integrated into neurosurgical education (94%). Our illustrative cases demonstrated that virtual aneurysm surgery was possible using the same trajectory as in real-life cases. Both virtual clipping and blood flow simulation were realistic in broad-based but not calcified aneurysms. Virtual clipping of a calcified aneurysm could be performed using the same surgical trajectory, but not the same clip type. We have successfully developed a virtual aneurysm-clipping simulator. Next, we will prospectively evaluate this device for surgical procedure planning and education. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Education Catching Up with Science: Preparing Students for Three-Dimensional Literacy in Cell Biology

    PubMed Central

    Kramer, IJsbrand M.; Dahmani, Hassen-Reda; Delouche, Pamina; Bidabe, Marissa; Schneeberger, Patricia

    2012-01-01

    The large number of experimentally determined molecular structures has led to the development of a new semiotic system in the life sciences, with increasing use of accurate molecular representations. To determine how this change impacts students’ learning, we incorporated image tests into our introductory cell biology course. Groups of students used a single text dealing with signal transduction, which was supplemented with images made in one of three iconographic styles. Typically, we employed realistic renderings, using computer-generated Protein Data Bank (PDB) structures; realistic-schematic renderings, using shapes inspired by PDB structures; or schematic renderings, using simple geometric shapes to represent cellular components. The control group received a list of keywords. When students were asked to draw and describe the process in their own style and to reply to multiple-choice questions, the three iconographic approaches equally improved the overall outcome of the tests (relative to keywords). Students found the three approaches equally useful but, when asked to select a preferred style, they largely favored a realistic-schematic style. When students were asked to annotate “raw” realistic images, both keywords and schematic representations failed to prepare them for this task. We conclude that supplementary images facilitate the comprehension process and despite their visual clutter, realistic representations do not hinder learning in an introductory course. PMID:23222839

  4. Providing haptic feedback in robot-assisted minimally invasive surgery: a direct optical force-sensing solution for haptic rendering of deformable bodies.

    PubMed

    Ehrampoosh, Shervin; Dave, Mohit; Kia, Michael A; Rablau, Corneliu; Zadeh, Mehrdad H

    2013-01-01

    This paper presents an enhanced haptic-enabled master-slave teleoperation system which can be used to provide force feedback to surgeons in minimally invasive surgery (MIS). One of the research goals was to develop a combined-control architecture framework that included both direct force reflection (DFR) and position-error-based (PEB) control strategies. To achieve this goal, it was essential to measure accurately the direct contact forces between deformable bodies and a robotic tool tip. To measure the forces at a surgical tool tip and enhance the performance of the teleoperation system, an optical force sensor was designed, prototyped, and added to a robot manipulator. The enhanced teleoperation architecture was formulated by developing mathematical models for the optical force sensor, the extended slave robot manipulator, and the combined-control strategy. Human factor studies were also conducted to (a) examine experimentally the performance of the enhanced teleoperation system with the optical force sensor, and (b) study human haptic perception during the identification of remote object deformability. The first experiment was carried out to discriminate deformability of objects when human subjects were in direct contact with deformable objects by means of a laparoscopic tool. The control parameters were then tuned based on the results of this experiment using a gain-scheduling method. The second experiment was conducted to study the effectiveness of the force feedback provided through the enhanced teleoperation system. The results show that the force feedback increased the ability of subjects to correctly identify materials of different deformable types. In addition, the virtual force feedback provided by the teleoperation system comes close to the real force feedback experienced in direct MIS. The experimental results provide design guidelines for choosing and validating the control architecture and the optical force sensor.
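
    The combined-control idea (DFR plus PEB) can be sketched as a blended feedback law on the master side. This is our own minimal 1-D illustration with invented gains, not the authors' formulation:

```python
def master_force(f_sensed, x_master, x_slave, alpha=0.7, k_p=200.0):
    # alpha = 1: pure direct force reflection (DFR), displaying only the
    #            optically sensed tool-tip contact force;
    # alpha = 0: pure position-error-based (PEB) feedback, a spring on the
    #            master/slave tracking error.
    f_dfr = f_sensed                        # measured tool-tip force
    f_peb = k_p * (x_slave - x_master)      # spring on the tracking error
    return alpha * f_dfr + (1.0 - alpha) * f_peb

f_contact = master_force(1.0, 0.0, 0.0)     # force sensed, no tracking error
f_lag = master_force(0.0, 0.0, 0.01)        # tracking error only
```

Tuning alpha and k_p is essentially the gain-scheduling step the abstract describes: the blend trades the fidelity of direct sensing against the robustness of position-error feedback.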

  5. Programmable prostate palpation simulator using property-changing pneumatic bladder.

    PubMed

    Talhan, Aishwari; Jeon, Seokhee

    2018-05-01

    The currently available prostate palpation simulators are based on either a physical mock-up or pure virtual simulation. Both cases have their inherent limitations. The former lacks flexibility in presenting abnormalities and scenarios because of the static nature of the mock-up and has usability issues because the prostate model must be replaced in different scenarios. The latter has realism issues, particularly in haptic feedback, because of the very limited performance of haptic hardware and inaccurate haptic simulation. This paper presents a highly flexible and programmable simulator with high haptic fidelity. Our new approach is based on a pneumatic-driven, property-changing, silicone prostate mock-up that can be embedded in a human torso mannequin. The mock-up has seven pneumatically controlled, multi-layered bladder cells to mimic the stiffness, size, and location changes of nodules in the prostate. The size is controlled by inflating the bladder with positive pressure in the chamber, and a hard nodule can be generated using the particle jamming technique; the fine sand in the bladder becomes stiff when it is vacuumed. The programmable valves and system identification process enable us to precisely control the size and stiffness, which results in a simulator that can realistically generate many different diseases without replacing anything. The three most common abnormalities in a prostate are selected for demonstration, and multiple progressive stages of each abnormality are carefully designed based on medical data. A human perception experiment is performed by actual medical professionals and confirms that our simulator exhibits higher realism and usability than do the conventional simulators. Copyright © 2018 Elsevier Ltd. All rights reserved.
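
    The system identification step that "enables us to precisely control the size and stiffness" suggests a calibration mapping from desired nodule stiffness to chamber pressure. The sketch below interpolates over a hypothetical identified curve; every numeric pair is invented for illustration.

```python
def pressure_for_stiffness(k_target, calib):
    # calib: (stiffness, pressure) pairs sorted by stiffness; linearly
    # interpolate between identified points, clamping at the ends.
    if k_target <= calib[0][0]:
        return calib[0][1]
    if k_target >= calib[-1][0]:
        return calib[-1][1]
    for (k0, p0), (k1, p1) in zip(calib, calib[1:]):
        if k0 <= k_target <= k1:
            w = (k_target - k0) / (k1 - k0)
            return p0 + w * (p1 - p0)

# Hypothetical identified curve: stiffer nodules (via particle jamming)
# need a stronger vacuum, expressed here as negative gauge pressure in kPa.
calib = [(1.0, 0.0), (3.0, -40.0), (5.0, -80.0)]
p_mid = pressure_for_stiffness(2.0, calib)
p_max = pressure_for_stiffness(9.0, calib)
```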

  6. Virtual reality neurosurgery: a simulator blueprint.

    PubMed

    Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J

    2004-04-01

    This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model and allows realization of the required threshold rates for the accurate and realistic representation of real-time visual animations. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.

  7. Haptic simulation framework for determining virtual dental occlusion.

    PubMed

    Wu, Wen; Chen, Hui; Cen, Yuhai; Hong, Yang; Khambay, Balvinder; Heng, Pheng Ann

    2017-04-01

    The surgical treatment of many dentofacial deformities is often complex due to its three-dimensional nature. Determining the dental occlusion in its most stable position is essential to the success of the treatment. Computer-aided virtual planning on an individualized patient-specific 3D model can help formulate the surgical plan and predict the surgical change. However, in current computer-aided planning systems it is not possible to determine the dental occlusion of the digital models in an intuitive way during virtual surgical planning, because of the absence of haptic feedback. In this paper, a physically based haptic simulation framework is proposed that can provide surgeons with intuitive haptic feedback to determine the dental occlusion of the digital models in their most stable position. To provide physically realistic force feedback when the dental models contact each other during the search process, a contact model is proposed to describe the dynamic and collision properties of the dental models during alignment. The simulated impulse/contact-based forces are integrated into the unified simulation framework. A validation study was conducted on fifteen sets of virtual dental models chosen at random and covering a wide range of the dental relationships found clinically. The dental occlusions obtained by an expert were employed as a benchmark against which to compare the virtual occlusion results. The mean translational and angular deviations of the virtual occlusion results from the benchmark were small. The experimental results show the validity of our method. The simulated forces can provide valuable insights for determining the virtual dental occlusion. The findings of this work and the validation of the proposed concept lead the way toward full virtual surgical planning on patient-specific virtual models, allowing fully customized treatment plans for the surgical correction of dentofacial deformities.
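
    The impulse/contact-based forces can be illustrated with a simpler penalty formulation (our own simplification, not the paper's contact model): each interpenetrating vertex of one dental mesh contributes a force along its contact normal proportional to penetration depth, and the net force resists pushing the models together while vanishing once they settle into a stable occlusion.

```python
def contact_force(penetrations, normals, k=500.0):
    # Sum penalty forces: each interpenetrating vertex (depth > 0) pushes
    # along its unit contact normal with stiffness k; separated vertices
    # (depth <= 0) contribute nothing.
    total = [0.0, 0.0, 0.0]
    for depth, n in zip(penetrations, normals):
        if depth > 0.0:
            for i in range(3):
                total[i] += k * depth * n[i]
    return total

# One vertex penetrating 0.01 units, one separated (negative depth).
f_net = contact_force([0.01, -0.02], [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)])
```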

  8. Force modeling for incisions into various tissues with MRF haptic master

    NASA Astrophysics Data System (ADS)

    Kim, Pyunghwa; Kim, Soomin; Park, Young-Dai; Choi, Seung-Bok

    2016-03-01

This study proposes a new model to predict the reaction force that occurs in incisions during robot-assisted minimally invasive surgery. The reaction force is fed back to the manipulator by a magneto-rheological fluid (MRF) haptic master, which features a bi-directional clutch actuator. The reaction force feedback provides sensations similar to those of laparotomy, which cannot be provided by a conventional surgical master. This advantage shortens the training period for robot-assisted minimally invasive surgery and can improve the accuracy of operations. The reaction force model of incisions can also be utilized in a surgical simulator that provides a virtual reaction force. In this work, in order to model the reaction force during incisions, the incision process is analyzed from an energy perspective. Each mode of the incision process is classified by the tendency of the energy change and modeled for realistic real-time application. The reaction force model uses measured reaction force data from three types of actual tissue: hard, medium, and soft. This modeled force is realized by the MRF haptic master through an algorithm based on the position and velocity of the scalpel, using two different control methods: an open-loop algorithm and a closed-loop algorithm. The reaction forces obtained from the proposed model are compared with a desired force in the time domain.
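The abstract describes mode-wise force models driven by scalpel position and velocity. The following piecewise sketch is only a hypothetical illustration of that structure; the puncture depth, stiffness, friction, and cutting-force constants are all assumed, not the authors' fitted values:

```python
def incision_force(depth, velocity, puncture_depth=0.004,
                   k=800.0, mu=30.0, f_cut=0.6):
    """Two incision modes separated by the energy-change transition at
    puncture: elastic deformation first, then steady cutting plus a
    velocity-dependent friction term. All constants are assumed."""
    if depth < puncture_depth:
        return k * depth             # mode 1: tissue deforms elastically
    return f_cut + mu * velocity     # mode 2: cutting with friction
```

A per-tissue model would use a separate parameter set for hard, medium, and soft tissue, fitted to the measured force data.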

  9. Real time ray tracing based on shader

    NASA Astrophysics Data System (ADS)

    Gui, JiangHeng; Li, Min

    2017-07-01

Ray tracing is a rendering algorithm that generates an image by tracing light paths through an image plane; it can simulate complicated optical phenomena such as refraction, depth of field, and motion blur. Compared with rasterization, ray tracing achieves more realistic rendering results, but at much greater computational cost: even a simple scene can take a long time to render. With improvements in GPU performance and the advent of the programmable rendering pipeline, complicated algorithms can now be implemented directly in shaders. This paper therefore proposes a new method that implements ray tracing directly on the fragment shader, comprising surface intersection, importance sampling, and progressive rendering. With the help of the GPU's powerful throughput, it achieves real-time rendering of simple scenes.
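As an illustration of the surface-intersection step named above, a minimal ray-sphere intersection routine (shown in Python rather than shader code for readability; the scene values are invented) could be:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t for a ray-sphere hit,
    or None on a miss; `direction` is assumed normalized."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * v for d, v in zip(direction, oc))   # half of the usual b term
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None                                  # ray misses the sphere
    t = -b - math.sqrt(disc)
    return t if t > 0.0 else None                    # hit behind origin counts as miss

t = intersect_sphere((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 1.0)
```

A fragment-shader implementation runs the same arithmetic per pixel, accumulating importance-sampled results across frames for progressive rendering.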

  10. A virtual surgical environment for rehearsal of tympanomastoidectomy.

    PubMed

    Chan, Sonny; Li, Peter; Lee, Dong Hoon; Salisbury, J Kenneth; Blevins, Nikolas H

    2011-01-01

    This article presents a virtual surgical environment whose purpose is to assist the surgeon in preparation for individual cases. The system constructs interactive anatomical models from patient-specific, multi-modal preoperative image data, and incorporates new methods for visually and haptically rendering the volumetric data. Evaluation of the system's ability to replicate temporal bone dissections for tympanomastoidectomy, using intraoperative video of the same patients as guides, showed strong correlations between virtual and intraoperative anatomy. The result is a portable and cost-effective tool that may prove highly beneficial for the purposes of surgical planning and rehearsal.

  11. Face and construct validity of a computer-based virtual reality simulator for ERCP.

    PubMed

    Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V

    2010-02-01

    Currently, little evidence supports computer-based simulation for ERCP training. To determine face and construct validity of a computer-based simulator for ERCP and assess its perceived utility as a training tool. Novice and expert endoscopists completed 2 simulated ERCP cases by using the GI Mentor II. Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Outcomes included times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices and experts based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) claimed that the simulator has definite training potential or should be required for training. Small sample size, single institution. The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibit face validity. Subjects deemed it a useful training tool. Study repetition involving more participants and cases may help confirm results and establish the simulator's ability to differentiate skill levels based on ERCP-specific metrics.

  12. Fast algorithm for the rendering of three-dimensional surfaces

    NASA Astrophysics Data System (ADS)

    Pritt, Mark D.

    1994-02-01

It is often desirable to draw a detailed and realistic representation of surface data on a computer graphics display. One such representation is a 3D shaded surface. Conventional techniques for rendering shaded surfaces are slow, however, and require substantial computational power. Furthermore, many techniques suffer from aliasing effects, which appear as jagged lines and edges. This paper describes an algorithm for the fast rendering of shaded surfaces without aliasing effects. It is much faster than conventional ray tracing and polygon-based rendering techniques and is suitable for interactive use. On an IBM RISC System/6000™ workstation it renders a 1000 × 1000 surface in about 7 seconds.

  13. A real-time photo-realistic rendering algorithm of ocean color based on bio-optical model

    NASA Astrophysics Data System (ADS)

    Ma, Chunyong; Xu, Shu; Wang, Hongsong; Tian, Fenglin; Chen, Ge

    2016-12-01

A real-time photo-realistic rendering algorithm for ocean color is introduced in this paper, which considers the impact of an ocean bio-optical model. The ocean bio-optical model mainly involves phytoplankton, colored dissolved organic material (CDOM), inorganic suspended particles, etc., which make different contributions to the absorption and scattering of light. We decompose the emergent light of the ocean surface into the reflected light from the sun and the sky, and the subsurface scattering light. We establish an ocean surface transmission model based on the ocean bidirectional reflectance distribution function (BRDF) and the Fresnel law; this model's outputs serve as the incident light parameters for subsurface scattering. Using an ocean subsurface scattering algorithm combined with the bio-optical model, we compute the scattered emergent radiation in different directions. Then, we blend in the reflection of sunlight and sky light to implement real-time ocean color rendering on the graphics processing unit (GPU). Finally, we compare the radiance reflectance calculated by the Hydrolight radiative transfer model with that of our algorithm to validate the physical reality of our method; the results show that our algorithm can achieve real-time, highly realistic ocean color scenes.
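The Fresnel term in such a surface transmission model is commonly evaluated with Schlick's approximation, a cheap substitute for the full Fresnel equations; a sketch assuming an air-seawater interface (refractive index of about 1.34 is a typical value, not one stated in the abstract):

```python
def schlick_fresnel(cos_theta, n1=1.0, n2=1.34):
    """Schlick's approximation of Fresnel reflectance for unpolarized light.
    cos_theta is the cosine of the angle between view ray and surface normal;
    n2 = 1.34 is an assumed refractive index for seawater."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2          # reflectance at normal incidence
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
```

Reflectance rises from roughly 2% at normal incidence toward 1.0 at grazing angles, which is what makes distant water look mirror-like.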

  14. Effects of kinesthetic and cutaneous stimulation during the learning of a viscous force field.

    PubMed

    Rosati, Giulio; Oscari, Fabio; Pacchierotti, Claudio; Prattichizzo, Domenico

    2014-01-01

Haptic stimulation can help humans learn perceptual motor skills, but the precise way in which it influences the learning process has not yet been clarified. This study investigates the role of the kinesthetic and cutaneous components of haptic feedback during the learning of a viscous curl field, also taking into account the influence of visual feedback. We present the results of an experiment in which 17 subjects were asked to make reaching movements while grasping a joystick and wearing a pair of cutaneous devices. Each device was able to provide cutaneous contact forces through a moving platform. The subjects received visual feedback about the joystick's position. During the experiment, the system delivered a perturbation through (1) full haptic stimulation, (2) kinesthetic stimulation alone, (3) cutaneous stimulation alone, (4) altered visual feedback, or (5) altered visual feedback plus cutaneous stimulation. Conditions 1, 2, and 3 were also tested with the cancellation of the visual feedback of position error. Results indicate that kinesthetic stimuli played a primary role during motor adaptation to the viscous field, which is a fundamental premise to motor learning and rehabilitation. On the other hand, cutaneous stimulation alone appeared not to produce significant direct or adaptation effects, although it helped in reducing direct effects when used in addition to kinesthetic stimulation. The experimental conditions with visual cancellation of position error showed slower adaptation rates, indicating that visual feedback actively contributes to the formation of internal models. However, modest learning effects were detected when the visual information was used to render the viscous field.
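A viscous curl field of the kind studied here is conventionally defined by a skew-symmetric viscosity matrix, so the perturbing force is always perpendicular to the hand's velocity; a sketch with an assumed gain (the study's actual field strength is not given in the abstract):

```python
def curl_field_force(vx, vy, b=15.0):
    """Viscous curl field F = B v with skew-symmetric B = b * [[0, 1], [-1, 0]]:
    the force is perpendicular to the velocity (vx, vy). The gain b, in
    N*s/m, is an assumed illustrative value."""
    return (b * vy, -b * vx)

fx, fy = curl_field_force(0.0, 0.2)   # hand moving forward at 0.2 m/s
```

Because the force does no work along the movement direction, subjects must build an internal model of the field to produce straight reaches, which is what makes it a standard probe of motor adaptation.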

  15. Prototype tactile feedback system for examination by skin touch.

    PubMed

    Lee, O; Lee, K; Oh, C; Kim, K; Kim, M

    2014-08-01

Diagnosis of conditions such as psoriasis and atopic dermatitis, in the case of induration, involves palpating the affected area by hand and then assigning a rating score. However, the score is determined by the tester's experience and standards, making it subjective. To provide tactile feedback on the skin, we developed a prototype tactile feedback system to simulate skin wrinkles with the PHANToM OMNI. To provide the user with tactile feedback on skin wrinkles, a visual and haptic augmented reality system was developed. First, a pair of stereo skin images obtained by a stereo camera generates a disparity map of skin wrinkles. Second, the generated disparity map is sent to a tactile rendering algorithm that computes a reaction force according to the user's interaction with the skin image. We first obtained a stereo image of skin wrinkles from the in vivo stereo imaging system, which has a baseline of 50.8 μm, and obtained the disparity map with a graph-cuts algorithm. The left image is displayed on the monitor to enable the user to recognize the location visually. The disparity map of the skin wrinkle image sends skin wrinkle information as a tactile response to the user through a haptic device. We successfully developed a tactile feedback system for virtual skin wrinkle simulation by means of a commercialized haptic device that provides the user with a single point of contact to feel the surface roughness of a virtual skin sample. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
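Depth recovered from a disparity map like the one described follows the standard stereo triangulation relation depth = f·B/d; a minimal sketch (the focal length and baseline values below are invented for illustration, not this system's calibration):

```python
def disparity_to_depth(disparity_px, focal_length_px, baseline):
    """Standard stereo triangulation: depth = f * B / d, where f is the
    focal length in pixels, B the camera baseline, and d the disparity in
    pixels. The result is in the same length unit as `baseline`."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline / disparity_px

# Hypothetical calibration: 1000 px focal length, 0.05 m baseline
depth = disparity_to_depth(10.0, 1000.0, 0.05)
```

In the tactile-rendering step, the resulting per-pixel height field would drive the reaction force returned through the haptic device.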

  16. Real-time surgery simulation of intracranial aneurysm clipping with patient-specific geometries and haptic feedback

    NASA Astrophysics Data System (ADS)

    Fenz, Wolfgang; Dirnberger, Johannes

    2015-03-01

Providing suitable training for aspiring neurosurgeons is becoming more and more problematic. The increasing popularity of the endovascular treatment of intracranial aneurysms leads to a lack of simple surgical situations for clipping operations, leaving mainly the complex cases, which present even experienced surgeons with a challenge. To alleviate this situation, we have developed a training simulator with haptic interaction allowing trainees to practice virtual clipping surgeries on real patient-specific vessel geometries. By using specialized finite element method (FEM) algorithms (fast finite element method, matrix condensation) combined with GPU acceleration, we can achieve the necessary frame rate for smooth real-time interaction with the detailed models needed for a realistic simulation of the vessel wall deformation caused by clamping with surgical clips. Vessel wall geometries for typical training scenarios were obtained from 3D-reconstructed medical image data, while for the instruments (clipping forceps, various types of clips, suction tubes) we use models provided by the manufacturer Aesculap AG. Collisions between vessel and instruments have to be continuously detected and transformed into corresponding boundary conditions and feedback forces, calculated using a contact plane method. After a training session, the achieved result can be assessed based on various criteria, including a simulation of the residual blood flow into the aneurysm. Rigid models of the surgical access and surrounding brain tissue, plus the coupling of a real forceps to the haptic input device, further increase the realism of the simulation.

  17. Fast Physically Accurate Rendering of Multimodal Signatures of Distributed Fracture in Heterogeneous Materials.

    PubMed

    Visell, Yon

    2015-04-01

    This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
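The inverse transform method cited here is a general random-sampling technique: apply the inverse of the target cumulative distribution function to a uniform variate. A minimal example drawing exponential variates (a standard textbook case, not the paper's fracture-lattice construction) is:

```python
import math
import random

def sample_exponential(rate, rng=random):
    """Inverse transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / rate has CDF F(x) = 1 - exp(-rate * x), i.e. it is
    exponentially distributed with the given rate."""
    u = rng.random()
    return -math.log(1.0 - u) / rate
```

The same recipe extends to any distribution whose CDF can be inverted, which is what allows the paper's jump process to be generated purely in the time domain.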

  18. Geometric modeling of the temporal bone for cochlea implant simulation

    NASA Astrophysics Data System (ADS)

    Todd, Catherine A.; Naghdy, Fazel; O'Leary, Stephen

    2004-05-01

    The first stage in the development of a clinically valid surgical simulator for training otologic surgeons in performing cochlea implantation is presented. For this purpose, a geometric model of the temporal bone has been derived from a cadaver specimen using the biomedical image processing software package Analyze (AnalyzeDirect, Inc) and its three-dimensional reconstruction is examined. Simulator construction begins with registration and processing of a Computer Tomography (CT) medical image sequence. Important anatomical structures of the middle and inner ear are identified and segmented from each scan in a semi-automated threshold-based approach. Linear interpolation between image slices produces a three-dimensional volume dataset: the geometrical model. Artefacts are effectively eliminated using a semi-automatic seeded region-growing algorithm and unnecessary bony structures are removed. Once validated by an Ear, Nose and Throat (ENT) specialist, the model may be imported into the Reachin Application Programming Interface (API) (Reachin Technologies AB) for visual and haptic rendering associated with a virtual mastoidectomy. Interaction with the model is realized with haptics interfacing, providing the user with accurate torque and force feedback. Electrode array insertion into the cochlea will be introduced in the final stage of design.
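The semi-automatic seeded region-growing step used to clean up the segmentation can be sketched as a 4-connected flood fill over an intensity window; the grid, seed, and thresholds below are toy values for illustration:

```python
from collections import deque

def region_grow(image, seed, low, high):
    """Seeded region growing: starting from `seed`, collect every
    4-connected pixel whose intensity falls inside [low, high]."""
    rows, cols = len(image), len(image[0])
    region, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if low <= image[r][c] <= high:
            region.add((r, c))
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

# Toy intensity grid: the isolated bright pixel is not reached from the seed
image = [[5, 5, 0],
         [5, 0, 0],
         [0, 0, 5]]
region = region_grow(image, (0, 0), 4, 6)
```

Seeding inside an artifact and growing within its intensity range marks exactly the connected voxels to remove, which is why the approach works well for eliminating disconnected bony structures.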

  19. Study on real-time force feedback for a master-slave interventional surgical robotic system.

    PubMed

    Guo, Shuxiang; Wang, Yuan; Xiao, Nan; Li, Youxiang; Jiang, Yuhua

    2018-04-13

    In robot-assisted catheterization, haptic feedback is important, but is currently lacking. In addition, conventional interventional surgical robotic systems typically employ a master-slave architecture with an open-loop force feedback, which results in inaccurate control. We develop herein a novel real-time master-slave (RTMS) interventional surgical robotic system with a closed-loop force feedback that allows a surgeon to sense the true force during remote operation, provide adequate haptic feedback, and improve control accuracy in robot-assisted catheterization. As part of this system, we also design a unique master control handle that measures the true force felt by a surgeon, providing the basis for the closed-loop control of the entire system. We use theoretical and empirical methods to demonstrate that the proposed RTMS system provides a surgeon (using the master control handle) with a more accurate and realistic force sensation, which subsequently improves the precision of the master-slave manipulation. The experimental results show a substantial increase in the control accuracy of the force feedback and an increase in operational efficiency during surgery.
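The closed-loop force feedback described here could, in its simplest form, be a PI loop that drives the slave actuator so the measured force tracks the force sensed at the master handle; the gains and time step below are illustrative assumptions, not the authors' controller:

```python
def pi_force_step(target, measured, integral, kp=2.0, ki=0.5, dt=0.001):
    """One step of a PI force loop (run at e.g. 1 kHz): the command drives
    the actuator so `measured` tracks `target`. kp, ki, and dt are assumed
    illustrative values."""
    error = target - measured
    integral += error * dt                 # accumulate error for the I term
    command = kp * error + ki * integral
    return command, integral

cmd, integ = pi_force_step(1.0, 0.0, 0.0)  # first step after a 1 N step input
```

Unlike an open-loop scheme, the measured-force feedback path corrects for friction and actuator nonlinearity, which is the accuracy gain the abstract reports.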

  20. Neodymium:YAG laser cutting of intraocular lens haptics in vitro and in vivo.

    PubMed

    Feder, J M; Rosenberg, M A; Farber, M D

    1989-09-01

Various complications following intraocular lens (IOL) surgery result in explantation of the lenses. Haptic fibrosis may necessitate cutting the IOL haptics prior to removal. In this study we used the neodymium:YAG (Nd:YAG) laser to cut polypropylene and poly(methyl methacrylate) (PMMA) haptics in vitro and in rabbit eyes. In vitro we were able to cut 100% of both haptic types successfully (28 PMMA and 30 polypropylene haptics). In rabbit eyes we were able to cut 50% of the PMMA haptics and 43% of the polypropylene haptics. Poly(methyl methacrylate) haptics were easier to cut in vitro and in vivo than polypropylene haptics, requiring fewer shots for transection. Complications of Nd:YAG laser use frequently interfered with haptic transections in rabbit eyes. Haptic transection may be more easily accomplished in human eyes.

  1. Spatial Visualization by Realistic 3D Views

    ERIC Educational Resources Information Center

    Yue, Jianping

    2008-01-01

    In this study, the popular Purdue Spatial Visualization Test-Visualization by Rotations (PSVT-R) in isometric drawings was recreated with CAD software that allows 3D solid modeling and rendering to provide more realistic pictorial views. Both the original and the modified PSVT-R tests were given to students and their scores on the two tests were…

  2. HDlive rendering images of the fetal stomach: a preliminary report.

    PubMed

    Inubashiri, Eisuke; Abe, Kiyotaka; Watanabe, Yukio; Akutagawa, Noriyuki; Kuroki, Katumaru; Sugawara, Masaki; Maeda, Nobuhiko; Minami, Kunihiro; Nomura, Yasuhiro

    2015-01-01

This study aimed to show reconstruction of the fetal stomach using the HDlive rendering mode in ultrasound. Seventeen healthy singleton fetuses at 18-34 weeks' gestational age were observed using the HDlive rendering mode of ultrasound in utero. In all of the fetuses, we identified specific spatial structures, including macroscopic anatomical features (e.g., the pylorus, cardia, fundus, and greater curvature) of the fetal stomach, using the HDlive rendering mode. In particular, HDlive rendering images showed remarkably fine details that appeared as if they were being viewed under an endoscope, with visible rugal folds after 27 weeks' gestational age. Our study suggests that the HDlive rendering mode can be used as an additional method for evaluating the fetal stomach. The HDlive rendering mode shows detailed 3D structural images and anatomically realistic images of the fetal stomach. This technique may be effective in prenatal diagnosis for examining detailed information of fetal organs.

  3. Complex adaptation-based LDR image rendering for 3D image reconstruction

    NASA Astrophysics Data System (ADS)

    Lee, Sung-Hak; Kwon, Hyuk-Ju; Sohng, Kyu-Ik

    2014-07-01

A low-dynamic-range tone-compression technique is developed for realistic image rendering that can make three-dimensional (3D) images appear similar to realistic scenes by overcoming the brightness dimming of the 3D display mode. The 3D surround provides varying conditions for image quality, illuminant adaptation, contrast, gamma, color, sharpness, and so on. In general, gain/offset adjustment, gamma compensation, and histogram equalization perform well in contrast compression; however, as a result of signal saturation and clipping effects, image details are removed and information is lost in bright and dark areas. Thus, an enhanced image-mapping technique is proposed based on space-varying image compression. The performance of contrast compression is enhanced with complex adaptation in a 3D viewing surround combining global and local adaptation. Evaluating local image rendering in view of tone and color expression, noise reduction, and edge compensation confirms that the proposed 3D image-mapping model can compensate for the loss of image quality in the 3D mode.
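The gain/offset adjustment and gamma compensation named in the abstract amount to a per-pixel global mapping; a minimal sketch follows, with the clipping step marked, since that saturation is exactly where detail in bright and dark areas is lost (parameter values are generic defaults, not the paper's):

```python
def tone_map(value, gain=1.0, offset=0.0, gamma=2.2):
    """Gain/offset adjustment followed by gamma compensation on a
    normalized [0, 1] intensity. The clip models signal saturation."""
    v = gain * value + offset
    v = min(max(v, 0.0), 1.0)        # saturation/clipping removes detail here
    return v ** (1.0 / gamma)        # gamma compensation brightens midtones

mid = tone_map(0.25, gamma=2.0)
```

The space-varying approach the paper proposes replaces these global constants with locally adapted parameters, so bright and dark regions are compressed with different curves.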

  4. The Formation of Teacher Work Teams under Adverse Conditions: Towards a More Realistic Scenario for Schools in Distress

    ERIC Educational Resources Information Center

    Mintrop, Rick; Charles, Jessica

    2017-01-01

    Group formation studies are rare in the literature on teacher professional learning communities (PLCs). But they are needed to render realistic scenarios and design interventions for practitioners who work in schools where teachers encounter distress and social adversity. Under these conditions, we may need approaches to PLC development that are…

  5. Graphic and haptic simulation for transvaginal cholecystectomy training in NOTES.

    PubMed

    Pan, Jun J; Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Li, Bai C; Sankaranarayanan, Ganesh; Roberts, Kurt; Schwaitzberg, Steven; De, Suvranu

    2016-04-01

Natural Orifice Transluminal Endoscopic Surgery (NOTES) is an emerging surgical technique that usually requires a long learning curve for surgeons. Virtual reality (VR) medical simulators with visual and haptic feedback can offer an efficient, cost-effective, and risk-free alternative to traditional training approaches. Motivated by this, we developed the first virtual reality simulator for transvaginal cholecystectomy in NOTES (VTEST™). This VR-based surgical simulator aims to simulate the hybrid NOTES cholecystectomy. We use a 6-DOF haptic device and a tracking sensor as the core hardware components of the simulator. For software, an innovative approach based on inner spheres is presented to deform the organs in real time. To handle the frequent collisions between soft tissue and surgical instruments, an adaptive GPU-based collision detection method is designed and implemented. To give a realistic visual rendering of gallbladder fat tissue removal by cautery hook, a multi-layer hexahedral model is presented to simulate the electric dissection of fat tissue. Experimental results show that trainees can operate in real time with a high degree of stability and fidelity. A preliminary study was also performed to evaluate the realism and usefulness of this hybrid NOTES simulator. This prototype simulation system has been verified by surgeons through a pilot study; several aspects of its visual performance and utility were rated fairly high by the participants during testing. It exhibits the potential to improve trainees' surgical skills and effectively shorten their learning curve. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. New impressive capabilities of SE-workbench for EO/IR real-time rendering of animated scenarios including flares

    NASA Astrophysics Data System (ADS)

    Le Goff, Alain; Cathala, Thierry; Latger, Jean

    2015-10-01

To provide technical assessments of EO/IR flares and self-protection systems for aircraft, DGA Information Superiority resorts to synthetic image generation to model the operational battlefield of an aircraft as viewed by EO/IR threats. For this purpose, it extended the SE-Workbench suite from OKTAL-SE with functionalities to predict a realistic aircraft IR signature and is now integrating SE-FAST-IR, the real-time EO/IR rendering engine of SE-Workbench. This engine is a set of physics-based software and libraries for preparing and visualizing a 3D scene in the EO/IR domain, and it takes advantage of recent advances in GPU computing techniques. Recent evolutions mainly concern the realistic, physically based rendering of reflections; the rendering of both radiative and thermal shadows; the use of procedural techniques for managing and rendering very large terrains; the implementation of image-based rendering for dynamic interpolation of static plume signatures; and, for aircraft, the dynamic interpolation of thermal states. The next step is the representation of the spectral, directional, spatial, and temporal signature of flares by Lacroix Defense using OKTAL-SE technology. This representation is prepared from experimental data acquired during windblast tests and high-speed track tests. It is based on particle-system mechanisms to model the different components of a flare. The validation of a flare model will comprise a simulation of real trials and a comparison of simulation outputs with experimental results concerning the flare signature and, above all, the behavior of the stimulated threat.

  7. Validation of the PASSPORT V2 training environment for arthroscopic skills.

    PubMed

    Stunt, J J; Kerkhoffs, G M M J; Horeman, T; van Dijk, C N; Tuijthof, G J M

    2016-06-01

Virtual reality simulators used in the education of orthopaedic residents often lack realistic haptic feedback. To solve this, the PASSPORT (Practice Arthroscopic Surgical Skills for Perfect Operative Real-life Treatment) simulator was developed and subjected to fundamental changes: improved realism and a revised user interface. The purpose of this study was to demonstrate its face and construct validity. Thirty-one participants were divided into three groups with different levels of arthroscopic experience. Participants answered questions regarding general information and the outer appearance of the simulator for face validity. Construct validity was assessed with one standardized navigation task, which was timed. Face validity, educational value, and user-friendliness were determined with two representative exercises and by asking participants to fill out a questionnaire. A value of 7 or greater was considered sufficient. Construct validity was demonstrated between experts and novices. Median task time for the fifth trial was 55 s (range 17-139 s) for the novices, 33 s (range 17-59 s) for the intermediates, and 26 s (range 14-52 s) for the experts. Median task times did not differ significantly between novices and intermediates for three of the trials, nor between intermediates and experts for any trial. Face validity, educational value, and user-friendliness were perceived as sufficient (median >7). The presence of realistic tactile feedback was considered the biggest asset of the simulator. Proper preparation for arthroscopic operations will increase the quality of real-life surgery and patients' safety. The PASSPORT simulator can assist in achieving this, as it showed construct and face validity, and its physical nature offered adequate haptic feedback during training. This indicates that PASSPORT has the potential to evolve into a valuable training modality.

  8. The effects of perceptual priming on 4-year-olds' haptic-to-visual cross-modal transfer.

    PubMed

    Kalagher, Hilary

    2013-01-01

    Four-year-old children often have difficulty visually recognizing objects that were previously experienced only haptically. This experiment attempts to improve their performance in these haptic-to-visual transfer tasks. Sixty-two 4-year-old children participated in priming trials in which they explored eight unfamiliar objects visually, haptically, or visually and haptically together. Subsequently, all children participated in the same haptic-to-visual cross-modal transfer task. In this task, children haptically explored the objects that were presented in the priming phase and then visually identified a match from among three test objects, each matching the object on only one dimension (shape, texture, or color). Children in all priming conditions predominantly made shape-based matches; however, the most shape-based matches were made in the Visual and Haptic condition. All kinds of priming provided the necessary memory traces upon which subsequent haptic exploration could build a strong enough representation to enable subsequent visual recognition. Haptic exploration patterns during the cross-modal transfer task are discussed and the detailed analyses provide a unique contribution to our understanding of the development of haptic exploratory procedures.

  9. Clinical and optical intraocular performance of rotationally asymmetric multifocal IOL plate-haptic design versus C-loop haptic design.

    PubMed

    Alió, Jorge L; Plaza-Puche, Ana B; Javaloy, Jaime; Ayala, María José; Vega-Estrada, Alfredo

    2013-04-01

To compare the visual and intraocular optical quality outcomes with different designs of the refractive rotationally asymmetric multifocal intraocular lens (MFIOL) (Lentis Mplus; Oculentis GmbH, Berlin, Germany) with or without capsular tension ring (CTR) implantation. One hundred thirty-five consecutive eyes of 78 patients with cataract (ages 36 to 82 years) were divided into three groups: 43 eyes implanted with the C-Loop haptic design without CTR (C-Loop haptic only group); 47 eyes implanted with the C-Loop haptic design with CTR (C-Loop haptic with CTR group); and 45 eyes implanted with the plate-haptic design (plate-haptic group). Visual acuity, contrast sensitivity, defocus curve, and ocular and intraocular optical quality were evaluated at 3 months postoperatively. Significant differences in the postoperative sphere were found (P = .01), with a more myopic postoperative refraction for the C-Loop haptic only group. No significant differences were detected in photopic and scotopic contrast sensitivity among groups (P ⩾ .05). Significantly better visual acuities were present in the C-Loop haptic with CTR group for the defocus levels of -2.0, -1.5, -1.0, and -0.50 D (P ⩽ .03). Statistically significant differences among groups were found in total intraocular root mean square (RMS), high-order intraocular RMS, and intraocular coma-like RMS aberrations (P ⩽ .04), with lower values from the plate-haptic group. The plate-haptic design and the C-Loop haptic design with CTR implantation both allow good visual rehabilitation. However, better refractive predictability and intraocular optical quality were obtained with the plate-haptic design without CTR implantation. The plate-haptic design seems to be a better design to support rotationally asymmetric MFIOL optics. Copyright 2013, SLACK Incorporated.

  10. Limited value of haptics in virtual reality laparoscopic cholecystectomy training.

    PubMed

    Thompson, Jonathan R; Leonard, Anthony C; Doarn, Charles R; Roesch, Matt J; Broderick, Timothy J

    2011-04-01

    Haptics is an expensive addition to virtual reality (VR) simulators, and the added value to training has not been proven. This study evaluated the benefit of haptics in VR laparoscopic surgery training for novices. The Simbionix LapMentor II haptic VR simulator was used in the study. Thirty-three laparoscopic novice students were randomly assigned to one of three groups: control, haptics-trained, or nonhaptics-trained. The control group performed nine basic laparoscopy tasks and four cholecystectomy procedural tasks one time with haptics engaged at the default setting. The haptics group was trained to proficiency in the basic tasks and then performed each of the procedural tasks one time with haptics engaged. The nonhaptics group used the same training protocol except that haptics was disengaged. The proficiency values used were previously published expert values. Each group was assessed in the performance of 10 laparoscopic cholecystectomies (alternating with and without haptics). Performance was measured via automatically collected simulator data. The three groups exhibited no differences in terms of sex, education level, hand dominance, video game experience, surgical experience, and nonsurgical simulator experience. The number of attempts required to reach proficiency did not differ between the haptics- and nonhaptics-training groups. The haptics and nonhaptics groups exhibited no difference in performance. Both training groups outperformed the control group in number of movements as well as path length of the left instrument. In addition, the nonhaptics group outperformed the control group in total time. Haptics does not improve the efficiency or effectiveness of LapMentor II VR laparoscopic surgery training. The limited benefit and the significant cost of haptics suggest that haptics should not be included routinely in VR laparoscopic surgery training.

  11. Teleoperation System with Hybrid Pneumatic-Piezoelectric Actuation for MRI-Guided Needle Insertion with Haptic Feedback

    PubMed Central

    Shang, Weijian; Su, Hao; Li, Gang; Fischer, Gregory S.

    2014-01-01

    This paper presents a surgical master-slave teleoperation system for percutaneous interventional procedures under continuous magnetic resonance imaging (MRI) guidance. This system consists of a piezoelectrically actuated slave robot for needle placement with an integrated fiber-optic force sensor based on the Fabry-Perot interferometry (FPI) sensing principle. The sensor flexure is optimized and embedded into the slave robot for measuring needle insertion force. A novel, compact opto-mechanical FPI sensor interface is integrated into an MRI robot control system. By leveraging the complementary features of pneumatic and piezoelectric actuation, a pneumatically actuated haptic master robot is also developed to render the force associated with needle placement interventions to the clinician. An aluminum load cell is implemented and calibrated to close the impedance control loop of the master robot. A force-position control algorithm is developed to control the hybrid actuated system. Teleoperated needle insertion is demonstrated under live MR imaging, where the slave robot resides in the scanner bore and the user manipulates the master beside the patient outside the bore. Force and position tracking results of the master-slave robot are demonstrated to validate the tracking performance of the integrated system. It has a position tracking error of 0.318 mm and a sine-wave force tracking error of 2.227 N. PMID:25126446
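
    The force-position scheme described above can be sketched as a minimal bilateral loop: the slave tracks the master's commanded position while the sensed needle force is reflected back to the operator. This is an illustrative sketch under assumed dynamics, not the authors' controller; the function name and all gains are hypothetical.

```python
# Hedged sketch of one bilateral force-position control cycle:
# position flows master -> slave (PD tracking), force flows
# slave -> master (direct reflection). Gains are invented.

def master_slave_step(x_master, x_slave, f_needle, dt,
                      kp=50.0, kd=2.0, v_slave=0.0, force_scale=1.0):
    """One control cycle of a simplified teleoperation loop."""
    # Slave side: PD tracking of the master position command
    error = x_master - x_slave
    accel = kp * error - kd * v_slave          # unit mass assumed
    v_slave += accel * dt
    x_slave += v_slave * dt
    # Master side: render the measured needle insertion force
    f_master = -force_scale * f_needle
    return x_slave, v_slave, f_master
```

In the real system the backward force channel is closed around the load cell via impedance control rather than applied open-loop as here.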

  13. Exploring the simulation requirements for virtual regional anesthesia training

    NASA Astrophysics Data System (ADS)

    Charissis, V.; Zimmer, C. R.; Sakellariou, S.; Chan, W.

    2010-01-01

    This paper presents an investigation of the simulation requirements for virtual regional anaesthesia training. To this end we have developed a prototype human-computer interface designed to facilitate Virtual Reality (VR) augmented educational tactics for regional anaesthesia training. The proposed interface system aims to complement nerve-blocking techniques. The system operates in a real-time 3D environment, presenting anatomical information and enabling the user to explore the spatial relations of different human parts without any physical constraints. Furthermore, the proposed system aims to help trainee anaesthetists build a mental, three-dimensional map of the anatomical elements and their relationship to the ultrasound imaging used for navigation of the anaesthetic needle. Opting for a sophisticated approach to interaction, the interface elements are based on simplified visual representations of real objects and can be operated through haptic devices and surround auditory cues. This paper discusses the challenges involved in the HCI design, introduces the visual components of the interface, and presents a tentative plan of future work, which involves the development of realistic haptic feedback and various regional anaesthesia training scenarios.

  14. Self-Control of Haptic Assistance for Motor Learning: Influences of Frequency and Opinion of Utility

    PubMed Central

    Williams, Camille K.; Tseung, Victrine; Carnahan, Heather

    2017-01-01

    Studies of self-controlled practice have shown benefits when learners controlled the feedback schedule, the use of assistive devices, and task difficulty, with benefits attributed to information-processing and motivational advantages of self-control. Although haptic assistance serves as feedback, aids task performance, and modifies task difficulty, researchers have yet to explore whether self-control over haptic assistance could be beneficial for learning. We explored whether self-control of haptic assistance would be beneficial for learning a tracing task. Self-controlled participants selected the practice blocks on which they would receive haptic assistance, while participants in a yoked group received haptic assistance on blocks determined by a matched self-controlled participant. We inferred learning from performance on retention tests without haptic assistance. From qualitative analysis of open-ended questions related to rationales for/experiences of the haptic assistance that was chosen/provided, themes emerged regarding participants’ views of the utility of haptic assistance for performance and learning. Results showed that learning was directly impacted by the frequency of haptic assistance (for self-controlled participants only) and by participants’ view of haptic assistance. Furthermore, self-controlled participants’ views were significantly associated with their requested haptic assistance frequency. We discuss these findings as further support for the beneficial role of self-controlled practice for motor learning. PMID:29255438

  15. [Postoperative ultrasound biomicroscopic evaluation of the haptic position of black diaphragm posterior chamber lenses in congenital and traumatic aniridia in comparison with gonioscopy].

    PubMed

    Schweykart, N; Reinhard, T; Engelhardt, S; Sundmacher, R

    1999-06-01

    Ultrasound biomicroscopy (UBM) allows determination of the haptic position of posterior chamber lenses (PCLs) in relation to adjacent structures. In transsclerally sutured PCLs, comparison between haptic positions localized intraoperatively by endoscopy and postoperatively by UBM showed a correspondence of only 81%. The differing localization of 19% of the examined haptic positions was attributed to postoperative dislocation, without proof of this assumption. The purpose of this study was therefore to correlate UBM results with haptic positions determined simultaneously by gonioscopy in aniridia after black diaphragm PCL implantation. The haptic positions of black diaphragm PCL implants in 20 patients with congenital and 13 patients with traumatic aniridia were determined by UBM (50-MHz probe) and gonioscopy 44.4 (6-75) months postoperatively. 39/66 haptic positions could be localized by gonioscopy as well as by UBM. 38 haptics (97.4%) showed the same position with both examination techniques. Determination of the haptic position by one of the two examination techniques was impossible for 27/66 haptics (11 haptics in gonioscopy, 16 haptics in UBM). The reasons were primarily haptic position behind iris remnants and corneal opacities in gonioscopy, and scarring of the ciliary body in UBM. The validity of UBM in localization of PCLs was confirmed gonioscopically, which also confirms our prior assumption of postoperative displacement of IOL haptics after transscleral suturing in about 20% of cases. Scarring of the ciliary body was the most important obstacle in the determination of PCL haptic positions in relation to adjacent structures.

  16. Techniques for efficient, real-time, 3D visualization of multi-modality cardiac data using consumer graphics hardware.

    PubMed

    Levin, David; Aladl, Usaf; Germano, Guido; Slomka, Piotr

    2005-09-01

    We exploit consumer graphics hardware to perform real-time processing and visualization of high-resolution, 4D cardiac data. We have implemented real-time, realistic volume rendering, interactive 4D motion segmentation of cardiac data, visualization of multi-modality cardiac data and 3D display of multiple series cardiac MRI. We show that an ATI Radeon 9700 Pro can render a 512x512x128 cardiac Computed Tomography (CT) study at 0.9 to 60 frames per second (fps) depending on rendering parameters and that 4D motion based segmentation can be performed in real-time. We conclude that real-time rendering and processing of cardiac data can be implemented on consumer graphics cards.

  17. [Visual cuing effect for haptic angle judgment].

    PubMed

    Era, Ataru; Yokosawa, Kazuhiko

    2009-08-01

    We investigated whether visual cues are useful for judging haptic angles. Participants explored three-dimensional angles with a virtual haptic feedback device. For visual cues, we used a location cue, which synchronized with haptic exploration, and a space cue, which specified the haptic space. In Experiment 1, angles were judged more accurately with both cues, but were overestimated with a location cue only. In Experiment 2, the visual cues emphasized depth; overestimation with location cues occurred, but space cues had no influence. The results showed that (a) when both cues are presented, haptic angles are judged more accurately; (b) location cues facilitate only motion information, not depth information; and (c) haptic angles tend to be overestimated when there is both haptic and visual information.

  18. Perception of synchronization errors in haptic and visual communications

    NASA Astrophysics Data System (ADS)

    Kameyama, Seiji; Ishibashi, Yutaka

    2006-10-01

    This paper deals with a system which conveys the haptic sensation experienced by a user to a remote user. In the system, the user controls a haptic interface device with another, remote haptic interface device while watching video. Haptic media and video of a real object which the user is touching are transmitted to the other user. By subjective assessment, we investigate the allowable and imperceptible ranges of synchronization error between haptic media and video. We employ four real objects and ask each subject whether the synchronization error is perceived for each object. Assessment results show that the synchronization error is more easily perceived when the haptic media are ahead of the video than when they are behind it.

  19. Active skin as new haptic interface

    NASA Astrophysics Data System (ADS)

    Vuong, Nguyen Huu Lam; Kwon, Hyeok Yong; Chuc, Nguyen Huu; Kim, Duksang; An, Kuangjun; Phuc, Vuong Hong; Moon, Hyungpil; Koo, Jachoon; Lee, Youngkwan; Nam, Jae-Do; Choi, Hyouk Ryeol

    2010-04-01

    In this paper, we present a new haptic interface, called "active skin", which combines a tactile sensor and a tactile stimulator in a single haptic cell, with multiple haptic cells embedded in a dielectric elastomer. The active skin generates a wide variety of haptic sensations in response to touch by synchronizing the sensor and the stimulator. The design of the haptic cell is derived via iterative analysis and design procedures. A fabrication method dedicated to the proposed device is investigated, and a controller to drive multiple haptic cells is developed. In addition, several experiments are performed to evaluate the performance of the active skin.

  20. A comparison of haptic material perception in blind and sighted individuals.

    PubMed

    Baumgartner, Elisabeth; Wiebel, Christiane B; Gegenfurtner, Karl R

    2015-10-01

    We investigated material perception in blind participants to explore the influence of visual experience on material representations and the relationship between visual and haptic material perception. In a previous study with sighted participants, we had found participants' visual and haptic judgments of material properties to be very similar (Baumgartner, Wiebel, & Gegenfurtner, 2013). In a categorization task, however, visual exploration had led to higher categorization accuracy than haptic exploration. Here, we asked congenitally blind participants to explore different materials haptically and rate several material properties in order to assess the role of the visual sense for the emergence of haptic material perception. Principal components analyses combined with a Procrustes superimposition showed that the material representations of blind and blindfolded sighted participants were highly similar. We also measured haptic categorization performance, which was equal for the two groups. We conclude that haptic material representations can emerge independently of visual experience, and that there are no advantages for either group of observers in haptic categorization. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Topography compensation for haptization of a mesh object and its stiffness distribution.

    PubMed

    Yim, Sunghoon; Jeon, Seokhee; Choi, Seungmoon

    2015-01-01

    This work was motivated by the need for perceptualizing nano-scale scientific data, e.g., those acquired by a scanning probe microscope, where collocated topography and stiffness distribution of a surface can be measured. Previous research showed that when the topography of a surface with spatially varying stiffness is rendered using the conventional penalty-based haptic rendering method, the topography perceived by the user could be significantly distorted from its original model. In the worst case, a higher region with a smaller stiffness value can be perceived to be lower than a lower region with a larger stiffness value. This problem was explained by the theory of force constancy: the user tends to maintain an invariant contact force when s/he strokes the surface to perceive its topography. In this paper, we present a haptization algorithm that can render the shape of a mesh surface and its stiffness distribution with high perceptual accuracy. Our algorithm adaptively changes the surface topography on the basis of the force constancy theory to deliver adequate shape information to the user while preserving the stiffness perception. We also evaluated the performance of the proposed haptization algorithm in comparison to the constraint-based algorithm by examining relevant proximal stimuli and carrying out a user experiment. Results demonstrated that our algorithm could improve the perceptual accuracy of shape and reduce the exploration time, thereby leading to more accurate and efficient haptization.
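
    The distortion described above, and the spirit of the compensation, can be sketched under the simplest assumptions. With penalty-based rendering the force is F = k(x) · (h(x) − z), so a user stroking at constant force F0 rides at depth F0/k(x) below the modeled surface, and the felt profile is h(x) − F0/k(x). This is not the paper's algorithm, only a minimal illustration of the force-constancy argument; both function names are invented.

```python
# Sketch of topography distortion under force constancy on a
# penalty-rendered surface, and a naive compensation (hypothetical).

def felt_profile(h, k, f0):
    """Finger path when the user holds contact force f0 on a surface
    with heights h and spatially varying stiffness k."""
    return [hi - f0 / ki for hi, ki in zip(h, k)]

def compensated_topography(h, k, f0):
    """Raise each point by the expected penetration f0/k so the felt
    profile matches the modeled heights."""
    return [hi + f0 / ki for hi, ki in zip(h, k)]
```

For example, with heights [1.0, 0.8] and stiffnesses [100, 300] at contact force 30, both points feel equally high (0.7): the higher but softer region loses its height advantage, which is exactly the distortion the paper targets. Applying the compensated topography restores the felt heights to [1.0, 0.8].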

  2. Virtual reality training and assessment in laparoscopic rectum surgery.

    PubMed

    Pan, Jun J; Chang, Jian; Yang, Xiaosong; Liang, Hui; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas

    2015-06-01

    Virtual-reality (VR) based simulation techniques offer an efficient and low-cost alternative to conventional surgery training. This article describes a VR training and assessment system for laparoscopic rectum surgery. To give a realistic visual rendering of the interaction between membrane tissue and surgery tools, a generalized-cylinder-based collision detection method and a multi-layer mass-spring model are presented. A dynamic assessment model is also designed for hierarchical training evaluation. With this simulator, trainees can operate on the virtual rectum with both visual and haptic sensation feedback simultaneously. The system also offers surgeons instructions in real time when improper manipulation occurs. The simulator has been tested and evaluated by ten subjects. This prototype system has been verified by colorectal surgeons through a pilot study. They believe the visual performance and the tactile feedback are realistic. It exhibits the potential to effectively improve the surgical skills of trainee surgeons and significantly shorten their learning curve. Copyright © 2014 John Wiley & Sons, Ltd.
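
    The core update of a mass-spring tissue model of the kind mentioned above can be sketched in one dimension: neighboring masses are connected by springs, and each free node integrates spring and damping forces explicitly. This is a generic textbook sketch, not the authors' multi-layer model; all parameters are invented.

```python
# Hedged sketch: explicit-Euler update of a 1D mass-spring chain,
# the basic building block of mass-spring deformation models.

def mass_spring_step(x, v, rest, k, c, m, dt, fixed):
    """Advance node positions x and velocities v by one time step.
    rest: spring rest length, k: stiffness, c: damping, m: node mass,
    fixed: set of node indices pinned in place (boundary conditions)."""
    n = len(x)
    f = [0.0] * n
    for i in range(n - 1):                  # springs between neighbors
        stretch = (x[i + 1] - x[i]) - rest
        f[i] += k * stretch
        f[i + 1] -= k * stretch
    for i in range(n):
        if i in fixed:
            continue
        a = (f[i] - c * v[i]) / m           # damped Newton integration
        v[i] += a * dt
        x[i] += v[i] * dt
    return x, v
```

A stretched chain relaxes toward its rest length over repeated steps; real simulators use 2D/3D meshes, multiple layers, and stabler integrators, but the per-step structure is the same.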

  3. Spatiotemporal Visualization of Time-Series Satellite-Derived CO2 Flux Data Using Volume Rendering and Gpu-Based Interpolation on a Cloud-Driven Digital Earth

    NASA Astrophysics Data System (ADS)

    Wu, S.; Yan, Y.; Du, Z.; Zhang, F.; Liu, R.

    2017-10-01

    The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.
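
    The temporal half of the interpolation idea is simple to state: to animate smoothly between two successive satellite-derived volumes, each rendered frame blends voxel values linearly by the fractional time between them. The sketch below is a CPU stand-in for what the paper implements as a CUDA kernel; the function name is invented.

```python
# Hedged sketch of per-frame temporal interpolation between two
# time-series volumes (flattened voxel arrays), t in [0, 1].

def interpolate_volume(vol_a, vol_b, t):
    """Linear blend: (1 - t) * vol_a + t * vol_b, voxel by voxel."""
    assert len(vol_a) == len(vol_b) and 0.0 <= t <= 1.0
    return [(1.0 - t) * a + t * b for a, b in zip(vol_a, vol_b)]
```

On the GPU the same blend runs in parallel over all voxels each frame, which is why smooth playback of large CO2-flux series remains interactive.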

  4. Creating photo-realistic works in a 3D scene using layers styles to create an animation

    NASA Astrophysics Data System (ADS)

    Avramescu, A. M.

    2015-11-01

    Creating realistic objects in a 3D scene is not an easy task: the model must be highly detailed. With the right techniques and a good reference photo, however, an impressive amount of detail and realism can be achieved, and this article presents detailed methods from which the techniques necessary to make beautiful, realistic objects in a scene can be learned. More precisely, we present how to create a 3D animated scene, mainly using the Pen Tool and Blending Options. The work demonstrates simple ways of using Layer Styles to create convincing shadows, lights, and textures and a realistic sense of three dimensions, and shows how illumination and rendering options can be used to create a realistic effect in a scene. Moreover, this article shows how to create photo-realistic 3D models from a digital image. The present work covers how to use Illustrator paths, texturing, basic lighting and rendering, how to apply textures, and how to parent the building and object components. The same approach can also be used to recreate smaller details or 3D objects from a 2D image. After a critical review stage, we present the architecture of a design method for creating an animation. The aim is a conceptual and methodological tutorial that addresses this issue both scientifically and in practice, including a scientifically grounded model that gives a better understanding of the techniques necessary to create a realistic animation.

  5. Haptics – Touchfeedback Technology Widening the Horizon of Medicine

    PubMed Central

    Kapoor, Shalini; Arora, Pallak; Kapoor, Vikas; Jayachandran, Mahesh; Tiwari, Manish

    2014-01-01

    Haptics, or touch-sense haptic technology, is a major breakthrough in medical and dental interventions. Haptic perception is the process of recognizing objects through touch. Haptic sensations are created by actuators or motors which generate vibrations for the user and are controlled by embedded software integrated into the device. The technology takes advantage of a combination of the somatosensory pattern of the skin and proprioception of hand position. Anatomical and diagnostic knowledge, when combined with this touch-sense technology, has revolutionized medical education. This amalgamation of the worlds of diagnosis and surgical intervention adds precise robotic touch to the skill of the surgeon. A systematic literature review was done using MEDLINE, Google Search, and PubMed. The aim of this article is to introduce the fundamentals of haptic technology, its current applications in medical training and robotic surgeries, the limitations of haptics, and future aspects of haptics in medicine. PMID:24783164

  6. Saving and Reproduction of Human Motion Data by Using Haptic Devices with Different Configurations

    NASA Astrophysics Data System (ADS)

    Tsunashima, Noboru; Yokokura, Yuki; Katsura, Seiichiro

    Recently, there has been increased focus on “haptic recording”; development of a motion-copying system is an efficient method for the realization of haptic recording. Haptic recording involves the saving and reproduction of human motion data on the basis of haptic information. To increase the number of applications of the motion-copying system in various fields, it is necessary to reproduce human motion data by using haptic devices with different configurations. In this study, a method for such haptic recording is developed, in which human motion data are saved and reproduced on the basis of work-space information obtained by coordinate transformation of motor-space information. The validity of the proposed method is demonstrated by experiments. With the proposed method, saving and reproduction of human motion data by using various devices are achieved, which also makes it possible to use haptic recording in various fields.

  7. Training haptic stiffness discrimination: time course of learning with or without visual information and knowledge of results.

    PubMed

    Teodorescu, Kinneret; Bouchigny, Sylvain; Korman, Maria

    2013-08-01

    In this study, we explored the time course of haptic stiffness discrimination learning and how it was affected by two experimental factors, the addition of visual information and/or knowledge of results (KR) during training. Stiffness perception may integrate both haptic and visual modalities. However, in many tasks, the visual field is typically occluded, forcing stiffness perception to depend exclusively on haptic information. No studies to date have addressed the time course of haptic stiffness perceptual learning. Using a virtual environment (VE) haptic interface and a two-alternative forced-choice discrimination task, the haptic stiffness discrimination ability of 48 participants was tested across 2 days. Each day included two haptic test blocks separated by a training block. Additional visual information and/or KR were manipulated between participants during training blocks. Practice repetitions alone induced significant improvement in haptic stiffness discrimination. Between days, accuracy improved slightly, but decision-time performance deteriorated. The addition of visual information and/or KR had only temporary effects on decision time, without affecting the time course of haptic discrimination learning. Learning in haptic stiffness discrimination appears to evolve through at least two distinctive phases: a single training session resulted in both immediate and latent learning. This learning was not affected by the training manipulations inspected. Training skills in VE in spaced sessions can be beneficial for tasks in which haptic perception is critical, such as surgery procedures, when the visual field is occluded. However, training protocols for such tasks should account for the low impact of multisensory information and KR.

  8. Structural impact detection with vibro-haptic interfaces

    NASA Astrophysics Data System (ADS)

    Jung, Hwee-Kwon; Park, Gyuhae; Todd, Michael D.

    2016-07-01

    This paper presents a new sensing paradigm for structural impact detection using vibro-haptic interfaces. The goal of this study is to allow humans to ‘feel’ structural responses (impact, shape changes, and damage) and eventually determine the health condition of a structure. The target applications for this study are aerospace structures, in particular airplane wings. Both hardware and software components are developed to realize the vibro-haptic-based impact detection system. First, L-shape piezoelectric sensor arrays are deployed to measure the acoustic emission data generated by impacts on a wing. Unique haptic signals are then generated by processing the measured acoustic emission data. These haptic signals are wirelessly transmitted to human arms; with the vibro-haptic interface, human pilots could identify impact location, intensity, and the possibility of subsequent damage initiation. The experimental results demonstrate that with the haptic interface, humans could correctly identify such events while reducing false indications of structural conditions, by capitalizing on humans’ classification capability. Several important aspects of this study, including development of haptic interfaces, design of optimal human training strategies, and extension of the haptic capability to structural impact detection, are summarized in this paper.

  9. Haptic wearables as sensory replacement, sensory augmentation and trainer - a review.

    PubMed

    Shull, Peter B; Damian, Dana D

    2015-07-20

    Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed to improve function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded body worn devices that interact with skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications including: rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.

  10. End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change

    NASA Astrophysics Data System (ADS)

    Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro

    This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.
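
    The rate decision at the heart of such a flow controller can be sketched simply: the sender caps the visual-data rate at the estimated available bandwidth left over after the fixed-rate haptic stream, with a safety margin to avoid buffer overflow. This is an illustrative sketch, not the authors' controller; the function name, margin, and units (bits per second) are assumptions.

```python
# Hedged sketch of an end-to-end rate decision for visual data
# sharing a bottleneck with a fixed-rate haptic stream.

def visual_rate(bandwidth_est, haptic_rate, margin=0.9, min_rate=0.0):
    """Cap the visual transmission rate so that visual plus haptic
    traffic stays below a margin of the estimated bandwidth."""
    return max(min_rate, margin * bandwidth_est - haptic_rate)
```

For example, with 10 Mbit/s estimated bandwidth and a 1 Mbit/s haptic stream, the visual rate is capped at 8 Mbit/s; when the estimate drops below the haptic demand, visual traffic is throttled to zero so the real-time haptic packets keep their short sending interval.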

  11. Preserved Haptic Shape Processing after Bilateral LOC Lesions.

    PubMed

    Snow, Jacqueline C; Goodale, Melvyn A; Culham, Jody C

    2015-10-07

    The visual and haptic perceptual systems are understood to share a common neural representation of object shape. A region thought to be critical for recognizing visual and haptic shape information is the lateral occipital complex (LOC). We investigated whether LOC is essential for haptic shape recognition in humans by studying behavioral responses and brain activation for haptically explored objects in a patient (M.C.) with bilateral lesions of the occipitotemporal cortex, including LOC. Despite severe deficits in recognizing objects using vision, M.C. was able to accurately recognize objects via touch. M.C.'s psychophysical response profile to haptically explored shapes was also indistinguishable from controls. Using fMRI, M.C. showed no object-selective visual or haptic responses in LOC, but her pattern of haptic activation in other brain regions was remarkably similar to healthy controls. Although LOC is routinely active during visual and haptic shape recognition tasks, it is not essential for haptic recognition of object shape. The lateral occipital complex (LOC) is a brain region regarded to be critical for recognizing object shape, both in vision and in touch. However, causal evidence linking LOC with haptic shape processing is lacking. We studied recognition performance, psychophysical sensitivity, and brain response to touched objects, in a patient (M.C.) with extensive lesions involving LOC bilaterally. Despite being severely impaired in visual shape recognition, M.C. was able to identify objects via touch and she showed normal sensitivity to a haptic shape illusion. M.C.'s brain response to touched objects in areas of undamaged cortex was also very similar to that observed in neurologically healthy controls. These results demonstrate that LOC is not necessary for recognizing objects via touch. Copyright © 2015 the authors.

  12. Haptic augmentation of science instruction: Does touch matter?

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Minogue, James; Tretter, Thomas R.; Negishi, Atsuko; Taylor, Russell

    2006-01-01

    This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback) combined with computer visualizations influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated haptic desktop device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. The levels of engagement in the instruction and students' attitudes about the instructional program were assessed using a combination of constructed-response and Likert-scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also have influenced the way in which the students constructed their understandings of abstract science concepts.

  13. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face.

    PubMed

    Matsumiya, Kazumichi

    2013-10-01

    Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imaging a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on shared representation underlying cross-modal interactions.

  14. A magnetorheological haptic cue accelerator for manual transmission vehicles

    NASA Astrophysics Data System (ADS)

    Han, Young-Min; Noh, Kyung-Wook; Lee, Yang-Sub; Choi, Seung-Bok

    2010-07-01

    This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. This function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed device can transmit the optimal moment of gear shifting to the driver without requiring the driver's visual attention. As a first step toward this goal, an MR fluid-based haptic device is devised to enable rotary motion of the accelerator pedal. Taking spatial limitations into account, the design parameters are optimally determined using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured, and its field-dependent torque and time response are experimentally evaluated. The manufactured MR haptic cue device is next integrated with the accelerator pedal, and a simple virtual vehicle emulating the operation of a passenger vehicle's engine is constructed and put into communication with the haptic cue device. A feed-forward torque control algorithm for the haptic cue is formulated, and its control performance is experimentally evaluated and presented in the time domain.
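    The feed-forward torque control described in this record can be sketched as inverting an assumed field-torque characteristic of the MR brake: given the cue torque the virtual vehicle requests, solve for the coil current that should reproduce it. The linear model and the constants `t0`, `k`, and `i_max` below are illustrative assumptions, not values from the paper.

```python
def mr_current_for_torque(torque_desired, t0=0.05, k=0.8, i_max=2.0):
    """Feed-forward inversion of a hypothetical linear MR-brake model
    T = t0 + k * I, where t0 is the off-state (viscous) torque in N*m,
    k the torque-per-ampere gain, and i_max the coil current limit in A."""
    if torque_desired <= t0:
        return 0.0  # below the off-state torque: no applied field needed
    return min((torque_desired - t0) / k, i_max)  # saturate at the coil limit
```

In a real device the torque-current curve is nonlinear and rate-dependent, so the inverse map would come from the experimental field-dependent torque characterization mentioned above rather than a fixed linear fit.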

  15. Force and torque modelling of drilling simulation for orthopaedic surgery.

    PubMed

    MacAvelia, Troy; Ghasempoor, Ahmad; Janabi-Sharifi, Farrokh

    2014-01-01

    The advent of haptic simulation systems for orthopaedic surgery procedures has provided surgeons with an excellent tool for training and preoperative planning purposes. This is especially true for procedures involving the drilling of bone, which require a great amount of adroitness and experience due to difficulties arising from vibration and drill bit breakage. One of the potential difficulties with the drilling of bone is the lack of consistent material evacuation from the drill's flutes as the material tends to clog. This clogging leads to significant increases in force and torque experienced by the surgeon. Clogging was observed for feed rates greater than 0.5 mm/s and spindle speeds less than 2500 rpm. The drilling simulation systems that have been created to date do not address the issue of drill flute clogging. This paper presents force and torque prediction models that account for this phenomenon. The two coefficients of friction required by these models were determined via a set of calibration experiments. The accuracy of both models was evaluated by an additional set of validation experiments resulting in average R² regression correlation values of 0.9546 and 0.9209 for the force and torque prediction models, respectively. The resulting models can be adopted by haptic simulation systems to provide a more realistic tactile output.
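    The validation step in this record reports R² values between measured and model-predicted drilling forces and torques. A minimal sketch of that coefficient-of-determination computation (with hypothetical samples in the test, not the paper's data):

```python
def r_squared(measured, predicted):
    """Coefficient of determination R^2 between measured values and
    model predictions: 1 - (residual sum of squares / total sum of squares)."""
    mean = sum(measured) / len(measured)
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    ss_tot = sum((m - mean) ** 2 for m in measured)
    return 1.0 - ss_res / ss_tot
```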

  16. Assimilation of virtual legs and perception of floor texture by complete paraplegic patients receiving artificial tactile feedback.

    PubMed

    Shokur, Solaiman; Gallo, Simone; Moioli, Renan C; Donati, Ana Rita C; Morya, Edgard; Bleuler, Hannes; Nicolelis, Miguel A L

    2016-09-19

    Spinal cord injuries disrupt bidirectional communication between the patient's brain and body. Here, we demonstrate a new approach for reproducing lower limb somatosensory feedback in paraplegics by remapping missing leg/foot tactile sensations onto the skin of patients' forearms. A portable haptic display was tested in eight patients in a setup where the lower limbs were simulated using immersive virtual reality (VR). For six out of eight patients, the haptic display induced the realistic illusion of walking on three different types of floor surfaces: beach sand, a paved street, or grass. Additionally, patients experienced the movements of the virtual legs during the swing phase or the sensation of the foot rolling on the floor while walking. Relying solely on this tactile feedback, patients reported the position of the avatar leg during virtual walking. Crossmodal interference between vision of the virtual legs and tactile feedback revealed that patients assimilated the virtual lower limbs as if they were their own legs. We propose that the addition of tactile feedback to neuroprosthetic devices is essential to restore a full lower limb perceptual experience in spinal cord injury (SCI) patients, and will ultimately lead to a higher rate of prosthetic acceptance/use and a better level of motor proficiency.

  17. Assimilation of virtual legs and perception of floor texture by complete paraplegic patients receiving artificial tactile feedback

    PubMed Central

    Shokur, Solaiman; Gallo, Simone; Moioli, Renan C.; Donati, Ana Rita C.; Morya, Edgard; Bleuler, Hannes; Nicolelis, Miguel A.L.

    2016-01-01

    Spinal cord injuries disrupt bidirectional communication between the patient’s brain and body. Here, we demonstrate a new approach for reproducing lower limb somatosensory feedback in paraplegics by remapping missing leg/foot tactile sensations onto the skin of patients’ forearms. A portable haptic display was tested in eight patients in a setup where the lower limbs were simulated using immersive virtual reality (VR). For six out of eight patients, the haptic display induced the realistic illusion of walking on three different types of floor surfaces: beach sand, a paved street, or grass. Additionally, patients experienced the movements of the virtual legs during the swing phase or the sensation of the foot rolling on the floor while walking. Relying solely on this tactile feedback, patients reported the position of the avatar leg during virtual walking. Crossmodal interference between vision of the virtual legs and tactile feedback revealed that patients assimilated the virtual lower limbs as if they were their own legs. We propose that the addition of tactile feedback to neuroprosthetic devices is essential to restore a full lower limb perceptual experience in spinal cord injury (SCI) patients, and will ultimately lead to a higher rate of prosthetic acceptance/use and a better level of motor proficiency. PMID:27640345

  18. Emerging Role of Three-Dimensional Printing in Simulation in Otolaryngology.

    PubMed

    VanKoevering, Kyle K; Malloy, Kelly Michele

    2017-10-01

    Simulation is rapidly expanding across medicine as a valuable component of trainee education. For procedural simulation, development of low-cost simulators that allow a realistic, haptic experience for learners to practice maneuvers while appreciating anatomy has become highly valuable. Otolaryngology has seen significant advancements in development of improved, specialty-specific simulators with the expansion of three-dimensional (3D) printing. This article highlights the fundamental components of 3D printing and the multitude of subspecialty simulators that have been developed with the assistance of 3D printing. It briefly discusses important considerations such as cost, fidelity, and validation where available in the literature. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. NASA's Hybrid Reality Lab: One Giant Leap for Full Dive

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2017-01-01

    This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; the integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.

  20. The Effect of Realistic Appearance of Virtual Characters in Immersive Environments - Does the Character's Personality Play a Role?

    PubMed

    Zibrek, Katja; Kokkinara, Elena; Mcdonnell, Rachel

    2018-04-01

    Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment on over one thousand members of the public in order to gather information on how virtual characters are perceived in interactive virtual reality games. We were particularly interested in whether different render styles (realistic, cartoon, etc.) would directly influence appeal, or if a character's personality was the most important indicator of appeal. We used a number of perceptual metrics such as subjective ratings, proximity, and attribution bias in order to test our hypothesis. Our main result shows that affinity towards virtual characters is a complex interaction between the character's appearance and personality, and that realism is in fact a positive choice for virtual characters in virtual reality.

  1. Design of a haptic device with grasp and push-pull force feedback for a master-slave surgical robot.

    PubMed

    Hu, Zhenkai; Yoon, Chae-Hyun; Park, Samuel Byeongjun; Jo, Yung-Ho

    2016-07-01

    We propose a portable haptic device providing grasp (kinesthetic) and push-pull (cutaneous) sensations for optical-motion-capture master interfaces. Although optical-motion-capture master interfaces for surgical robot systems can overcome the stiffness, friction, and coupling problems of mechanical master interfaces, it is difficult to add haptic feedback to an optical-motion-capture master interface without constraining the free motion of the operator's hands. We therefore utilized a Bowden cable-driven mechanism to provide the grasp and push-pull sensations while retaining the free hand motion of the optical-motion-capture master interface. To evaluate the haptic device, we constructed a 2-DOF force sensing/force feedback system and compared the sensed force with the force reproduced by the haptic device. Finally, a needle insertion test was performed to evaluate the performance of the haptic interface in the master-slave system. The results demonstrate that both the grasp force feedback and the push-pull force feedback provided by the haptic interface closely matched the forces sensed by the slave robot. We successfully applied our haptic interface in the optical-motion-capture master-slave system, and the needle insertion test showed that our haptic feedback provides greater safety than visual observation alone. In sum, we developed a haptic device that produces both kinesthetic grasp force feedback and cutaneous push-pull force feedback. Our future research will include further objective performance evaluations of the optical-motion-capture master-slave robot system with our haptic interface in surgical scenarios.

  2. 3D-Printed Simulation Device for Orbital Surgery.

    PubMed

    Lichtenstein, Juergen Thomas; Zeller, Alexander Nicolai; Lemound, Juliana; Lichtenstein, Thorsten Enno; Rana, Majeed; Gellrich, Nils-Claudius; Wagner, Maximilian Eberhard

    Orbital surgery is a challenging procedure because of its complex anatomy, and training could especially benefit from dedicated study models. The currently available devices lack sufficient anatomical representation and realistic soft tissue properties. Hence, we developed a 3D-printed simulation device for orbital surgery with haptically correct simulation of all relevant anatomical structures. Based on computed tomography scans collected from patients treated in a tertiary referral center, the hard and soft tissue were segmented and virtually processed to generate a 3D model of the orbit. The hard tissue was then physically realized by 3D printing. The soft tissue was manufactured as a composite silicone model of the nucleus and the surrounding tissue over a negative mold, also generated by 3D printing. The final model was evaluated by a group of 5 trainees in oral and maxillofacial surgery (group 1) and a group of 5 consultants (group 2). All participants were asked to reconstruct an isolated orbital floor defect with a titanium implant; a stereotactic navigation system was available to all participants. Their experience was evaluated for haptic realism, correct representation of the surgical approach, general handling of the model, insertion of the implant into the orbit, placement and fixation of the implant, and usability of navigated control. The items were rated via nonparametric statistics (1 [poor] to 5 [good]). Group 1 gave an average mark of 4.0 (±0.9) versus 4.6 (±0.6) for group 2. The haptics were rated 3.6 (±1.1) by group 1 and 4.2 (±0.8) by group 2; the surgical approach was graded 3.7 (±1.2) and 4.0 (±1.0); handling of the models 3.5 (±1.1) and 4.0 (±0.7); insertion of the implants 3.7 (±0.8) and 4.2 (±0.8); fixation of the implants 3.6 (±0.9) and 4.2 (±0.45); and surgical navigation 3.8 (±0.8) and 4.6 (±0.56). In this project, all relevant hard and soft tissue characteristics of orbital anatomy could be realized. Moreover, it was possible to demonstrate that the entire workflow of an orbital procedure may be simulated. Hence, using this model, training expenses may be reduced and patient safety could be enhanced. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  3. The Efficacy of Surface Haptics and Force Feedback in Education

    ERIC Educational Resources Information Center

    Gorlewicz, Jenna Lynn

    2013-01-01

    This dissertation bridges the fields of haptics, engineering, and education to realize some of the potential benefits haptic devices may have in Science, Technology, Engineering, and Math (STEM) education. Specifically, this dissertation demonstrates the development, implementation, and assessment of two haptic devices in engineering and math…

  4. Incorporating Haptic Feedback in Simulation for Learning Physics

    ERIC Educational Resources Information Center

    Han, Insook; Black, John B.

    2011-01-01

    The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…

  5. Haptic Distal Spatial Perception Mediated by Strings: Haptic "Looming"

    ERIC Educational Resources Information Center

    Cabe, Patrick A.

    2011-01-01

    Five experiments tested a haptic analog of optical looming, demonstrating string-mediated haptic distal spatial perception. Horizontally collinear hooks supported a weighted string held taut by a blindfolded participant's finger midway between the hooks. At the finger, the angle between string segments increased as the finger approached…

  6. Haptic Classification of Common Objects: Knowledge-Driven Exploration.

    ERIC Educational Resources Information Center

    Lederman, Susan J.; Klatzky, Roberta L.

    1990-01-01

    Theoretical and empirical issues relating to haptic exploration and the representation of common objects during haptic classification were investigated in 3 experiments involving a total of 112 college students. Results are discussed in terms of a computational model of human haptic object classification with implications for dextrous robot…

  7. Real-time simulation of biological soft tissues: a PGD approach.

    PubMed

    Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F

    2013-05-01

    We introduce a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decomposition (POD). PGD techniques can be considered a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase, in which results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
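    The offline/online split can be sketched as follows, assuming the offline phase has produced a separated representation u(x, p) ≈ Σᵢ Fᵢ(x)·Gᵢ(p) of the displacement field over space x and a load parameter p. The online phase then reduces each haptic-rate query to a short sum of products; the modes below are invented for illustration, not output of an actual PGD solver.

```python
def pgd_online_eval(F_modes, G_modes, x_idx, p_idx):
    """Online PGD evaluation: displacement at node x_idx for parameter p_idx
    is a sum of products of precomputed 1-D modes, so each query costs
    O(number of modes) instead of a full nonlinear finite element solve."""
    return sum(F[x_idx] * G[p_idx] for F, G in zip(F_modes, G_modes))

# Hypothetical offline result: two modes over 4 spatial nodes and 3 load levels.
F_modes = [[0.0, 0.5, 1.0, 0.5], [0.0, 0.1, -0.2, 0.1]]
G_modes = [[1.0, 2.0, 3.0], [0.5, 1.0, 1.5]]
```

Because the online cost is a handful of multiply-adds per query, this style of meta-model is what makes kilohertz force updates feasible on commodity hardware.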

  8. Exploring Relationships between Students' Interaction and Learning with a Haptic Virtual Biomolecular Model

    ERIC Educational Resources Information Center

    Schonborn, Konrad J.; Bivall, Petter; Tibell, Lena A. E.

    2011-01-01

    This study explores tertiary students' interaction with a haptic virtual model representing the specific binding of two biomolecules, a core concept in molecular life science education. Twenty students assigned to a "haptics" (experimental) or "no-haptics" (control) condition performed a "docking" task where users sought the most favourable…

  9. Differential effects of non-informative vision and visual interference on haptic spatial processing

    PubMed Central

    van Rheede, Joram J.; Postma, Albert; Kappers, Astrid M. L.

    2008-01-01

    The primary purpose of this study was to examine the effects of non-informative vision and visual interference upon haptic spatial processing, which supposedly derives from an interaction between an allocentric and egocentric reference frame. To this end, a haptic parallelity task served as baseline to determine the participant-dependent biasing influence of the egocentric reference frame. As expected, large systematic participant-dependent deviations from veridicality were observed. In the second experiment we probed the effect of non-informative vision on the egocentric bias. Moreover, orienting mechanisms (gazing directions) were studied with respect to the presentation of haptic information in a specific hemispace. Non-informative vision proved to have a beneficial effect on haptic spatial processing. No effect of gazing direction or hemispace was observed. In the third experiment we investigated the effect of simultaneously presented interfering visual information on the haptic bias. Interfering visual information parametrically influenced haptic performance. The interplay of reference frames that subserves haptic spatial processing was found to be related to both the effects of non-informative vision and visual interference. These results suggest that spatial representations are influenced by direct cross-modal interactions; inter-participant differences in the haptic modality resulted in differential effects of the visual modality. PMID:18553074

  10. Size-Sensitive Perceptual Representations Underlie Visual and Haptic Object Recognition

    PubMed Central

    Craddock, Matt; Lawson, Rebecca

    2009-01-01

    A variety of similarities between visual and haptic object recognition suggests that the two modalities may share common representations. However, it is unclear whether such common representations preserve low-level perceptual features or whether transfer between vision and haptics is mediated by high-level, abstract representations. Two experiments used a sequential shape-matching task to examine the effects of size changes on unimodal and crossmodal visual and haptic object recognition. Participants felt or saw 3D plastic models of familiar objects. The two objects presented on a trial were either the same size or different sizes and were the same shape or different but similar shapes. Participants were told to ignore size changes and to match on shape alone. In Experiment 1, size changes on same-shape trials impaired performance similarly for both visual-to-visual and haptic-to-haptic shape matching. In Experiment 2, size changes impaired performance on both visual-to-haptic and haptic-to-visual shape matching and there was no interaction between the cost of size changes and direction of transfer. Together the unimodal and crossmodal matching results suggest that the same, size-specific perceptual representations underlie both visual and haptic object recognition, and indicate that crossmodal memory for objects must be at least partly based on common perceptual representations. PMID:19956685

  11. Learning of Temporal and Spatial Movement Aspects: A Comparison of Four Types of Haptic Control and Concurrent Visual Feedback.

    PubMed

    Rauter, Georg; Sigrist, Roland; Riener, Robert; Wolf, Peter

    2015-01-01

    In the literature, the effectiveness of haptics for motor learning is controversially discussed. Haptics is believed to be effective for motor learning in general; however, different types of haptic control enhance different movement aspects. Thus, depending on the movement aspects of interest, one type of haptic control may be effective whereas another is not. Therefore, in the current work, we investigated whether and how different types of haptic controllers affect learning of spatial and temporal movement aspects. In particular, haptic controllers that enforce active participation of the participants were expected to improve spatial aspects, and only haptic controllers that provide feedback about the task's velocity profile were expected to improve temporal aspects. In a study on learning a complex trunk-arm rowing task, the effect of training with four different types of haptic control was investigated: position control, path control, adaptive path control, and reactive path control. A fifth group (control) trained with concurrent visual augmented feedback. As hypothesized, the position controller was most effective for learning temporal movement aspects, while the path controller was most effective in teaching spatial movement aspects of the rowing task. Visual feedback was also effective for learning both temporal and spatial movement aspects.
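    The distinction between position control and path control can be sketched with a simple proportional force law (a hypothetical illustration, not the controllers used in the study): a position controller attracts the hand toward the time-indexed reference point, enforcing both timing and shape, whereas a path controller attracts it only toward the nearest point on the reference path, leaving timing free.

```python
def position_control_force(x, ref_traj, t, k=10.0):
    """Pull toward the time-indexed reference point: enforces timing AND shape."""
    rx, ry = ref_traj[t]
    return (k * (rx - x[0]), k * (ry - x[1]))

def path_control_force(x, ref_traj, k=10.0):
    """Pull toward the nearest point on the path: enforces shape only."""
    nearest = min(ref_traj, key=lambda p: (p[0] - x[0]) ** 2 + (p[1] - x[1]) ** 2)
    return (k * (nearest[0] - x[0]), k * (nearest[1] - x[1]))

# Hypothetical straight reference path; the hand lags behind the reference.
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
hand = (1.0, 0.5)
```

For the lagging hand above, the position controller produces a large forward pull (it punishes being late), while the path controller only pushes the hand back onto the line, which matches the finding that position control best taught temporal aspects and path control spatial ones.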

  12. Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.

    PubMed

    Pinzon, David; Byrns, Simon; Zheng, Bin

    2016-08-01

    Background: The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from being attenuated in laparoscopy to being completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the current research questions that remain unanswered. Search Strategy: An in-depth review of the literature was performed using PubMed. Results: A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified: advancements in one of the four components of haptic systems, evaluation of the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions: Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back, and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can increase performance and safety, and improve training efficiency. © The Author(s) 2016.

  13. KinoHaptics: An Automated, Wearable, Haptic Assisted, Physio-therapeutic System for Post-surgery Rehabilitation and Self-care.

    PubMed

    Rajanna, Vijay; Vo, Patrick; Barth, Jerry; Mjelde, Matthew; Grey, Trevor; Oduola, Cassandra; Hammond, Tracy

    2016-03-01

    A carefully planned, structured, and supervised physiotherapy program following surgery is crucial for successful recovery from physical injuries. Nearly 50% of surgeries fail due to unsupervised and erroneous physiotherapy, and employing a physiotherapist for an extended period is expensive and sometimes inaccessible. Researchers have tried to leverage advancements in wearable sensors and motion tracking by building affordable, automated physio-therapeutic systems that direct a physiotherapy session by providing audio-visual feedback on the patient's performance. Many aspects of an automated physiotherapy program are yet to be addressed by existing systems: a wide classification of patients' physiological conditions to be diagnosed, multiple demographics of patients (blind, deaf, etc.), and the need to persuade patients to adopt the system for an extended period for self-care. In our research, we have tried to address these aspects by building a health behavior change support system called KinoHaptics for post-surgery rehabilitation. KinoHaptics is an automated, wearable, haptic-assisted, physio-therapeutic system that can be used by a wide variety of demographics and for various physiological conditions of the patients. The system provides rich and accurate vibro-haptic feedback that can be felt by the user irrespective of physiological limitations, and is built to ensure that no injuries are induced during the rehabilitation period. The persuasive nature of the system allows for personal goal-setting, progress tracking, and, most importantly, lifestyle compatibility. The system was evaluated under laboratory conditions involving 14 users. Results show that KinoHaptics is highly convenient to use, and that the vibro-haptic feedback is intuitive, accurate, and was shown to prevent accidental injuries. Results also show that KinoHaptics is persuasive in nature, as it supports behavior change and habit building. The successful acceptance of KinoHaptics demonstrates the need for, and future scope of, automated physio-therapeutic systems for self-care and behavior change. It also suggests that such systems, incorporating vibro-haptic feedback, encourage strong adherence to the physiotherapy program and can have a profound impact on the physiotherapy experience, resulting in a higher acceptance rate.

  14. G2H--graphics-to-haptic virtual environment development tool for PC's.

    PubMed

    Acosta, E; Temkin, B; Krummel, T M; Heinrichs, W L

    2000-01-01

    Existing surgical virtual environments for training and preparation have shown great improvement; however, these improvements have been mostly visual. The incorporation of haptics into virtual reality-based surgical simulations would greatly enhance the sense of realism. To aid in the development of haptic surgical virtual environments, we have created a graphics-to-haptic (G2H) virtual environment development tool. G2H transforms graphical virtual environments (created or imported) into haptic virtual environments without programming. The G2H capability has been demonstrated using the complex 3D pelvic model of Lucy 2.0, the Stanford Visible Female; the pelvis was made haptic using G2H without any further programming effort.

  15. Physics Based Modeling and Rendering of Vegetation in the Thermal Infrared

    NASA Technical Reports Server (NTRS)

    Smith, J. A.; Ballard, J. R., Jr.

    1999-01-01

    We outline a procedure for rendering physically based thermal infrared images of simple vegetation scenes. Our approach incorporates the biophysical processes that affect the temperature distribution of the elements within a scene. Computer graphics plays a key role in two respects: first, in computing the distribution of shaded and sunlit facets in the scene, and second, in the final image rendering once the temperatures of all scene elements have been computed. We illustrate our approach for a simple corn scene in which the three-dimensional geometry is constructed from measured morphological attributes of the row crop, using statistical methods to build a scene representation in agreement with the measured characteristics. The rendered images exhibit realistic directional behavior as a function of view and sun angle, and the root-mean-square error between measured and predicted brightness temperatures for the scene was 2.1 deg C.

  16. RenderMan design principles

    NASA Technical Reports Server (NTRS)

    Apodaca, Tony; Porter, Tom

    1989-01-01

    The two worlds of interactive graphics and realistic graphics have remained separate. Fast graphics hardware runs simple algorithms and generates simple-looking images; photorealistic image synthesis software runs slowly on large, expensive computers. The time has come for these two branches of computer graphics to merge. The speed and expense of graphics hardware is no longer the barrier to the wide acceptance of photorealism, and there is every reason to believe that high-quality image synthesis will become a standard capability of every graphics machine, from superworkstation to personal computer. The significant barrier has been the lack of a common language, an agreed-upon set of terms and conditions, for 3-D modeling systems to talk to 3-D rendering systems when computing an accurate rendition of a scene. Pixar has introduced RenderMan to serve as that common language. RenderMan, specifically the extensibility it offers in shading calculations, is discussed.

  17. A new visual feedback-based magnetorheological haptic master for robot-assisted minimally invasive surgery

    NASA Astrophysics Data System (ADS)

    Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok

    2015-06-01

    In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface that provides the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integral-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish the tumor from the organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The effects of the image processing and haptic feedback on surgical performance are then assessed: tumor-cutting experiments are conducted under four different operating conditions (haptic feedback on or off, image processing on or off), and the experimental results show that the performance index, which is a function of pixels, differs across the four conditions.
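    As an aside on the hue-saturation-value processing mentioned above, here is a minimal sketch of why HSV thresholding is convenient for color-based segmentation (the function, thresholds, and pixel values are our own illustration, not from the paper): hue captures a pixel's color family largely independently of brightness.

```python
import colorsys

def in_hue_band(rgb, hue_lo, hue_hi, min_sat=0.3):
    """Return True if an RGB pixel (floats in 0..1) lies in a hue band.

    Hue is largely invariant to illumination, so bright and dark pixels
    of the same color pass the same test; low-saturation (grayish)
    pixels are rejected regardless of hue.
    """
    h, s, _v = colorsys.rgb_to_hsv(*rgb)
    return hue_lo <= h <= hue_hi and s >= min_sat

# Bright and dark red share the same hue; gray fails the saturation test.
print(in_hue_band((1.0, 0.1, 0.1), 0.0, 0.05))    # True
print(in_hue_band((0.5, 0.05, 0.05), 0.0, 0.05))  # True
print(in_hue_band((0.5, 0.5, 0.5), 0.0, 0.05))    # False
```

    The same test in raw RGB would need brightness-dependent thresholds on all three channels, which is why HSV is often described as the more intuitive space for this task.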

  18. Aging and solid shape recognition: Vision and haptics.

    PubMed

    Norman, J Farley; Cheeseman, Jacob R; Adkins, Olivia C; Cox, Andrea G; Rogers, Connor E; Dowell, Catherine J; Baxter, Michael W; Norman, Hideko F; Reyes, Cecia M

    2015-10-01

    The ability of 114 younger and older adults to recognize naturally-shaped objects was evaluated in three experiments. The participants viewed or haptically explored six randomly-chosen bell peppers (Capsicum annuum) in a study session and were later required to judge whether each of twelve bell peppers was "old" (previously presented during the study session) or "new" (not presented during the study session). When recognition memory was tested immediately after study, the younger adults' (Experiment 1) performance for vision and haptics was identical when the individual study objects were presented once. Vision became superior to haptics, however, when the individual study objects were presented multiple times. When 10- and 20-min delays (Experiment 2) were inserted between study and test sessions, no significant differences occurred between vision and haptics: recognition performance in the two modalities was comparable. When the recognition performance of older adults was evaluated (Experiment 3), a negative effect of age was found for visual shape recognition (younger adults' overall recognition performance was 60% higher). There was no age effect, however, for haptic shape recognition. The results of the present experiments indicate that the visual recognition of natural object shape differs from haptic recognition in multiple ways: visual shape recognition can be superior to haptic recognition and is affected by aging, whereas haptic shape recognition is less accurate but unaffected by aging. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Haptic Paddle Enhancements and a Formal Assessment of Student Learning in System Dynamics

    ERIC Educational Resources Information Center

    Gorlewicz, Jenna L.; Kratchman, Louis B.; Webster, Robert J., III

    2014-01-01

    The haptic paddle is a force-feedback joystick used at several universities in teaching System Dynamics, a core mechanical engineering undergraduate course where students learn to model dynamic systems in several domains. A second goal of the haptic paddle is to increase the accessibility of robotics and haptics by providing a low-cost device for…

  20. Identification of walked-upon materials in auditory, kinesthetic, haptic, and audio-haptic conditions.

    PubMed

    Giordano, Bruno L; Visell, Yon; Yao, Hsin-Yun; Hayward, Vincent; Cooperstock, Jeremy R; McAdams, Stephen

    2012-05-01

    Locomotion generates multisensory information about walked-upon objects. How perceptual systems use such information to get to know the environment remains unexplored. The ability to identify solid (e.g., marble) and aggregate (e.g., gravel) walked-upon materials was investigated in auditory, haptic, and audio-haptic conditions, and in a kinesthetic condition in which tactile information was perturbed with vibromechanical noise. Overall, identification performance was better than chance in all experimental conditions for both solids and aggregates, with aggregates the better identified of the two. Despite large mechanical differences between the responses of solids and aggregates to locomotion, discrimination for both material categories was at its worst in the auditory and kinesthetic conditions and at its best in the haptic and audio-haptic conditions. An analysis of the dominance of sensory information in the audio-haptic context supported a focus on the most accurate modality, haptics, but only for the identification of solid materials. When identifying aggregates, response biases appeared to produce a focus on the least accurate modality, kinesthesia. When walking on loose materials such as gravel, individuals thus do not perceive surfaces by focusing on the most accurate modality, but by focusing on the modality that would most promptly signal postural instabilities.

  1. Review of Designs for Haptic Data Visualization.

    PubMed

    Paneels, Sabrina; Roberts, Jonathan C

    2010-01-01

    There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but the haptic modality can also be used alongside other modalities, to increase the number of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done on haptic data presentation; e.g., researchers have developed force-feedback line graphs, bar charts, and other forms of haptic representation. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.

  2. Role of combined tactile and kinesthetic feedback in minimally invasive surgery.

    PubMed

    Lim, Soo-Chul; Lee, Hyung-Kew; Park, Joonah

    2014-10-18

    Haptic feedback is of critical importance in surgical tasks. However, conventional surgical robots do not provide haptic feedback to surgeons during surgery. Thus, in this study, a combined tactile and kinesthetic feedback system was developed to provide haptic feedback to surgeons during robotic surgery. To assess feasibility, the effects of the two types of haptic feedback (kinesthetic and tactile) on object-pulling force were examined empirically with a telesurgery robotic system at two desired pulling forces (1 N and 2 N). Participants answered a set of questionnaires after the experiments. The experimental results reveal reductions in force error (39.1% and 40.9%) when haptic feedback was provided during the 1 N and 2 N pulling tasks, respectively. Moreover, survey analyses show the effectiveness of the haptic feedback during teleoperation. The combined tactile and kinesthetic feedback of the master device in robotic surgery improves the surgeon's ability to control the interaction force applied to the tissue. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Study on development of active-passive rehabilitation system for upper limbs: Hybrid-PLEMO

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Jin, Y.; Fukushima, K.; Akai, H.; Furusho, J.

    2009-02-01

    In recent years, many researchers have studied the potential of robotics technology to assist and quantify motor function in neuro-rehabilitation. Several kinds of haptic devices have been developed and their efficacy evaluated in clinical tests, for example, upper-limb training for patients with spasticity after stroke. Active (motor-driven) haptic devices can render a wide variety of haptic sensations, but they require costly safety systems. Passive (brake-based) haptic devices, on the other hand, are inherently safe, but are strongly limited in the variety of haptic sensations they can render. There is not yet sufficient evidence to clarify how passive and active haptics affect the rehabilitation of motor skills. To address these problems, we developed an active-passive-switchable rehabilitation system based on an ER clutch/brake device, named "Hybrid-PLEMO". This paper describes the basic structure and haptic control methods of the Hybrid-PLEMO.

  4. The benefits of virtual reality simulator training for laparoscopic surgery.

    PubMed

    Hart, Roger; Karthigasu, Krishnan

    2007-08-01

    Virtual reality is a computer-generated system that provides a representation of an environment. This review analyses the literature regarding the benefits of training with virtual reality equipment and describes the currently available equipment. Current virtual reality systems do not realistically reproduce the live operating environment, because they lack tactile sensation and do not represent a complete operation. The literature suggests that virtual reality training is a valuable learning tool for gynaecologists in training, particularly those in the early stages of their careers. Furthermore, it may be of benefit for the ongoing audit of surgical skills and for the early identification of a surgeon's deficiencies before operative incidents occur. It is only a matter of time before realistic virtual reality models of most complete gynaecological operations become available, with haptics improving alongside computer technology. In the modern climate of litigation it is inevitable that virtual reality training will become an essential part of clinical training, as evidence for its effectiveness as a training tool exists, and in many countries training by operating on live animals is not possible.

  5. Grounded Learning Experience: Helping Students Learn Physics through Visuo-Haptic Priming and Instruction

    NASA Astrophysics Data System (ADS)

    Huang, Shih-Chieh Douglas

    In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation (visual modality and gestures) and visuo-haptic simulation (visual modality, gestures, and somatosensory information). A pilot study involving N = 23 college students examined how using different types of visuo-haptic representation in instruction affected people's mental model construction for physics systems. Participants' abilities to construct mental models were operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Findings from this pilot study revealed that, while both simulations significantly improved participants' mental model construction for physics systems, visuo-haptic simulation was significantly better than visuo-gestural simulation. In addition, clinical interviews suggested that participants' mental model construction for physics systems benefited from receiving visuo-haptic simulation in a tutorial prior to the instruction stage. A dissertation study involving N = 96 college students examined how types of visuo-haptic representation in different applications support participants' mental model construction for physics systems. Participants' abilities to construct mental models were again operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Participants' physics misconceptions were also measured before and after the grounded learning experience. 
Findings from this dissertation study not only revealed that visuo-haptic simulation was significantly more effective in promoting mental model construction and remedying participants' physics misconceptions than visuo-gestural simulation, they also revealed that visuo-haptic simulation was more effective during the priming stage than during the instruction stage. Interestingly, the effects of visuo-haptic simulation in priming and visuo-haptic simulation in instruction on participants' pretest-to-posttest gain scores for a basic physics system appeared additive. These results suggested that visuo-haptic simulation is effective in physics learning, especially when it is used during the priming stage.

  6. Visual-haptic integration with pliers and tongs: signal “weights” take account of changes in haptic sensitivity caused by different tools

    PubMed Central

    Takahashi, Chie; Watt, Simon J.

    2014-01-01

    When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the “weight” given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different “gains” between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modeled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. 
These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimizing the design of visual-haptic devices. PMID:24592245
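    The statistically optimal combination described above is usually formalized as maximum-likelihood cue integration: each cue is weighted by its reliability, the inverse of its variance. A minimal sketch of that standard model (the function names are ours, chosen for illustration):

```python
def cue_weights(var_visual, var_haptic):
    """Optimal weights under maximum-likelihood cue integration.

    Reliability is the inverse of a cue's variance; each cue's weight is
    its reliability divided by the sum of the reliabilities.
    """
    r_v, r_h = 1.0 / var_visual, 1.0 / var_haptic
    w_v = r_v / (r_v + r_h)
    return w_v, 1.0 - w_v

def fused_estimate(est_visual, est_haptic, var_visual, var_haptic):
    """Combine two size estimates using the optimal weights."""
    w_v, w_h = cue_weights(var_visual, var_haptic)
    return w_v * est_visual + w_h * est_haptic

def fused_variance(var_visual, var_haptic):
    """Variance of the combined estimate: never worse than either cue."""
    return (var_visual * var_haptic) / (var_visual + var_haptic)

# A tool gain that degrades haptic reliability (larger variance) shifts
# weight toward vision, which is the pattern the study observed.
print(cue_weights(1.0, 1.0))  # equal variances -> equal weights
print(cue_weights(1.0, 3.0))  # noisier haptics -> vision weighted ~0.75
```

    The study's finding is that the observed weights tracked these predictions as tool gain changed haptic variance.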

  7. The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review.

    PubMed

    van der Meijden, O A J; Schijven, M P

    2009-06-01

    Virtual reality (VR) as a surgical training tool has become a state-of-the-art technique for training and teaching skills for minimally invasive surgery (MIS). Although intuitively appealing, the true benefits of haptic (VR training) platforms are unknown. Many questions about haptic feedback in the different areas of surgical skills training need to be answered before costly haptic feedback is added to VR simulation for MIS training. This study was designed to review the current status and value of haptic feedback in conventional and robot-assisted MIS and in training using virtual reality simulation. A systematic review of the literature was undertaken using PubMed and MEDLINE. The following search terms were used: Haptic feedback OR Haptics OR Force feedback AND/OR Minimal Invasive Surgery AND/OR Minimal Access Surgery AND/OR Robotics AND/OR Robotic Surgery AND/OR Endoscopic Surgery AND/OR Virtual Reality AND/OR Simulation OR Surgical Training/Education. The results were assessed according to level of evidence as reflected by the Oxford Centre of Evidence-based Medicine Levels of Evidence. In the current literature, no firm consensus exists on the importance of haptic feedback in performing minimally invasive surgery. Although the majority of the results show positive assessment of the benefits of force feedback, results are ambivalent and not unanimous on the subject. Benefits are least disputed for robotic surgery, because currently used surgical robots provide no haptic feedback, and its addition is believed to reduce surgical errors, especially in knot tying. Little research has been performed in the area of robot-assisted endoscopic surgical training, but results seem promising. Concerning VR training, results indicate that haptic feedback is important during the early phase of psychomotor skill acquisition.

  8. Different haptic tools reduce trunk velocity in the frontal plane during walking, but haptic anchors have advantages over lightly touching a railing.

    PubMed

    Hedayat, Isabel; Moraes, Renato; Lanovaz, Joel L; Oates, Alison R

    2017-06-01

    There are different ways to add haptic input during walking, which may affect walking balance. This study compared the use of two different haptic tools (a rigid railing and haptic anchors) and investigated whether any effects on walking were the result of the added sensory input and/or the posture generated when using those tools. Data from 28 young healthy adults were collected using the Mobility Lab inertial sensor system (APDM, Oregon, USA). Participants walked with and without both haptic tools and while pretending to use both haptic tools (placebo trials), with eyes open and eyes closed. Using or pretending to use either tool decreased normalized stride velocity (p < .001-.008) and peak medial-lateral (ML) trunk velocity (p < .001-.001). Normalized stride velocity was slower when actually using the railing compared to placebo railing trials (p = .006). Using the anchors resulted in lower peak ML trunk velocity than using the railing (p = .002). The anchors produced lower peak ML trunk velocity than the placebo anchors (p < .001), but there was no difference between the railing and the placebo railing (p > .999). These findings highlight a difference between the types of tool used to add haptic input: the changes in balance control strategy seen with the railing appear to stem from arm placement alone, whereas with the haptic anchors it is the posture combined with the added sensory input that affects balance control. These findings provide a strong framework for additional research on the effects of haptic input on walking in populations known to have decreased walking balance.

  9. Haptic device development based on electro static force of cellulose electro active paper

    NASA Astrophysics Data System (ADS)

    Yun, Gyu-young; Kim, Sang-Youn; Jang, Sang-Dong; Kim, Dong-Gu; Kim, Jaehwan

    2011-04-01

    Haptic devices are well suited to demanding virtual reality applications such as medical equipment, mobile devices, and online marketing, and many haptic device concepts have been proposed to meet industrial demand. Cellulose has received much attention as an emerging smart material, termed electro-active paper (EAPap). EAPap is attractive for mobile haptic devices because of its low actuation power, suitability for thin devices, and transparency. In this paper, we propose a new haptic actuator concept based on cellulose EAPap and evaluate its performance under various actuation conditions. The cellulose electrostatic-force actuator shows large output displacement and fast response, making it suitable for mobile haptic devices.

  10. Detection thresholds for small haptic effects

    NASA Astrophysics Data System (ADS)

    Dosher, Jesse A.; Hannaford, Blake

    2002-02-01

    We are interested in finding out whether haptic interfaces will be useful in portable and hand-held devices. Such systems will have severe constraints on force output. Our first step is to investigate the lower limits at which haptic effects can be perceived. In this paper we report on experiments studying the effects of varying the amplitude, size, shape, and pulse duration of a haptic feature. Using a specific haptic device, we measure the smallest detectable haptic effects through active exploration of saw-tooth-shaped icons 3, 4, and 5 mm in size, a sine-shaped icon 5 mm wide, and static pulses 50, 100, and 150 ms in width. Smooth-shaped icons resulted in a detection threshold of approximately 55 mN, almost twice that of the saw-tooth-shaped icons, which had a threshold of 31 mN.

  11. Investigating Students' Ideas About Buoyancy and the Influence of Haptic Feedback

    NASA Astrophysics Data System (ADS)

    Minogue, James; Borland, David

    2016-04-01

    While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of everyday experiences, a scientifically sound explanation of buoyancy remains difficult for many to construct. It requires the integration of domain-specific knowledge regarding density, fluid, force, gravity, mass, weight, and buoyancy. Prior studies suggest that novices often focus on only one dimension of the sinking and floating phenomenon. Our HES was designed to promote the integration of the subconcepts of density and buoyant force and stresses the relationship between the object itself and the surrounding fluid. The study employed a randomized pretest-posttest control-group research design and a suite of measures, including an open-ended prompt and objective content questions, to provide insights into the influence of haptic feedback on undergraduate students' thinking about buoyancy. A convenience sample (n = 40) was drawn from a university's population of undergraduate elementary education majors, and two groups were formed: haptic feedback (n = 22) and no haptic feedback (n = 18). Through content analysis, discernible differences were seen in the posttest explanations of sinking and floating across treatment groups. Learners who experienced the haptic feedback made more frequent use of "haptically grounded" terms (e.g., mass, gravity, buoyant force, pushing), leading us to begin to build a local theory of language-mediated haptic cognition.

  12. Development of visuo-haptic transfer for object recognition in typical preschool and school-aged children.

    PubMed

    Purpura, Giulia; Cioni, Giovanni; Tinelli, Francesca

    2018-07-01

    Object recognition is a long and complex adaptive process, and its full maturation requires the combination of many different sensory experiences as well as the cognitive ability to manipulate previous experiences in order to develop new percepts and subsequently learn from the environment. It is well recognized that the transfer of visual and haptic information facilitates object recognition in adults, but less is known about the development of this ability. In this study, we explored the developmental course of object recognition capacity using unimodal visual information, unimodal haptic information, and visuo-haptic information transfer in children from 4 years to 10 years and 11 months of age. Participants were tested with a clinical protocol involving visual exploration of black-and-white photographs of common objects, haptic exploration of real objects, and visuo-haptic transfer of these two types of information. Results show an age-dependent development of object recognition abilities for the visual, haptic, and visuo-haptic modalities, with a significant effect of age on the development of unimodal and crossmodal recognition skills. Moreover, our data suggest that multisensory processes for common object recognition are active at 4 years of age: they facilitate recognition of common objects and, although not fully mature, contribute to adaptive behavior from the first years of life. The study of the typical development of visuo-haptic processes in childhood is a starting point for future studies of object recognition in impaired populations.

  13. Visual and Haptic Shape Processing in the Human Brain: Unisensory Processing, Multisensory Convergence, and Top-Down Influences.

    PubMed

    Lee Masson, Haemy; Bulthé, Jessica; Op de Beeck, Hans P; Wallraven, Christian

    2016-08-01

    Humans are highly adept at multisensory processing of object shape in both vision and touch. Previous studies have mostly focused on where visually perceived object-shape information can be decoded, with haptic shape processing receiving less attention. Here, we investigate visuo-haptic shape processing in the human brain using multivoxel correlation analyses. Importantly, we use tangible, parametrically defined novel objects as stimuli. Two groups of participants first performed either a visual or haptic similarity-judgment task. The resulting perceptual object-shape spaces were highly similar and matched the physical parameter space. In a subsequent fMRI experiment, objects were first compared within the learned modality and then in the other modality in a one-back task. When correlating neural similarity spaces with perceptual spaces, visually perceived shape was decoded well in the occipital lobe along with the ventral pathway, whereas haptically perceived shape information was mainly found in the parietal lobe, including frontal cortex. Interestingly, ventrolateral occipito-temporal cortex decoded shape in both modalities, highlighting this as an area capable of detailed visuo-haptic shape processing. Finally, we found haptic shape representations in early visual cortex (in the absence of visual input), when participants switched from visual to haptic exploration, suggesting top-down involvement of visual imagery on haptic shape processing. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Haptic Exploration in Humans and Machines: Attribute Integration and Machine Recognition/Implementation.

    DTIC Science & Technology

    1988-04-30

    Keywords: haptic, hand, touch, vision, robot, object recognition, categorization. …established that the haptic system has remarkable capabilities for object recognition. We define haptics as purposive touch. The basic tactual system…gathered ratings of the importance of dimensions for categorizing common objects by touch. Texture and hardness ratings strongly co-vary, which is…

  15. Shifty: A Weight-Shifting Dynamic Passive Haptic Proxy to Enhance Object Perception in Virtual Reality.

    PubMed

    Zenner, André; Krüger, Antonio

    2017-04-01

    We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing Shifty, a weight-shifting physical DPHF proxy object. This concept combines actuators known from active haptics with the physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. In two experiments, we then investigate how Shifty can enhance the user's perception of the virtual objects it stands in for by automatically changing its internal weight distribution. In the first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness; here, Shifty significantly increased the user's fun and perceived realism compared to an equivalent passive haptic proxy. In the second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight, and thus the perceived realism, by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual, and auditory feedback during the pick-up interaction help to compensate for the visual-haptic mismatch perceived during the shifting process.

  16. Virtual wall-based haptic-guided teleoperated surgical robotic system for single-port brain tumor removal surgery.

    PubMed

    Seung, Sungmin; Choi, Hongseok; Jang, Jongseong; Kim, Young Soo; Park, Jong-Oh; Park, Sukho; Ko, Seong Young

    2017-01-01

    This article presents haptic-guided teleoperation for a tumor removal surgical robotic system, the so-called SIROMAN system. The system was developed in our previous work to make it possible to access tumor tissue, even tissue seated deep inside the brain, and to remove it with full maneuverability. For a safe and accurate operation that removes only tumor tissue completely while minimizing damage to normal tissue, virtual wall-based haptic guidance combined with medical image-guided control is proposed and developed. The virtual wall is extracted from preoperative medical images, and the robot is controlled to restrict its motion within the virtual wall using haptic feedback. Coordinate transformation between sub-systems, a collision detection algorithm, and haptic-guided teleoperation using a virtual wall are described in the context of SIROMAN. A series of experiments using a simplified virtual wall were performed to evaluate the performance of virtual wall-based haptic-guided teleoperation. With haptic guidance, the accuracy of the robotic manipulator's trajectory is improved by 57% compared to teleoperation without it. Tissue removal performance is also improved by 21% (p < 0.05). The experiments show that virtual wall-based haptic guidance provides safer and more accurate tissue removal for single-port brain surgery.
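    A common way to realize the kind of virtual-wall haptic guidance described above is a penalty-based force that is zero on the allowed side of the wall and spring-like once the tool penetrates it. The sketch below is our own simplification (a single plane with a spring-only response; the paper's walls are extracted from medical images and its controller may also include damping):

```python
def virtual_wall_force(tip, wall_point, wall_normal, stiffness=500.0):
    """Penalty-based haptic virtual wall (plane version).

    `wall_normal` is a unit vector pointing toward the allowed side.
    Returns the force to display on the haptic master: zero while the
    tool tip stays on the allowed side, otherwise a spring force along
    the normal, proportional to penetration depth, pushing the tip out.
    """
    # Signed distance of the tip from the wall plane (positive = allowed).
    d = sum((t - w) * n for t, w, n in zip(tip, wall_point, wall_normal))
    if d >= 0.0:
        return (0.0, 0.0, 0.0)
    return tuple(-stiffness * d * n for n in wall_normal)

# 1 cm of penetration along -z against a wall whose normal is +z:
# a 500 N/m spring yields a 5 N restoring force along +z.
print(virtual_wall_force((0.0, 0.0, -0.01), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```

    In practice the stiffness is limited by the haptic device's stable rendering range; too high a value makes the wall feel crisp but can cause the control loop to oscillate.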

  17. Haptograph Representation of Real-World Haptic Information by Wideband Force Control

    NASA Astrophysics Data System (ADS)

    Katsura, Seiichiro; Irie, Kouhei; Ohishi, Kiyoshi

    Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means; likewise, a video camera and a television make it possible to transmit visual sensation by broadcasting. By contrast, because tactile or haptic information is subject to Newton's law of action and reaction in the real world, no established device acquires, transmits, and reproduces it. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel method of acquiring haptic information, named the "haptograph", which visualizes haptic information in the manner of a photograph. The proposed haptograph is applied to haptic recognition of the contact environment: a linear motor contacts the surface of the environment, and its reaction force is used to construct the haptograph. Robust contact motion and sensor-less sensing of the reaction force are attained by using a disturbance observer. As a result, an encyclopedia of contact environments is obtained. Because temporal and spatial analyses are conducted to represent the haptic information as a haptograph, it can be recognized and evaluated intuitively.
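    The sensor-less force sensing mentioned above typically rests on a disturbance observer: the reaction force is estimated as the low-pass-filtered difference between the commanded motor force and the force explained by the measured motion. A discrete-time sketch of that idea (the first-order filter and all parameter names are our own simplification, not the paper's implementation):

```python
def dob_step(f_hat, current_ref, accel, kt, mass, g, dt):
    """One update of a first-order disturbance observer.

    f_hat:       previous force estimate [N]
    current_ref: commanded motor current [A]
    accel:       measured (or differentiated) acceleration [m/s^2]
    kt:          motor force constant [N/A]
    mass:        moving mass of the linear motor [kg]
    g:           observer low-pass cutoff [rad/s]
    dt:          sample period [s]
    """
    # Raw disturbance: commanded force minus the force explained by motion.
    f_dis = kt * current_ref - mass * accel
    # First-order low-pass filter suppresses differentiation noise.
    alpha = g * dt / (1.0 + g * dt)
    return f_hat + alpha * (f_dis - f_hat)

# Pressing against a stiff environment: the motor commands 2 N while the
# mass barely accelerates, so the estimate converges toward a 2 N reaction
# force without any force sensor in the loop.
f_hat = 0.0
for _ in range(50):
    f_hat = dob_step(f_hat, current_ref=1.0, accel=0.0,
                     kt=2.0, mass=0.1, g=100.0, dt=0.01)
print(round(f_hat, 3))
```

    The cutoff g trades responsiveness for noise rejection; a stiffer, faster contact task needs a higher cutoff, at the price of amplified acceleration noise.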

  18. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    PubMed

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
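
    The color-based hand segmentation step can be illustrated with a simple nearest-color threshold; the reference skin color and tolerance below are hypothetical, and the paper's actual segmentation is more sophisticated:

```python
import numpy as np

def color_segment(image_rgb, ref_color, tol=40.0):
    """Label pixels whose RGB distance to a reference skin color falls
    below `tol` as 'hand' (a toy stand-in for color-based hand
    segmentation). Returns a boolean mask of the same height/width."""
    diff = image_rgb.astype(float) - np.asarray(ref_color, float)
    dist = np.linalg.norm(diff, axis=-1)   # per-pixel color distance
    return dist < tol
```

    In a full pipeline, this mask would be composited over the rendered virtual scene so the real hand occludes the virtual tool correctly.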

  19. Visual and haptic integration in the estimation of softness of deformable objects

    PubMed Central

    Cellini, Cristiano; Kaim, Lukas; Drewing, Knut

    2013-01-01

    Softness perception intrinsically relies on haptic information. However, through everyday experience we learn correspondences between felt softness and the visual effects of the exploratory movements executed to feel softness. Here, we studied how visual and haptic information is integrated to assess the softness of deformable objects. Participants discriminated between the softness of two softer or two harder objects using visual-only, haptic-only, or combined visual-haptic information. We assessed the reliabilities of the softness judgments using the method of constant stimuli. In visuo-haptic trials, discrepancies between the two senses' information allowed us to measure the contribution of each sense to the judgments. Visual information (finger movement and object deformation) was simulated using computer graphics; input for visual trials was taken from previous visuo-haptic trials. Participants were able to infer softness from vision alone, and vision contributed considerably to bisensory judgments (∼35%). The visual contribution was higher than predicted from models of optimal integration (in which the senses are weighted according to their reliabilities), and bisensory judgments were less reliable than optimal integration predicts. We conclude that the visuo-haptic integration of softness information is biased toward vision, rather than being optimal, and might even be guided by a fixed weighting scheme. PMID:25165510
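
    The optimal-integration benchmark used in such studies is the standard maximum-likelihood cue-combination model, sketched here (the symbols follow the usual textbook formulation, not the paper's notation):

```python
def optimal_integration(sigma_v, sigma_h):
    """Maximum-likelihood cue combination: each cue is weighted by its
    relative reliability (inverse variance), and the combined estimate
    is predicted to be at least as reliable as the better single cue."""
    r_v, r_h = 1.0 / sigma_v**2, 1.0 / sigma_h**2
    w_v = r_v / (r_v + r_h)                  # predicted visual weight
    sigma_vh = (1.0 / (r_v + r_h)) ** 0.5    # predicted bisensory SD
    return w_v, sigma_vh
```

    Comparing the empirically measured visual weight and bisensory discrimination threshold against these predictions is how deviations from optimality, such as the visual bias reported above, are detected.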

  20. Non-Colocated Kinesthetic Display Limits Compliance Discrimination in the Absence of Terminal Force Cues.

    PubMed

    Brown, Jeremy D; Shelley, Mackenzie K; Gardner, Duane; Gansallo, Emmanuel A; Gillespie, R Brent

    2016-01-01

    An important goal of haptic display is to make available the action/reaction relationships that define interactions between the body and the physical world. While in physical world interactions reaction cues invariably impinge on the same part of the body involved in action (reaction and action are colocated), a haptic interface is quite capable of rendering feedback to a separate body part than that used for producing exploratory actions (non-colocated action and reaction). This most commonly occurs with the use of vibrotactile display, in which a cutaneous cue has been substituted for a kinesthetic cue (a kind of sensory substitution). In this paper, we investigate whether non-colocated force and displacement cues degrade the perception of compliance. Using a custom non-colocated kinesthetic display in which one hand controls displacement and the other senses force, we ask participants to discriminate between two virtual springs with matched terminal force and adjustable non-linearity. An additional condition includes one hand controlling displacement while the other senses force encoded in a vibrotactile cue. Results show that when the terminal force cue is unavailable, and even when sensory substitution is not involved, non-colocated kinesthetic displays degrade compliance discrimination relative to colocated kinesthetic displays. Compliance discrimination is also degraded with vibrotactile display of force. These findings suggest that non-colocated kinesthetic displays and, likewise, cutaneous sensory substitution displays should be avoided when discrimination of compliance is necessary for task success.

  1. A virtual reality based simulator for learning nasogastric tube placement.

    PubMed

    Choi, Kup-Sze; He, Xuejian; Chiang, Vico Chung-Lim; Deng, Zhaohong

    2015-02-01

    Nasogastric tube (NGT) placement is a common clinical procedure where a plastic tube is inserted into the stomach through the nostril for feeding or drainage. However, the placement is a blind process in which the tube may be mistakenly inserted into other locations, leading to unexpected complications or fatal incidents. The placement techniques are conventionally acquired by practising on unrealistic rubber mannequins or on humans. In this paper, a virtual reality based training simulation system is proposed to facilitate the training of NGT placement. It focuses on the simulation of tube insertion and the rendering of the feedback forces with a haptic device. A hybrid force model is developed to compute the forces analytically or numerically under different conditions, including the situations when the patient is swallowing or when the tube is buckled at the nostril. To ensure real-time interactive simulations, an offline simulation approach is adopted to obtain the relationship between the insertion depth and insertion force using a non-linear finite element method. The offline dataset is then used to generate real-time feedback forces by interpolation. The virtual training process is logged quantitatively with metrics that can be used for assessing objective performance and tracking progress. The system has been evaluated by nursing professionals. They found that the haptic feeling produced by the simulated forces is similar to their experience during real NGT insertion. The proposed system provides a new educational tool to enhance conventional training in NGT placement. Copyright © 2014 Elsevier Ltd. All rights reserved.
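
    An offline-dataset-plus-interpolation scheme of this kind can be sketched as a one-dimensional lookup table; the depth-force values below are invented for illustration and are not the paper's finite element results:

```python
import numpy as np

# Hypothetical offline dataset: insertion depth (mm) vs. force (N),
# standing in for precomputed nonlinear-FEM simulation results.
depth_mm = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
force_n  = np.array([0.0, 0.15, 0.40, 0.55, 0.90, 1.20])

def feedback_force(d):
    """Real-time haptic force by linear interpolation of the offline
    table; cheap enough to run inside a 1 kHz haptic loop."""
    return float(np.interp(d, depth_mm, force_n))
```

    The expensive FEM solve happens once, offline; at runtime only the interpolation is evaluated, which keeps the haptic loop real-time.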

  2. Development of a StandAlone Surgical Haptic Arm.

    PubMed

    Jones, Daniel; Lewis, Andrew; Fischer, Gregory S

    2011-01-01

    When performing telesurgery with current commercially available Minimally Invasive Robotic Surgery (MIRS) systems, a surgeon cannot feel the tool interactions that are inherent in traditional laparoscopy. It is proposed that haptic feedback in the control of MIRS systems could improve the speed, safety and learning curve of robotic surgery. To test this hypothesis, a standalone surgical haptic arm (SASHA) capable of manipulating da Vinci tools has been designed and fabricated with the additional ability of providing information for haptic feedback. This arm was developed as a research platform for developing and evaluating approaches to telesurgery, including various haptic mappings between master and slave and evaluating the effects of latency.

  3. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation of collaborative interaction, so that people in different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where an object's behavior is determined by the combination of those inputs. The issues addressed in this research are: 1) the effects of haptics on collaborative interaction, and 2) the possibilities for collaboration between users in different environments. We conducted user tests of our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies cover two kinds of interaction: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.

  4. Multilateral haptics-based immersive teleoperation for improvised explosive device disposal

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Daly, John

    2013-05-01

    Of great interest to police and military organizations is the development of effective improvised explosive device (IED) disposal (IEDD) technology to aid in activities such as minefield clearing and bomb disposal while minimizing risk to personnel. This paper presents new results in the research and development of a next-generation mobile immersive teleoperated explosive ordnance disposal system. The system incorporates 3D vision, multilateral teleoperation for high-transparency haptic feedback, immersive augmented reality operator control interfaces, and a realistic hardware-in-the-loop (HIL) 3D simulation environment incorporating vehicle and manipulator dynamics for both operator training and algorithm development. In the past year, new algorithms have been developed to facilitate incorporating commercial off-the-shelf (COTS) robotic hardware into the teleoperation system. In particular, a real-time numerical inverse position kinematics algorithm applicable to a wide range of manipulators has been implemented, an inertial measurement unit (IMU) attitude stabilization system for manipulators has been developed and experimentally validated, and a voice-operated manipulator control system has been developed and integrated into the operator control station. The integration of these components into a vehicle simulation environment with half-car vehicle dynamics has also been successfully carried out. A physical half-car plant is currently being constructed for HIL integration with the simulation environment.

  5. Haptic force-feedback devices for the office computer: performance and musculoskeletal loading issues.

    PubMed

    Dennerlein, J T; Yang, M C

    2001-01-01

    Pointing devices, essential input tools for the graphical user interface (GUI) of desktop computers, require precise motor control and dexterity to use. Haptic force-feedback devices provide the human operator with tactile cues, adding the sense of touch to existing visual and auditory interfaces. However, the performance enhancements, comfort, and possible musculoskeletal loading of using a force-feedback device in an office environment are unknown. To test the hypothesis that task completion time and self-reported pain and discomfort improve with the addition of force feedback, 26 people ranging in age from 22 to 44 years performed a point-and-click task 540 times with and without an attractive force field surrounding the desired target. The point-and-click movements were approximately 25% faster with the addition of force feedback (paired t-tests, p < 0.001). Perceived user discomfort and pain, as measured through a questionnaire, were also smaller with the addition of force feedback (p < 0.001). However, this difference decreased as additional distracting force fields were added to the task environment, simulating a more realistic work situation. These results suggest that for a given task, use of a force-feedback device improves performance and potentially reduces musculoskeletal loading during mouse use. Actual or potential applications of this research include human-computer interface design, specifically that of the pointing device extensively used for the graphical user interface.
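
    An attractive force field around a click target can be sketched as a simple spring-like basin; the radius and gain below are illustrative, not values from the study:

```python
import numpy as np

def attractive_force(cursor, target, radius=30.0, k=0.02):
    """Spring-like attractive field: inside `radius` (in pixels) of the
    target the cursor is pulled toward the target center; outside the
    basin no force is applied. Gain k is illustrative."""
    d = np.asarray(target, float) - np.asarray(cursor, float)
    dist = float(np.linalg.norm(d))
    if dist == 0.0 or dist > radius:
        return np.zeros(2)
    return k * d   # pull grows with distance inside the basin
```

    Superimposing several such basins around non-target widgets is one way to model the "distracting force fields" condition described above.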

  6. Multi-Purpose Crew Vehicle Camera Asset Planning: Imagery Previsualization

    NASA Technical Reports Server (NTRS)

    Beaulieu, K.

    2014-01-01

    Using JSC-developed and other industry-standard off-the-shelf 3D modeling, animation, and rendering software packages, the Image Science Analysis Group (ISAG) supports Orion Project imagery planning efforts through dynamic 3D simulation and realistic previsualization of ground-, vehicle-, and air-based camera output.

  7. Functional specialization and convergence in the occipito-temporal cortex supporting haptic and visual identification of human faces and body parts: an fMRI study.

    PubMed

    Kitada, Ryo; Johnsrude, Ingrid S; Kochiyama, Takanori; Lederman, Susan J

    2009-10-01

    Humans can recognize common objects by touch extremely well whenever vision is unavailable. Despite its importance to a thorough understanding of human object recognition, the neuroscientific study of this topic has been relatively neglected. To date, the few published studies have addressed the haptic recognition of nonbiological objects. We now focus on haptic recognition of the human body, a particularly salient object category for touch. Neuroimaging studies demonstrate that regions of the occipito-temporal cortex are specialized for visual perception of faces (fusiform face area, FFA) and other body parts (extrastriate body area, EBA). Are the same category-sensitive regions activated when these components of the body are recognized haptically? Here, we use fMRI to compare brain organization for haptic and visual recognition of human body parts. Sixteen subjects identified exemplars of faces, hands, feet, and nonbiological control objects using vision and haptics separately. We identified two discrete regions within the fusiform gyrus (FFA and the haptic face region) that were each sensitive to both haptically and visually presented faces; however, these two regions differed significantly in their response patterns. Similarly, two regions within the lateral occipito-temporal area (EBA and the haptic body region) were each sensitive to body parts in both modalities, although the response patterns differed. Thus, although the fusiform gyrus and the lateral occipito-temporal cortex appear to exhibit modality-independent, category-sensitive activity, our results also indicate a degree of functional specialization related to sensory modality within these structures.

  8. Solid shape discrimination from vision and haptics: natural objects (Capsicum annuum) and Gibson's "feelies".

    PubMed

    Norman, J Farley; Phillips, Flip; Holmin, Jessica S; Norman, Hideko F; Beers, Amanda M; Boswell, Alexandria M; Cheeseman, Jacob R; Stethen, Angela G; Ronning, Cecilia

    2012-10-01

    A set of three experiments evaluated 96 participants' ability to visually and haptically discriminate solid object shape. In the past, some researchers have found haptic shape discrimination to be substantially inferior to visual shape discrimination, while other researchers have found haptics and vision to be essentially equivalent. A primary goal of the present study was to understand these discrepant past findings and to determine the true capabilities of the haptic system. All experiments used the same task (same vs. different shape discrimination) and stimulus objects (James Gibson's "feelies" and a set of naturally shaped objects--bell peppers). However, the methodology varied across experiments. Experiment 1 used random 3-dimensional (3-D) orientations of the stimulus objects, and the conditions were full-cue (active manipulation of objects and rotation of the visual objects in depth). Experiment 2 restricted the 3-D orientations of the stimulus objects and limited the haptic and visual information available to the participants. Experiment 3 compared restricted and full-cue conditions using random 3-D orientations. We replicated both previous findings in the current study. When we restricted visual and haptic information (and placed the stimulus objects in the same orientation on every trial), the participants' visual performance was superior to that obtained for haptics (replicating the earlier findings of Davidson et al. in Percept Psychophys 15(3):539-543, 1974). When the circumstances resembled those of ordinary life (e.g., participants able to actively manipulate objects and see them from a variety of perspectives), we found no significant difference between visual and haptic solid shape discrimination.

  9. A study on haptic collaborative game in shared virtual environment

    NASA Astrophysics Data System (ADS)

    Lu, Keke; Liu, Guanyang; Liu, Lingzhi

    2013-03-01

    This paper introduces a study of a collaborative game in a shared virtual environment with haptic feedback over computer networks. A collaborative task was used in which players located at remote sites played the game together. Compared to traditional networked multiplayer games, the player receives both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only, and visual plus haptic feedback. The goal of the experiment is to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback is beneficial for performance in a collaborative game in a shared virtual environment. The outcomes of this research can have a powerful impact on networked computer games.

  10. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

    This paper presents a comparative study of network collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared across several experiments. The user tests of our system include several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.

  11. Development of Velocity Guidance Assistance System by Haptic Accelerator Pedal Reaction Force Control

    NASA Astrophysics Data System (ADS)

    Yin, Feilong; Hayashi, Ryuzo; Raksincharoensak, Pongsathorn; Nagai, Masao

    This research proposes a haptic velocity guidance assistance system for realizing eco-driving and enhancing traffic capacity by cooperating with ITS (Intelligent Transportation Systems). The proposed guidance system generates the desired accelerator pedal stroke with respect to the desired velocity obtained from ITS, considering vehicle dynamics, and conveys the desired pedal stroke to the driver via a haptic pedal whose reaction force is controllable, guiding the driver to track the desired velocity in real time. The main purpose of this paper is to discuss the feasibility of haptic velocity guidance. A research prototype was developed on the TUAT Driving Simulator (DS) by attaching a low-inertia, low-friction motor to the pedal, which preserves the characteristics of the original pedal when not actuated, and by implementing the desired pedal stroke calculation and the reaction force controller. The haptic guidance maneuver is designed based on human pedal-stepping experiments. A simple velocity profile with acceleration, deceleration, and cruising phases, synthesized from naturalistic driving, was used to test the proposed system. Experimental results from nine drivers show that the haptic guidance provides high accuracy and quick response in velocity tracking. These results indicate that haptic guidance is a promising velocity guidance method from the viewpoint of the HMI (Human Machine Interface).

  12. Haptic perception accuracy depending on self-produced movement.

    PubMed

    Park, Chulwook; Kim, Seonjin

    2014-01-01

    This study measured whether self-produced movement influences haptic perception ability (Experiment 1) as well as the factors associated with the level of influence (Experiment 2) in racket sports. In Experiment 1, the haptic perception accuracy of five male table tennis experts and five male novices was examined under two conditions (no movement vs. movement). In Experiment 2, the haptic afferent subsystems of five male table tennis experts and five male novices were investigated in the self-produced-movement condition only. Inferential statistics (ANOVA, t-tests) and custom-made instrumentation (a shock and vibration sensor, Qualisys Track Manager) were used to determine haptic perception accuracy (Experiments 1 and 2) and its association with expertise. The results show that expert players achieve higher accuracy with less variability (in racket vibration and angle) than novices, especially in performances coupled with self-produced movement. The important finding is that, in terms of accuracy, skill-associated differences were magnified during self-produced movement. To explain the origin of this difference between experts and novices, the functional variability of haptic afferent subsystems can serve as a reference. The two factors investigated in this study (self-produced accuracy and the variability of haptic features) would be useful criteria for educators in racket sports and suggest a broader hypothesis for further research into the effects of haptic accuracy related to variability.

  13. Mechatronic design of haptic forceps for robotic surgery.

    PubMed

    Rizun, P; Gunn, D; Cox, B; Sutherland, G

    2006-12-01

    Haptic feedback increases operator performance and comfort during telerobotic manipulation. Feedback of grasping pressure is critical in many microsurgical tasks, yet no haptic interface for surgical tools is commercially available. Literature on the psychophysics of touch was reviewed to define the spectrum of human touch perception and the fidelity requirements of an ideal haptic interface. Mechanical design and control literature was reviewed to translate the psychophysical requirements into engineering specifications. High-fidelity haptic forceps were then developed through an iterative process between engineering and surgery. The forceps are a modular device that integrates with a haptic hand controller to add force feedback for tool actuation in telerobotic or virtual surgery. Their overall length is 153 mm and their mass is 125 g. A contact-free voice coil actuator generates force feedback at frequencies up to 800 Hz. Maximum force output is 6 N (2 N continuous) and the force resolution is 4 mN. The forceps employ a contact-free magnetic position sensor as well as micro-machined accelerometers to measure opening/closing acceleration. Position resolution is 0.6 microm with 1.3 microm RMS noise. The forceps can simulate stiffnesses greater than 20 N/mm or impedances smaller than 15 g with no noticeable haptic artifacts or friction. As telerobotic surgery evolves, haptics will play an increasingly important role. Copyright 2006 John Wiley & Sons, Ltd.

  14. A computerized system for portrayal of landscape alterations

    Treesearch

    A. E. Stevenson; J. A. Conley; J. B. Carey

    1979-01-01

    The growing public awareness of and participation in the visual resource decision process has stimulated interest in finding improved means of accurately and realistically displaying proposed alterations. Traditional artist renderings often lack the accuracy and objectivity needed for critical decisions. One approach, using computer graphics, led to the MOSAIC system...

  15. Haptic Feedback in Robot-Assisted Minimally Invasive Surgery

    PubMed Central

    Okamura, Allison M.

    2009-01-01

    Purpose of Review Robot-assisted minimally invasive surgery (RMIS) holds great promise for improving the accuracy and dexterity of a surgeon while minimizing trauma to the patient. However, widespread clinical success with RMIS has been marginal. It is hypothesized that the lack of haptic (force and tactile) feedback presented to the surgeon is a limiting factor. This review explains the technical challenges of creating haptic feedback for robot-assisted surgery and provides recent results that evaluate the effectiveness of haptic feedback in mock surgical tasks. Recent Findings Haptic feedback systems for RMIS are still under development and evaluation. Most provide only force feedback, with limited fidelity. The major challenge at this time is sensing forces applied to the patient. A few tactile feedback systems for RMIS have been created, but their practicality for clinical implementation needs to be shown. It is particularly difficult to sense and display spatially distributed tactile information. The cost-benefit ratio for haptic feedback in RMIS has not been established. Summary The designs of existing commercial RMIS systems are not conducive for force feedback, and creative solutions are needed to create compelling tactile feedback systems. Surgeons, engineers, and neuroscientists should work together to develop effective solutions for haptic feedback in RMIS. PMID:19057225

  16. Learning, retention, and generalization of haptic categories

    NASA Astrophysics Data System (ADS)

    Do, Phuong T.

    This dissertation explored how haptic concepts are learned, retained, and generalized to the same or a different modality. Participants learned to classify objects into three categories either visually or haptically via different training procedures, followed by an immediate or delayed transfer test. Experiment I involved visual versus haptic learning and transfer. Intermodal matching between vision and haptics was investigated in Experiment II. Experiments III and IV examined intersensory conflict in within- and between-category bimodal situations to determine the degree of perceptual dominance between sight and touch. Experiment V explored the intramodal relationship between similarity and categorization in a psychological space, as revealed by MDS analysis of similarity judgments. The major findings were: (1) visual examination resulted in relatively higher performance accuracy than haptic learning; (2) systematic training produced better category learning of haptic concepts across all modality conditions; (3) the category prototypes were rated newer than any transfer stimulus, both immediately following learning and after a week's delay; and (4) although they converged at the apex of two transformational trajectories, the category prototypes became more central to their respective categories and increasingly structured as a function of learning. Implications for theories of multimodal similarity and categorization behavior are discussed in terms of discrimination learning, sensory integration, and dominance relations.

  17. A laparoscopy-based method for BRDF estimation from in vivo human liver.

    PubMed

    Nunes, A L P; Maciel, A; Cavazzola, L T; Walter, M

    2017-01-01

    While improved visual realism is known to enhance training effectiveness in virtual surgery simulators, advances in realistic rendering for these simulators have been slower than for similar simulations of man-made scenes. One of the main reasons is that in vivo data are hard to gather and process. In this paper, we propose the analysis of videolaparoscopy data to compute the Bidirectional Reflectance Distribution Function (BRDF) of living organs as an input to physically based rendering algorithms. From the interplay between light and organic matter recorded in video images, we define a process capable of establishing the BRDF of inside-the-body organic surfaces. We present a case study on the liver with patient-specific rendering under global illumination. Results show that despite the limited range of motion allowed within the body, the computed BRDF presents high coverage of the sampled regions and produces plausible renderings. Copyright © 2016 Elsevier B.V. All rights reserved.
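
    As a deliberately minimal illustration of fitting a reflectance model to image samples, a single Lambertian albedo can be estimated by least squares from observed intensities and incidence angles; the paper estimates a full BRDF, so this is only a toy stand-in:

```python
import numpy as np

def lambertian_albedo(intensities, cos_thetas, light_power=1.0):
    """Least-squares fit of a single Lambertian albedo rho from
    observed pixel intensities I ~= rho * light_power * cos(theta)."""
    x = light_power * np.asarray(cos_thetas, float)
    y = np.asarray(intensities, float)
    return float(np.dot(x, y) / np.dot(x, x))   # closed-form 1D fit
```

    A full BRDF fit generalizes this idea to many view/light directions and a richer reflectance model with specular terms.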

  18. Anatomy, technology, art, and culture: toward a realistic perspective of the brain.

    PubMed

    Cavalcanti, Daniel D; Feindel, William; Goodrich, James T; Dagi, T Forcht; Prestigiacomo, Charles J; Preul, Mark C

    2009-09-01

    In the 15th century, brain illustration began to change from a schematic system that involved scant objective rendering of the brain, to accurate depictions based on anatomical dissections that demanded significant artistic talent. Notable examples of this innovation are the drawings of Leonardo da Vinci (1498-1504), Andreas Vesalius' association with the bottega of Titian to produce the drawings of Vesalius' De humani corporis fabrica (1543), and Christopher Wren's illustrations for Thomas Willis' Cerebri Anatome (1664). These works appeared during the Renaissance and Age of Enlightenment, when advances in brain imaging, or really brain rendering, reflected not only the abilities and dedications of the artists, but also the influences of important cultural and scientific factors. Anatomy and human dissection became popular social phenomena as well as scholarly pursuits, linked with the world of the fine arts. The working philosophy of these artists involved active participation in both anatomical study and illustration, and the belief that their discoveries of the natural world could best be communicated by rendering them in objective form (that is, with realistic perspective). From their studies emerged the beginning of contemporary brain imaging. In this article, the authors examine how the brain began to be imaged in realism within a cultural and scientific milieu that witnessed the emergence of anatomical dissection, the geometry of linear perspective, and the closer confluence of art and science.

  19. Interactive browsing of 3D environment over the Internet

    NASA Astrophysics Data System (ADS)

    Zhang, Cha; Li, Jin

    2000-12-01

    In this paper, we describe a system for wandering through a realistic environment over the Internet. The environment is captured with the concentric mosaic, compressed via the reference block coder (RBC), and accessed and delivered over the Internet through the virtual media (Vmedia) access protocol. Capturing the environment with the concentric mosaic is easy: we mount a camera at the end of a level beam and shoot images as the beam rotates. The huge dataset of the concentric mosaic is then compressed with the RBC, which is specifically designed for both high compression efficiency and just-in-time (JIT) rendering. Through the JIT rendering function, only a portion of the RBC bitstream is accessed, decoded, and rendered for each virtual view. A multimedia communication protocol, the Vmedia protocol, is then proposed to deliver the compressed concentric mosaic data over the Internet. Only the bitstream segments corresponding to the current view are streamed. Moreover, the delivered bitstream segments are managed by a local Vmedia cache so that frequently used segments need not be streamed repeatedly, and Vmedia can handle an RBC bitstream larger than its memory capacity. A Vmedia concentric mosaic interactive browser was developed in which the user can freely wander through a realistic environment, e.g., rotate, walk forward/backward, and sidestep, even under a tight bandwidth of 33.6 kbps.
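
    A local segment cache of this kind behaves like an LRU cache. A minimal sketch follows (segment keys and capacity are illustrative; the real Vmedia cache also manages partial bitstreams and prefetching):

```python
from collections import OrderedDict

class SegmentCache:
    """LRU cache for compressed bitstream segments: recently used
    segments stay local so they need not be re-streamed."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, seg_id):
        if seg_id in self.store:
            self.store.move_to_end(seg_id)   # mark as recently used
            return self.store[seg_id]
        return None                          # miss: segment must be streamed

    def put(self, seg_id, data):
        self.store[seg_id] = data
        self.store.move_to_end(seg_id)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used
```

    On each view change, the renderer asks the cache for the segments covering the new view and streams only the misses.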

  20. The contributions of vision and haptics to reaching and grasping

    PubMed Central

    Stone, Kayla D.; Gonzalez, Claudia L. R.

    2015-01-01

    This review aims to provide a comprehensive outlook on the sensory (visual and haptic) contributions to reaching and grasping. The focus is on studies of developing children, healthy and neuropsychological populations, and sensory-deprived individuals. Studies have suggested a right-hand/left-hemisphere specialization for visually guided grasping and a left-hand/right-hemisphere specialization for haptically guided object recognition. This poses the interesting possibility that when vision is not available and grasping relies heavily on the haptic system, there is an advantage to using the left hand. We review the evidence for this possibility and dissect the unique contributions of the visual and haptic systems to grasping. We ultimately discuss how the integration of these two sensory modalities shapes hand preference. PMID:26441777

  1. Robotic guidance benefits the learning of dynamic, but not of spatial movement characteristics.

    PubMed

    Lüttgen, Jenna; Heuer, Herbert

    2012-10-01

    Robotic guidance is an engineered form of haptic-guidance training and is intended to enhance motor learning in rehabilitation, surgery, and sports. However, its benefits (and pitfalls) are still debated. Here, we investigate the effects of different presentation modes on the reproduction of a spatiotemporal movement pattern. In three different groups of participants, the movement was demonstrated in three different modalities: visual, haptic, and visuo-haptic. After demonstration, participants had to reproduce the movement in two alternating recall conditions: haptic and visuo-haptic. Performance of the three groups during recall was compared with regard to spatial and dynamic movement characteristics. After haptic presentation, participants showed superior dynamic accuracy, whereas after visual presentation, participants performed better with regard to spatial accuracy. Added visual feedback during recall always led to enhanced performance, independent of the movement characteristic and the presentation modality. These findings substantiate the distinct benefits of different presentation modes for different movement characteristics. In particular, robotic guidance is beneficial for the learning of dynamic, but not of spatial, movement characteristics.

  2. Augmented reality and haptic interfaces for robot-assisted surgery.

    PubMed

    Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N

    2012-03-01

    Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation. Copyright © 2011 John Wiley & Sons, Ltd.
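
    A forbidden-region virtual fixture of the kind used in the surface-tracing task can be reduced to a one-dimensional sketch: a penalty force proportional to how far the tool has penetrated the forbidden region. The function name, stiffness value, and units below are hypothetical:

```python
def virtual_fixture_force(tool_z, boundary_z, stiffness=500.0):
    """One-dimensional forbidden-region virtual fixture (sketch).

    The region z < boundary_z is forbidden; if the tool penetrates it,
    a spring force proportional to the penetration pushes it back out.
    Units are arbitrary for illustration.
    """
    penetration = boundary_z - tool_z
    if penetration <= 0.0:
        return 0.0                  # outside the forbidden region: no force
    return stiffness * penetration  # restoring force toward the boundary

# Outside the region the tool moves freely; inside, it is pushed back.
free_force = virtual_fixture_force(0.5, 0.0)
push_back = virtual_fixture_force(-0.01, 0.0)
```

    A full implementation would apply the same idea along the normal of an arbitrary 3D boundary estimated from vision, but the spring-on-penetration structure is the same.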

  3. Design of a 7-DOF haptic master using magneto-rheological devices for robot surgery

    NASA Astrophysics Data System (ADS)

    Kang, Seok-Rae; Choi, Seung-Bok; Hwang, Yong-Hoon; Cha, Seung-Woo

    2017-04-01

    This paper presents a 7 degrees-of-freedom (7-DOF) haptic master that is applicable to robot-assisted minimally invasive surgery (RMIS). By utilizing a controllable magneto-rheological (MR) fluid, the haptic master can provide force information to the surgeon during surgery. The proposed haptic master provides three translational motions (X, Y, Z) and four further motions (pitch, yaw, roll, and grasping), all with force feedback capability. It can generate repulsive forces or torques by activating an MR clutch and an MR brake, both designed and manufactured with consideration of the size and output torque required for robotic surgery. A proportional-integral-derivative (PID) controller is then designed and implemented to track torque/force trajectories. It is verified that the proposed haptic master can closely track the desired torque and force arising at the surgical site by controlling the input current applied to the MR clutch and brake.
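
    The torque-tracking loop could look like the following textbook discrete PID controller driving a toy first-order actuator model. The gains, the plant model, and the mapping from controller output to coil current are illustrative assumptions, not the authors' implementation:

```python
class PID:
    """Textbook discrete PID controller (sketch; gains are illustrative,
    not tuned for any real MR clutch or brake)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # In an MR device this output would set the coil current.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order plant: output torque relaxes toward the commanded current.
pid = PID(kp=2.0, ki=5.0, kd=0.0, dt=0.001)
torque, desired = 0.0, 1.0
for _ in range(5000):
    current = pid.update(desired, torque)
    torque += (current - torque) * 0.01  # simplistic actuator dynamics
```

    The integral term removes the steady-state error that a purely proportional controller would leave against the plant's own dynamics.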

  4. Haptic fMRI: Reliability and performance of electromagnetic haptic interfaces for motion and force neuroimaging experiments.

    PubMed

    Menon, Samir; Zhu, Jack; Goyal, Deeksha; Khatib, Oussama

    2017-07-01

    Haptic interfaces compatible with functional magnetic resonance imaging (Haptic fMRI) promise to enable rich motor neuroscience experiments that study how humans perform complex manipulation tasks. Here, we present a large-scale study (176 scan runs, 33 scan sessions) that characterizes the reliability and performance of one such electromagnetically actuated device, Haptic fMRI Interface 3 (HFI-3). We outline engineering advances that ensured HFI-3 did not interfere with fMRI measurements. Observed fMRI temporal noise levels with HFI-3 operating were at the fMRI baseline (0.8% noise-to-signal ratio). We also present results from HFI-3 experiments demonstrating that high resolution fMRI can be used to study spatio-temporal patterns of blood oxygenation level-dependent (BOLD) activation. These experiments include motor planning, goal-directed reaching, and visually guided force control. Observed fMRI responses are consistent with the existing literature, which supports Haptic fMRI's effectiveness at studying the brain's motor regions.

  5. Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.

    PubMed

    Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico

    2017-01-01

    Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday lives. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" and the Google Translate real-time sign interpretation app. Even though AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all three tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.

  6. Effects of Motion and Figural Goodness on Haptic Object Perception in Infancy.

    ERIC Educational Resources Information Center

    Streri, Arlette; Spelke, Elizabeth S.

    1989-01-01

    After haptic habituation to a ring display, infants perceived the rings in two experiments as parts of one connected object. In both haptic and visual modes, infants appeared to perceive object unity by analyzing motion but not by analyzing figural goodness. (RH)

  7. Teaching Classical Mechanics Concepts Using Visuo-Haptic Simulators

    ERIC Educational Resources Information Center

    Neri, Luis; Noguez, Julieta; Robledo-Rella, Victor; Escobar-Castillejos, David; Gonzalez-Nucamendi, Andres

    2018-01-01

    In this work, the design and implementation of several physics scenarios using haptic devices are presented and discussed. Four visuo-haptic applications were developed for an undergraduate engineering physics course. Experiments with experimental and control groups were designed and implemented. Activities and exercises related to classical…

  8. Colour computer-generated holography for point clouds utilizing the Phong illumination model.

    PubMed

    Symeonidou, Athanasia; Blinder, David; Schelkens, Peter

    2018-04-16

    A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm produces realistic-looking objects without any noteworthy increase in the computational cost.
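
    The Phong model referenced above combines ambient, diffuse, and specular terms at each surface point. A minimal scalar-intensity sketch follows (a colour CGH would evaluate it per wavelength; the coefficient values here are arbitrary, and all vectors are assumed to point away from the surface):

```python
def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.2, shininess=32):
    """Classic Phong illumination at a surface point, single white light.

    Returns a scalar intensity: ambient + diffuse (Lambert) + specular
    (reflected-ray lobe). Illustrative coefficients only.
    """
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diff = max(dot(n, l), 0.0)
    # Reflection of the light direction about the normal: r = 2(n.l)n - l
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    spec = max(dot(r, v), 0.0) ** shininess if diff > 0 else 0.0
    return ka + kd * diff + ks * spec

# Head-on light and viewer give full intensity; grazing light leaves
# only the ambient term.
head_on = phong((0, 0, 1), (0, 0, 1), (0, 0, 1))
grazing = phong((0, 0, 1), (1, 0, 0), (0, 0, 1))
```

    In the CGH setting this per-point intensity modulates the amplitude of each point source before the hologram is computed.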

  9. Operator dynamics for stability condition in haptic and teleoperation system: A survey.

    PubMed

    Li, Hongbing; Zhang, Lei; Kawashima, Kenji

    2018-04-01

    Current haptic systems ignore the varying impedance of the human hand, with its countless configurations, and thus cannot recreate complex haptic interactions. The literature lacks a comprehensive survey of the methods proposed to address this, and this study attempts to bridge that gap. The paper includes an extensive review of human arm impedance modeling and control deployed to address inherent stability and transparency issues in haptic interaction and teleoperation systems. A detailed classification and comparative study of the various contributions to human arm modeling are presented and summarized in tables and diagrams. The main challenges in modeling human arm impedance for haptic robotic applications are identified, and possible future research directions are outlined based on the gaps identified in the survey. Copyright © 2018 John Wiley & Sons, Ltd.

  10. A Haptic-Enhanced System for Molecular Sensing

    NASA Astrophysics Data System (ADS)

    Comai, Sara; Mazza, Davide

    The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically enhanced system for the manipulation and tactile exploration of molecules. The geometrical models of the molecules are extracted from either theoretical or empirical data, using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a huge amount of numerical data in a more comprehensible way. The developed tool can be used for either teaching or research purposes, owing to its grounding in both theoretical and experimental data.

  11. A Haptics Symposium Retrospective: 20 Years

    NASA Technical Reports Server (NTRS)

    Colgate, J. Edward; Adelstein, Bernard

    2012-01-01

    The very first "Haptics Symposium" actually went by the name "Issues in the Development of Kinesthetic Displays of Teleoperation and Virtual environments." The word "Haptic" didn't make it into the name until the next year. Not only was the most important word absent but so were RFPs, journals and commercial markets. And yet, as we prepare for the 2012 symposium, haptics is a thriving and amazingly diverse field of endeavor. In this talk we'll reflect on the origins of this field and on its evolution over the past twenty years, as well as the evolution of the Haptics Symposium itself. We hope to share with you some of the excitement we've felt along the way, and that we continue to feel as we look toward the future of our field.

  12. Invited Article: A review of haptic optical tweezers for an interactive microworld exploration

    NASA Astrophysics Data System (ADS)

    Pacoret, Cécile; Régnier, Stéphane

    2013-08-01

    This paper is the first review of haptic optical tweezers, a new technique which associates force feedback teleoperation with optical tweezers. This technique allows users to explore the microworld by sensing and exerting picoNewton-scale forces with trapped microspheres. Haptic optical tweezers also allow improved dexterity of micromanipulation and micro-assembly. One of the challenges of this technique is to sense and magnify picoNewton-scale forces by a factor of 10^12 to enable human operators to perceive interactions that they have never experienced before, such as adhesion phenomena, extremely low inertia, and high frequency dynamics of extremely small objects. The design of optical tweezers for high quality haptic feedback is challenging, given the requirements for very high sensitivity and dynamic stability. The concept, design process, and specification of optical tweezers reviewed here are focused on those intended for haptic teleoperation. In this paper, two new specific designs as well as the current state of the art are presented. Moreover, the remaining important issues are identified for further developments. The initial results obtained are promising and demonstrate that optical tweezers have a significant potential for haptic exploration of the microworld. Haptic optical tweezers will become an invaluable tool for force feedback micromanipulation of biological samples and nano- and micro-assembly parts.

  13. Psychophysical evaluation of a variable friction tactile interface

    NASA Astrophysics Data System (ADS)

    Samur, Evren; Colgate, J. Edward; Peshkin, Michael A.

    2009-02-01

    This study explores the haptic rendering capabilities of a variable friction tactile interface through psychophysical experiments. In order to obtain a deeper understanding of the sensory resolution associated with the Tactile Pattern Display (TPaD), friction discrimination experiments are conducted. During the experiments, subjects are asked to explore the glass surface of the TPaD using their bare index fingers, to feel the friction on the surface, and to compare the slipperiness of two stimuli displayed in sequential order. The fingertip position data are collected by an infrared frame, and the normal and translational forces applied by the finger are measured by force sensors attached to the TPaD. The recorded data are used to calculate the coefficient of friction between the fingertip and the TPaD. The experiments determine the just noticeable difference (JND) in friction coefficient for humans interacting with the TPaD.
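
    Under slipping contact, the friction computation described above reduces to Amontons' law: the coefficient of friction is the ratio of tangential to normal force. A minimal sketch, with hypothetical force values in newtons:

```python
def friction_coefficient(tangential_force, normal_force):
    """Kinetic coefficient of friction from measured finger forces.

    mu = |F_tangential| / |F_normal|; valid while the finger is
    slipping across the plate. Forces are in newtons.
    """
    if normal_force == 0:
        raise ValueError("normal force must be nonzero")
    return abs(tangential_force) / abs(normal_force)

# Example: 0.3 N of measured translational force under 0.5 N of normal load.
mu = friction_coefficient(0.3, 0.5)
```

    In practice each trial would average this ratio over the portion of the stroke where slip is established, rather than use a single sample.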

  14. Pop-Art Panels

    ERIC Educational Resources Information Center

    Alford, Joanna

    2012-01-01

    James Rosenquist's giant Pop-art panels included realistic renderings of well-known contemporary foods and objects, juxtaposed with famous people in the news--largely from the 1960s, '70s and '80s--and really serve as visual time capsules. In this article, eighth-graders focus on the style of James Rosenquist to create their own Pop-art panel that…

  15. The Potential for Scientific Collaboration in Virtual Ecosystems

    ERIC Educational Resources Information Center

    Magerko, Brian

    2010-01-01

    This article explores the potential benefits of creating "virtual ecosystems" from real-world data. These ecosystems are intended to be realistic virtual representations of environments that may be costly or difficult to access in person. They can be constructed as 3D worlds rendered from stereo video data, augmented with scientific data, and then…

  16. A Forest Landscape Visualization System

    Treesearch

    Tim McDonald; Bryce Stokes

    1998-01-01

    A forest landscape visualization system was developed and used in creating realistic images depicting how an area might appear if harvested. The system uses a ray-tracing renderer to draw model trees on a virtual landscape. The system includes components to create landscape surfaces from digital elevation data, populate/cut trees within (polygonal) areas, and convert...

  17. HARPERS FERRY, A PLAY ABOUT JOHN BROWN.

    ERIC Educational Resources Information Center

    STAVIS, BARRIE

    This play is a dramatic rendering of John Brown's attack on the armory at Harpers Ferry and his subsequent trial for treason. Although it adheres to the facts of history, they are not treated realistically. "Harpers Ferry" portrays Brown as possessing a pure idealism untainted in the slightest degree by materialism or self-seeking, which…

  18. Haptic fMRI: using classification to quantify task-correlated noise during goal-directed reaching motions.

    PubMed

    Menon, Samir; Quigley, Paul; Yu, Michelle; Khatib, Oussama

    2014-01-01

    Neuroimaging artifacts in haptic functional magnetic resonance imaging (Haptic fMRI) experiments have the potential to induce spurious fMRI activation where there is none, or to make neural activation measurements appear correlated across brain regions when they are actually not. Here, we demonstrate that performing three-dimensional goal-directed reaching motions while operating the Haptic fMRI Interface (HFI) does not create confounding motion artifacts. To test for artifacts, we simultaneously scanned a subject's brain with a customized soft phantom placed a few centimeters away from the subject's left motor cortex. The phantom captured task-related motion and haptic noise, but did not contain associated neural activation measurements. We quantified the task-related information present in fMRI measurements taken from the brain and the phantom by using a linear max-margin classifier to predict whether raw time series data could differentiate between motion planning and reaching. fMRI measurements in the phantom were uninformative (2σ, 45-73%; chance=50%), while those in primary motor, visual, and somatosensory cortex accurately classified task conditions (2σ, 90-96%). We also localized artifacts due to the haptic interface alone by scanning a stand-alone fBIRN phantom while an operator performed haptic tasks outside the scanner's bore with the interface at the same location. The stand-alone phantom had lower temporal noise and similar mean classification accuracy, but a tighter distribution (bootstrap Gaussian fit), compared with the brain phantom. Our results suggest that any fMRI measurement artifacts in Haptic fMRI reaching experiments are dominated by actual neural responses.
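
    A linear max-margin classifier of the kind described can be sketched with a plain hinge-loss subgradient method over short feature windows. The toy "planning" and "reaching" time series below are fabricated for illustration and are far simpler than real fMRI data:

```python
def train_linear_svm(X, y, epochs=200, lr=0.01, lam=0.01):
    """Minimal hinge-loss (SVM-style) trainer via subgradient descent.

    X: list of feature vectors; y: labels in {-1, +1}.
    """
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # inside the margin: hinge subgradient step
                w = [wj - lr * (lam * wj - yi * xj)
                     for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # only the regularizer shrinks the weights
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy "time series" windows: planning is flat, reaching ramps upward.
plan = [[0.0, 0.1, 0.0, 0.1] for _ in range(10)]
reach = [[0.1, 0.4, 0.7, 1.0] for _ in range(10)]
X = plan + reach
y = [-1] * 10 + [1] * 10
w, b = train_linear_svm(X, y)
accuracy = sum(predict(w, b, x) == t for x, t in zip(X, y)) / len(X)
```

    Classification accuracy near chance (as in the phantom) indicates the raw signal carries no task information; accuracy well above chance (as in motor cortex) indicates it does.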

  19. Evaluation of Motor Control Using Haptic Device

    NASA Astrophysics Data System (ADS)

    Nuruki, Atsuo; Kawabata, Takuro; Shimozono, Tomoyuki; Yamada, Masafumi; Yunokuchi, Kazutomo

    When kinesthesia and touch act at the same time, the resulting perception is called haptic perception. This sense plays a key role in motor information for force and position control, so haptic perception matters wherever motor control must be evaluated. The purpose of this paper is to evaluate motor control, namely the perception of heaviness and distance, under normal and fatigued conditions using psychophysical experiments. We used a haptic device in order to generate precise forces and distances; however, there is little precedent for using a haptic device as an evaluation system, so a further purpose of this study is to examine whether the haptic device is useful for evaluating motor control. The psychophysical quantities of force and distance were measured in two kinds of experiments. Eight healthy subjects participated in this study. Stimuli were presented by a haptic device [PHANTOM Omni: SensAble Company]. The subjects compared standard and test stimuli and reported which stimulus felt stronger. For the psychophysical quantity of force, the just noticeable difference (JND) differed significantly between the normal and muscle-fatigue conditions, whereas the point of subjective equality (PSE) did not. In contrast, for the psychophysical quantity of distance, neither JND nor PSE differed between the conditions. These results show that muscle fatigue influenced the control of force but not the control of distance. Moreover, they suggest that the haptic device is useful as an evaluation system for motor control.
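
    The JND and PSE reported above are standard psychometric quantities: the PSE is the test level judged stronger than the standard 50% of the time, and the JND is commonly taken as half the 25%-75% spread. A sketch using linear interpolation follows (real analyses usually fit a sigmoid; the comparison data here are hypothetical):

```python
def pse_and_jnd(levels, p_stronger):
    """Estimate PSE and JND from psychometric data by interpolation.

    levels: test stimulus intensities, sorted ascending.
    p_stronger: proportion of "test felt stronger" responses per level.
    PSE is the 50% point; JND is half the 25%-75% spread.
    """
    def crossing(target):
        for (x0, p0), (x1, p1) in zip(zip(levels, p_stronger),
                                      zip(levels[1:], p_stronger[1:])):
            if p0 <= target <= p1:
                # Linear interpolation between the bracketing levels.
                return x0 + (target - p0) * (x1 - x0) / (p1 - p0)
        raise ValueError("target proportion not bracketed by the data")

    pse = crossing(0.50)
    jnd = (crossing(0.75) - crossing(0.25)) / 2.0
    return pse, jnd

# Hypothetical force-comparison data against a 1.0 N standard stimulus.
levels = [0.8, 0.9, 1.0, 1.1, 1.2]
props = [0.05, 0.25, 0.50, 0.75, 0.95]
pse, jnd = pse_and_jnd(levels, props)
```

    A PSE shifted away from the standard would indicate a perceptual bias, while a larger JND (as found for force under fatigue) indicates coarser discrimination.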

  20. Mental rotation of tactile stimuli: using directional haptic cues in mobile devices.

    PubMed

    Gleeson, Brian T; Provancher, William R

    2013-01-01

    Haptic interfaces have the potential to enrich users' interactions with mobile devices and convey information without burdening the user's visual or auditory attention. Haptic stimuli with directional content, for example, navigational cues, may be difficult to use in handheld applications; the user's hand, where the cues are delivered, may not be aligned with the world, where the cues are to be interpreted. In such a case, the user would be required to mentally transform the stimuli between different reference frames. We examine the mental rotation of directional haptic stimuli in three experiments, investigating: 1) users' intuitive interpretation of rotated stimuli, 2) mental rotation of haptic stimuli about a single axis, and 3) rotation about multiple axes and the effects of specific hand poses and joint rotations. We conclude that directional haptic stimuli are suitable for use in mobile applications, although users do not naturally interpret rotated stimuli in any one universal way. We find evidence of cognitive processes involving the rotation of analog, spatial representations and discuss how our results fit into the larger body of mental rotation research. For small angles (e.g., less than 40 degrees), these mental rotations come at little cost, but rotations with larger misalignment angles impact user performance. When considering the design of a handheld haptic device, our results indicate that hand pose must be carefully considered, as certain poses increase the difficulty of stimulus interpretation. Generally, all tested joint rotations impact task difficulty, but finger flexion and wrist rotation interact to greatly increase the cost of stimulus interpretation; such hand poses should be avoided when designing a haptic interface.
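
    The reference-frame mismatch at the heart of these experiments can be made concrete with a planar sketch: a cue defined in world coordinates must be re-expressed in the hand's rotated frame before it is displayed. The function below is illustrative, not the authors' apparatus code:

```python
import math

def world_to_hand(cue_angle_world, hand_angle_world):
    """Express a planar directional cue in the hand's reference frame.

    cue_angle_world: direction the device should indicate, in world
    coordinates (radians). hand_angle_world: current orientation of
    the hand/device in the world. The user feels the returned angle
    and must mentally apply the inverse rotation to recover the
    world direction, which is the cost the study measures.
    """
    relative = cue_angle_world - hand_angle_world
    # Wrap into (-pi, pi] so the stimulus uses the smallest rotation.
    return math.atan2(math.sin(relative), math.cos(relative))

# A "north" cue (pi/2) delivered to a hand rotated 30 degrees
# counterclockwise arrives rotated by -30 degrees in the hand frame.
in_hand = world_to_hand(math.pi / 2, math.pi / 6)
```

    The wrapping step matters: without it, a 270-degree misalignment would be rendered as a large rotation instead of the equivalent 90-degree one.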

  1. Haptic-2D: A new haptic test battery assessing the tactual abilities of sighted and visually impaired children and adolescents with two-dimensional raised materials.

    PubMed

    Mazella, Anaïs; Albaret, Jean-Michel; Picard, Delphine

    2016-01-01

    To fill an important gap in the psychometric assessment of children and adolescents with impaired vision, we designed a new battery of haptic tests, called Haptic-2D, for visually impaired and sighted individuals aged five to 18 years. Unlike existing batteries, ours uses only two-dimensional raised materials that participants explore using active touch. It is composed of 11 haptic tests, measuring scanning skills, tactile discrimination skills, spatial comprehension skills, short-term tactile memory, and comprehension of tactile pictures. We administered this battery to 138 participants, half of whom were sighted (n=69), and half visually impaired (blind, n=16; low vision, n=53). Results indicated a significant main effect of age on haptic scores, but no main effect of vision or Age × Vision interaction effect. Reliability of test items was satisfactory (Cronbach's alpha, α=0.51-0.84). Convergent validity was good, as shown by a significant correlation (age partialled out) between total haptic scores and scores on the B101 test (rp=0.51, n=47). Discriminant validity was also satisfactory, as attested by a lower but still significant partial correlation between total haptic scores and the raw score on the verbal WISC (rp=0.43, n=62). Finally, test-retest reliability was good (rs=0.93, n=12; interval of one to two months). This new psychometric tool should prove useful to practitioners working with young people with impaired vision. Copyright © 2015 Elsevier Ltd. All rights reserved.
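
    The item reliabilities reported above use Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with made-up score columns (population variances throughout):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of test items.

    items: list of k lists, each holding one item's scores for all
    participants, in the same participant order.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population (1/n) variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two perfectly consistent items give alpha = 1; partially consistent
# items give a lower value.
perfect = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
partial = cronbach_alpha([[1, 2, 3], [1, 3, 2]])
```

    Values in the reported 0.51-0.84 range would sit between these two extremes, indicating moderate to good internal consistency per subscale.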

  2. Performance evaluation of a robot-assisted catheter operating system with haptic feedback.

    PubMed

    Song, Yu; Guo, Shuxiang; Yin, Xuanchun; Zhang, Linshuai; Hirata, Hideyuki; Ishihara, Hidenori; Tamiya, Takashi

    2018-06-20

    In this paper, a novel robot-assisted catheter operating system (RCOS) is proposed as a way to reduce physical stress and X-ray exposure time for physicians during endovascular procedures. The unique design of this system allows the physician to apply conventional bedside catheterization skills (advance, retreat and rotate) to an input catheter, which is placed at the master side to control another, patient catheter placed at the slave side. For this purpose, a magnetorheological (MR) fluids-based master haptic interface has been developed to measure the axial and radial motions of the input catheter, as well as to provide haptic feedback to the physician during the operation. In order to achieve a quick response of the haptic force in the master haptic interface, a Hall sensor-based closed-loop control strategy is employed. On the slave side, a catheter manipulator delivers the patient catheter according to position commands received from the master haptic interface. The contact forces between the patient catheter and the blood vessel system are measured by the force sensor unit of the catheter manipulator. Four levels of haptic force are provided to make the operator aware of the resistance encountered by the patient catheter during the insertion procedure. The catheter manipulator was evaluated for positioning precision, and the time lag from sensed motion to replicated motion was tested. To verify the efficacy of the proposed haptic feedback method, in vitro evaluation experiments were carried out. The results demonstrate that the proposed system can reduce the contact forces between the catheter and the vasculature.

  3. A survey on hair modeling: styling, simulation, and rendering.

    PubMed

    Ward, Kelly; Bertails, Florence; Kim, Tae-Yong; Marschner, Stephen R; Cani, Marie-Paule; Lin, Ming C

    2007-01-01

    Realistic hair modeling is a fundamental part of creating virtual humans in computer graphics. This paper surveys the state of the art in the major topics of hair modeling: hairstyling, hair simulation, and hair rendering. Because of the difficult, often unsolved problems that arise in all these areas, a broad diversity of approaches are used, each with strengths that make it appropriate for particular applications. We discuss each of these major topics in turn, presenting the unique challenges facing each area and describing solutions that have been presented over the years to handle these complex issues. Finally, we outline some of the remaining computational challenges in hair modeling.

  4. Investigations into haptic space and haptic perception of shape for active touch

    NASA Astrophysics Data System (ADS)

    Sanders, A. F. J.

    2008-12-01

    This thesis presents a number of psychophysical investigations into haptic space and haptic perception of shape. Haptic perception is understood to include the two subsystems of the cutaneous sense and kinesthesis. Chapter 2 provides an extensive quantitative study into haptic perception of curvature. I investigated bimanual curvature discrimination of cylindrically curved, hand-sized surfaces. I found that discrimination thresholds were in the same range as unimanual thresholds reported in previous studies. Moreover, the distance between the surfaces or the position of the setup with respect to the observer had no effect on thresholds. Finally, I found idiosyncratic biases: A number of observers judged two surfaces that had different radii as equally curved. Biases were of the same order of magnitude as thresholds. In Chapter 3, I investigated haptic space. Here, haptic space is understood to be (1) the set of observer's judgments of spatial relations in physical space, and (2) a set of constraints by which these judgments are internally consistent. I asked blindfolded observers to construct straight lines in a number of different tasks. I show that the shape of the haptically straight line depends on the task used to produce it. I therefore conclude that there is no unique definition of the haptically straight line and that doubts are cast on the usefulness of the concept of haptic space. In Chapter 4, I present a new experiment into haptic length perception. I show that when observers trace curved pathways with their index finger and judge distance traversed, their distance estimates depend on the geometry of the paths: Lengths of convex, cylindrically curved pathways were overestimated and lengths of concave pathways were underestimated. In addition, I show that a kinematic mechanism must underlie this interaction: (1) the geometry of the path traced by the finger affects movement speed and consequently movement time, and (2) movement time is taken as a measure of traversed length. The study presented in Chapter 5 addresses the question of how kinematic properties of exploratory movements affect perceived shape. I identify a kinematic invariant for the case of a single finger moving across cylindrically curved strips under conditions of slip. I found that the rotation angle of the finger increased linearly with the curvature of the stimulus. In addition, I show that observers took rotation angle as their primary measure of perceived curvature: Observers rotated their finger less on a concave curvature by a constant amount, and consequently, they overestimated the radius of the concave strips compared to the convex ones. Finally, in Chapter 6, I investigated the haptic filled-space illusion for dynamic touch: Observers move their fingertip across an unfilled extent or an extent filled with intermediate stimulations. Previous researchers have reported lengths of filled extents to be overestimated, but the parameters affecting the strength of the illusion are still largely unknown. Factors investigated in this chapter include end point effects, filler density and overall average movement speed.

  5. Haptic Glove Technology: Skill Development through Video Game Play

    ERIC Educational Resources Information Center

    Bargerhuff, Mary Ellen; Cowan, Heidi; Oliveira, Francisco; Quek, Francis; Fang, Bing

    2010-01-01

    This article introduces a recently developed haptic glove system and describes how the participants used a video game that was purposely designed to train them in skills that are needed for the efficient use of the haptic glove. Assessed skills included speed, efficiency, embodied skill, and engagement. The findings and implications for future…

  6. The Role of Visual Experience on the Representation and Updating of Novel Haptic Scenes

    ERIC Educational Resources Information Center

    Pasqualotto, Achille; Newell, Fiona N.

    2007-01-01

    We investigated the role of visual experience on the spatial representation and updating of haptic scenes by comparing recognition performance across sighted, congenitally and late blind participants. We first established that spatial updating occurs in sighted individuals to haptic scenes of novel objects. All participants were required to…

  7. Effect of Auditory Interference on Memory of Haptic Perceptions.

    ERIC Educational Resources Information Center

    Anater, Paul F.

    1980-01-01

    The effect of auditory interference on the processing of haptic information by 61 visually impaired students (8 to 20 years old) was the focus of the research described in this article. It was assumed that as the auditory interference approximated the verbalized activity of the haptic task, accuracy of recall would decline. (Author)

  8. Investigating Students' Ideas about Buoyancy and the Influence of Haptic Feedback

    ERIC Educational Resources Information Center

    Minogue, James; Borland, David

    2016-01-01

    While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of…

  9. Superior haptic-to-visual shape matching in autism spectrum disorders.

    PubMed

    Nakano, Tamami; Kato, Nobumasa; Kitazawa, Shigeru

    2012-04-01

    A weak central coherence theory in autism spectrum disorder (ASD) proposes that a cognitive bias toward local processing in ASD derives from a weakness in integrating local elements into a coherent whole. Using this theory, we hypothesized that shape perception through active touch, which requires sequential integration of sensorimotor traces of exploratory finger movements into a shape representation, would be impaired in ASD. Contrary to our expectation, adults with ASD showed superior performance in a haptic-to-visual delayed shape-matching task compared to adults without ASD. Accuracy in discriminating haptic lengths or haptic orientations, which lies within the somatosensory modality, did not differ between adults with ASD and adults without ASD. Moreover, this superior ability in inter-modal haptic-to-visual shape matching was not explained by the score in a unimodal visuospatial rotation task. These results suggest that individuals with ASD are not impaired in integrating sensorimotor traces into a global visual shape and that their multimodal shape representations and haptic-to-visual information transfer are more accurate than those of individuals without ASD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Haptic Recreation of Elbow Spasticity

    PubMed Central

    Kim, Jonghyun; Damiano, Diane L.

    2013-01-01

    The aim of this paper is to develop a haptic device capable of presenting a standardized recreation of elbow spasticity. Using the haptic device, clinicians will be able to repeatedly practice the assessment of spasticity without requiring patient involvement, and these practice opportunities will help improve the accuracy and reliability of the assessment itself. A haptic elbow spasticity simulator (HESS) was designed and prototyped according to the mechanical requirements for recreating the feel of elbow spasticity. Based on data collected from subjects with elbow spasticity, a mathematical model representing elbow spasticity is proposed. To differentiate the feel of each score on the Modified Ashworth Scale (MAS), parameters of the model were obtained separately for three MAS scores: 1, 1+, and 2. The implemented haptic recreation was evaluated by experienced clinicians who were asked to assign MAS scores by manipulating the haptic device. The clinicians who participated in the study were blinded to each other’s scores and to the given models. They distinguished the three models, and the MAS scores they assigned to the recreated models matched 100% with the original MAS scores from the patients. PMID:22275660
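    A minimal sketch of how a spasticity model of this kind can be parameterized per MAS score: a resistive torque with elastic and viscous terms plus a velocity-dependent "catch". The model structure and every parameter value here are illustrative assumptions, not the model identified in the paper.

```python
import math

# Hypothetical per-MAS-score parameters (stiffness K, damping B, catch gain G);
# illustrative values only -- the paper fits its own model to patient data.
MAS_PARAMS = {
    "1":  {"K": 0.5, "B": 0.05, "G": 1.0},
    "1+": {"K": 0.8, "B": 0.08, "G": 2.0},
    "2":  {"K": 1.2, "B": 0.12, "G": 3.5},
}

def spastic_torque(theta, omega, score, catch_angle=math.radians(60)):
    """Resistive elbow torque (N*m): elastic + viscous terms, plus a
    velocity-dependent term once the elbow passes a threshold angle
    during extension (the spastic 'catch')."""
    p = MAS_PARAMS[score]
    torque = p["K"] * theta + p["B"] * omega
    if theta > catch_angle and omega > 0:
        torque += p["G"] * omega  # abrupt extra resistance at the catch
    return torque
```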

  11. Control of a haptic gear shifting assistance device utilizing a magnetorheological clutch

    NASA Astrophysics Data System (ADS)

    Han, Young-Min; Choi, Seung-Bok

    2014-10-01

    This paper proposes a haptic, clutch-driven gear-shifting assistance device that helps the driver decide when to shift the gear of a transmission system. To achieve this goal, a magnetorheological (MR) fluid-based clutch is devised to accommodate the rotary motion of the accelerator pedal into which it is integrated. The proposed MR clutch is then manufactured, and its transmission torque is experimentally evaluated as a function of magnetic field intensity. The manufactured MR clutch is integrated with the accelerator pedal to transmit a haptic cue signal to the driver. The central control issue is to cue the driver to shift the gear via the haptic force. Therefore, a gear-shifting decision algorithm is constructed by considering the vehicle engine speed together with engine combustion dynamics, vehicle dynamics and driving resistance. The algorithm is then integrated with a compensation strategy for attaining the desired haptic force. In this work, the compensator is also developed and implemented through a discrete version of the inverse hysteretic model. The control performances, such as the haptic force tracking responses and fuel consumption, are experimentally evaluated.
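    The decision-then-cue structure described above can be sketched as a simple engine-speed threshold rule followed by a haptic force command. The RPM thresholds and force level are illustrative placeholders; the paper derives its decision algorithm from engine combustion dynamics, vehicle dynamics and driving resistance, and sets the clutch current through an inverse-hysteresis compensator.

```python
def shift_cue(rpm, upshift_rpm=2600, downshift_rpm=1200):
    """Return 'up', 'down', or None depending on engine speed.
    Threshold values are illustrative assumptions."""
    if rpm >= upshift_rpm:
        return "up"
    if rpm <= downshift_rpm:
        return "down"
    return None

def haptic_force(cue, desired_N=5.0):
    """Desired haptic cue force (N) on the accelerator pedal; in the paper
    the MR clutch current is then set by a compensator (not modeled here)."""
    return desired_N if cue is not None else 0.0
```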

  12. Torque Measurement of 3-DOF Haptic Master Operated by Controllable Electrorheological Fluid

    NASA Astrophysics Data System (ADS)

    Oh, Jong-Seok; Choi, Seung-Bok; Lee, Yang-Sub

    2015-02-01

    This work presents a torque measurement method for a 3-degree-of-freedom (3-DOF) haptic master featuring a controllable electrorheological (ER) fluid. To convey the feel of an organ to the surgeon, the ER haptic master, which can generate the repulsive torque of an organ, is used as a remote controller for a surgical robot. Since accurate representation of organ feel is essential to the success of robot-assisted surgery, a proper torque measurement method for the 3-DOF ER haptic master is indispensable. After describing the structural configuration of the haptic master, the torque models of the ER spherical joint are mathematically derived from the Bingham model of the ER fluid. A new type of haptic device with pitching, rolling, and yawing motions is then designed and manufactured using a spherical joint mechanism. Subsequently, the field-dependent parameters of the Bingham model are identified, and the repulsive torque generated as a function of the applied electric field is measured. In addition, to verify the effectiveness of the proposed torque model, simulated and measured torques are compared.
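    The Bingham-model structure the torque model builds on can be sketched as a field-dependent yield term plus a viscous term. The power-law yield stress and the geometry constants below are illustrative assumptions, not the parameters identified in the paper.

```python
def bingham_torque(omega, E, alpha=0.05, beta=1.2, eta=0.1, c_y=1.0, c_v=1.0):
    """Simplified joint torque from a Bingham model of an ER fluid:
    tau_y(E) = alpha * E**beta is the field-dependent yield stress, eta the
    plastic viscosity, and c_y, c_v stand in for the joint geometry factors.
    All parameter values are illustrative."""
    tau_y = alpha * E ** beta          # field-dependent yield stress
    sign = 1.0 if omega >= 0 else -1.0
    return sign * c_y * tau_y + c_v * eta * omega
```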

  13. Dynamics modeling for parallel haptic interfaces with force sensing and control.

    PubMed

    Bernstein, Nicholas; Lawrence, Dale; Pao, Lucy

    2013-01-01

    Closed-loop force control can be used on haptic interfaces (HIs) to mitigate the effects of mechanism dynamics. A single multidimensional force-torque sensor is often employed to measure the interaction force between the haptic device and the user's hand. The parallel haptic interface at the University of Colorado (CU) instead employs smaller 1D force sensors oriented along each of the five actuating rods to build up a 5D force vector. This paper shows that a particular manipulandum/hand partition in the system dynamics is induced by the placement and type of force sensing, and discusses the implications on force and impedance control for parallel haptic interfaces. The details of a "squaring down" process are also discussed, showing how to obtain reduced degree-of-freedom models from the general six degree-of-freedom dynamics formulation.
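    The idea of assembling a 5-D force vector from five axial 1-D sensor readings can be sketched as a linear map. The matrix below is a stand-in for the kinematic mapping the paper derives from the CU device's geometry.

```python
import numpy as np

def task_space_force(rod_forces, rod_map):
    """Combine scalar axial readings from five 1-D rod sensors into a 5-D
    task-space force vector. rod_map is a 5x5 matrix whose i-th row maps
    rod i's axial force into task space; here it is an illustrative
    stand-in for the device's actual kinematic mapping."""
    return np.asarray(rod_map, float).T @ np.asarray(rod_forces, float)
```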

  14. Development of haptic system for surgical robot

    NASA Astrophysics Data System (ADS)

    Gang, Han Gyeol; Park, Jiong Min; Choi, Seung-Bok; Sohn, Jung Woo

    2017-04-01

    In this paper, a new type of haptic system for surgical robot applications is proposed and its performance is evaluated experimentally. The proposed haptic system consists of an effective master device and a precision slave robot. The master device has 3-DOF rotational motion matching that of the human wrist. It has a lightweight structure with a gyro sensor for position measurement and three small MR brakes for repulsive torque generation. The slave robot achieves 3-DOF rotational motion using servomotors and a five-bar linkage, and a torque sensor is used to measure resistive torque. It has been experimentally demonstrated that the proposed haptic system performs well in tracking the desired position and repulsive torque. It can be concluded that the proposed haptic system can be effectively applied to surgical robot systems in the field.

  15. fMRI-Compatible Electromagnetic Haptic Interface.

    PubMed

    Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S

    2005-01-01

    A new haptic interface device is suggested for use in functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque without degrading image quality through the currents applied in the device. The haptic device was tested in a 3 T MR scanner. With a current of up to 1 A and a distance of 1 m from the focal point of the MR scanner, it was possible to generate torques of up to 4 N·m. Within these boundaries, image quality was not affected.
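    The torque scale of such a Lorentz-force actuator follows from tau = N·I·A·B for a coil of N turns and area A in a field B (coil normal perpendicular to B). The turn count, coil area and local field value below are illustrative choices, picked only so that 1 A lands on the reported order of magnitude of about 4 N·m; they are not the device's actual parameters.

```python
def coil_torque(current_A, turns=500, area_m2=0.01, B_T=0.8):
    """Maximum Lorentz torque on a current-carrying coil in a static
    magnetic field: tau = N * I * A * B. Parameter values are
    illustrative assumptions, not the device's specifications."""
    return turns * current_A * area_m2 * B_T
```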

  16. Assessment of Haptic Interaction for Home-Based Physical Tele-Therapy using Wearable Devices and Depth Sensors.

    PubMed

    Barmpoutis, Angelos; Alzate, Jose; Beekhuizen, Samantha; Delgado, Horacio; Donaldson, Preston; Hall, Andrew; Lago, Charlie; Vidal, Kevin; Fox, Emily J

    2016-01-01

    In this paper a prototype system is presented for home-based physical tele-therapy using a wearable device for haptic feedback. The haptic feedback is generated as a sequence of vibratory cues from 8 vibration motors equally spaced along an elastic wearable band. The motors guide the patients' movement as they perform a prescribed exercise routine, replacing the physical therapist's haptic guidance in an unsupervised or remotely supervised home-based therapy session. A pilot study of 25 human subjects was performed that focused on: a) testing the capability of the system to guide the users along arbitrary motion paths in space and b) comparing the motion of the users during typical physical therapy exercises with and without haptic guidance. The results demonstrate the efficacy of the proposed system.
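    Guidance with 8 equally spaced vibrators reduces, at its core, to mapping a desired correction direction to the nearest motor on the band. A minimal sketch of that mapping, assuming motor 0 sits at angle 0 and indices increase counterclockwise:

```python
import math

def motor_for_direction(angle_rad, n_motors=8):
    """Map a desired movement-correction direction (radians, 0 = motor 0)
    to the index of the nearest of n_motors equally spaced vibrators."""
    step = 2 * math.pi / n_motors
    return round((angle_rad % (2 * math.pi)) / step) % n_motors
```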

  17. The effect of haptic guidance and visual feedback on learning a complex tennis task.

    PubMed

    Marchal-Crespo, Laura; van Raai, Mark; Rauter, Georg; Wolf, Peter; Riener, Robert

    2013-11-01

    While haptic guidance can improve ongoing performance of a motor task, several studies have found that it ultimately impairs motor learning. However, some recent studies suggest that haptic demonstration of optimal timing, rather than movement magnitude, enhances learning in subjects trained with haptic guidance. Timing of an action plays a crucial role in the proper accomplishment of many motor skills, such as hitting a moving object (discrete timing task) or learning a velocity profile (time-critical tracking task). The aim of the present study is to evaluate which feedback condition (visual or haptic guidance) optimizes learning of the discrete and continuous elements of a timing task. The experiment consisted of performing a fast tennis forehand stroke in a virtual environment. A tendon-based parallel robot connected to the end of a racket was used to apply haptic guidance during training. In two different experiments, we evaluated which feedback condition was more adequate for learning: (1) a time-dependent discrete task (learning to start a tennis stroke) and (2) a tracking task (learning to follow a velocity profile). The effect of task difficulty and the subject's initial skill level on the selection of the optimal training condition was further evaluated. Results showed that the training condition that maximizes learning of the discrete time-dependent motor task depends on the subjects' initial skill level. Haptic guidance was especially suitable for less-skilled subjects and in especially difficult discrete tasks, while visual feedback seemed to benefit more-skilled subjects. Additionally, haptic guidance seemed to promote learning in a time-critical tracking task, while visual feedback tended to deteriorate performance independently of the task difficulty and subjects' initial skill level. 
Haptic guidance outperformed visual feedback, although additional studies are needed to further analyze the effect of other types of feedback visualization on motor learning of time-critical tasks.

  18. Haptic Cues for Balance: Use of a Cane Provides Immediate Body Stabilization

    PubMed Central

    Sozzi, Stefania; Crisafulli, Oscar; Schieppati, Marco

    2017-01-01

    Haptic cues are important for balance. Knowledge of the temporal features of their effect may be crucial for the design of neural prostheses. Touching a stable surface with a fingertip reduces body sway in standing subjects with eyes closed (EC), and removal of the haptic cue reinstates a large sway pattern. Changes in sway occur rapidly on changing haptic conditions. Here, we describe the effects and time course of the stabilization produced by a haptic cue derived from a walking cane. We intended to confirm that cane use reduces body sway, to evaluate the effect of vision on stabilization by a cane, and to estimate the delay of the changes in body sway after addition and withdrawal of haptic input. Seventeen healthy young subjects stood in tandem position on a force platform, with eyes closed or open (EO). They gently lowered the cane onto and lifted it from a second force platform. Sixty trials per direction of haptic shift (Touch → NoTouch, T-NT; NoTouch → Touch, NT-T) and visual condition (EC-EO) were acquired. Traces of the Center of foot Pressure (CoP) and the force exerted by the cane were filtered, rectified, and averaged. The position in space of a reflective marker on the cane tip was also acquired by an optoelectronic device. Cross-correlation (CC) analysis was performed between traces of cane-tip and CoP displacement. Latencies of changes in CoP oscillation in the frontal plane under EC following the T-NT and NT-T haptic shifts were statistically estimated. The CoP oscillations were larger in EC than EO under both T and NT (p < 0.001) and larger during NT than T conditions (p < 0.001). The haptic-induced effect under EC (Romberg quotient NT/T ~ 1.2) was weaker than that of vision under the NT condition (EC/EO ~ 1.5) (p < 0.001). With EO, the cane had little effect. Cane displacement lagged CoP displacement under both EC and EO. 
Latencies to changes in CoP oscillations were longer after addition (NT-T, about 1.6 s) than withdrawal (T-NT, about 0.9 s) of haptic input (p < 0.001). These latencies were similar to those occurring on fingertip touch, as previously shown. Overall, data speak in favor of substantial equivalence of the haptic information derived from both “direct” fingertip contact and “indirect” contact with the floor mediated by the cane. Cane, finger and visual inputs would be similarly integrated in the same neural centers for balance control. Haptic input from a walking aid and its processing time should be considered when designing prostheses for locomotion. PMID:29311785
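    The cross-correlation step (estimating how much the cane-tip trace lags the CoP trace) can be sketched with NumPy. The sampling rate and signals below are synthetic, not the study's recordings.

```python
import numpy as np

def xcorr_lag(x, y, fs):
    """Lag (seconds) at which y best matches x, taken from the peak of
    their full cross-correlation; positive means y lags x."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    cc = np.correlate(y, x, mode="full")
    lag_samples = np.argmax(cc) - (len(x) - 1)
    return lag_samples / fs
```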

  19. Multiple reference frames in haptic spatial processing

    NASA Astrophysics Data System (ADS)

    Volčič, R.

    2008-08-01

    The present thesis focused on haptic spatial processing. In particular, our interest was directed to the perception of spatial relations with the main focus on the perception of orientation. To this end, we studied haptic perception in different tasks, either in isolation or in combination with vision. The parallelity task, where participants have to match the orientations of two spatially separated bars, was used in its two-dimensional and three-dimensional versions in Chapter 2 and Chapter 3, respectively. The influence of non-informative vision and visual interference on performance in the parallelity task was studied in Chapter 4. A different task, the mental rotation task, was introduced in a purely haptic study in Chapter 5 and in a visuo-haptic cross-modal study in Chapter 6. The interaction of multiple reference frames and their influence on haptic spatial processing were the common denominators of these studies. In this thesis we approached the problems of which reference frames play the major role in haptic spatial processing and how the relative roles of distinct reference frames change depending on the available information and the constraints imposed by different tasks. We found that the influence of a reference frame centered on the hand was the major cause of the deviations from veridicality observed in both the two-dimensional and three-dimensional studies. The results were described by a weighted average model, in which the hand-centered egocentric reference frame is supposed to have a biasing influence on the allocentric reference frame. Performance in haptic spatial processing has been shown to depend also on sources of information or processing that are not strictly connected to the task at hand. When non-informative vision was provided, a beneficial effect was observed in the haptic performance. This improvement was interpreted as a shift from the egocentric to the allocentric reference frame. 
Moreover, interfering visual information presented in the vicinity of the haptic stimuli parametrically modulated the magnitude of the deviations. The influence of the hand-centered reference frame was shown also in the haptic mental rotation task where participants were quicker in judging the parity of objects when these were aligned with respect to the hands than when they were physically aligned. Similarly, in the visuo-haptic cross-modal mental rotation task the parity judgments were influenced by the orientation of the exploring hand with respect to the viewing direction. This effect was shown to be modulated also by an intervening temporal delay that supposedly counteracts the influence of the hand-centered reference frame. We suggest that the hand-centered reference frame is embedded in a hierarchical structure of reference frames where some of these emerge depending on the demands and the circumstances of the surrounding environment and the needs of an active perceiver.
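    The weighted-average model mentioned above can be written in one line: the produced orientation is a compromise between the allocentric target and the orientation favored by the hand-centered frame. The weight below is an illustrative value; the thesis fits it to the data.

```python
def matched_orientation(allocentric_deg, hand_offset_deg, w_ego=0.3):
    """Weighted-average model of haptic orientation matching: the produced
    orientation is pulled away from the allocentric target toward the
    hand-centered (egocentric) frame by weight w_ego. w_ego = 0.3 is an
    illustrative value, not a fitted one."""
    return (1 - w_ego) * allocentric_deg + w_ego * (allocentric_deg + hand_offset_deg)
```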

  20. A Novel Temporal Bone Simulation Model Using 3D Printing Techniques.

    PubMed

    Mowry, Sarah E; Jammal, Hachem; Myer, Charles; Solares, Clementino Arturo; Weinberger, Paul

    2015-09-01

    An inexpensive temporal bone model for use in a temporal bone dissection laboratory setting can be made using a commercially available, consumer-grade 3D printer. Several models for a simulated temporal bone have been described but use commercial-grade printers and materials to produce these models. The goal of this project was to produce a plastic simulated temporal bone on an inexpensive 3D printer that recreates the visual and haptic experience associated with drilling a human temporal bone. Images from a high-resolution CT of a normal temporal bone were converted into stereolithography files via commercially available software, with image conversion and print settings adjusted to achieve optimal print quality. The temporal bone model was printed using acrylonitrile butadiene styrene (ABS) plastic filament on a MakerBot 2x 3D printer. Simulated temporal bones were drilled by seven expert temporal bone surgeons, assessing the fidelity of the model as compared with a human cadaveric temporal bone. Using a four-point scale, the simulated bones were assessed for haptic experience and recreation of the temporal bone anatomy. The created model was felt to be an accurate representation of a human temporal bone. All raters felt strongly this would be a good training model for junior residents or to simulate difficult surgical anatomy. Material cost for each model was $1.92. A realistic, inexpensive, and easily reproducible temporal bone model can be created on a consumer-grade desktop 3D printer.

  1. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    PubMed Central

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed around an Arduino-based sensor architecture to enable a variety of tactile cues at low cost, and is equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then delivers tactile cues (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed to test the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison with the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality. PMID:28513545
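    The cue-dispatch logic described (identify the hand, then send vibration and heat to the thumb and index fingertips on virtual contact) might look like the sketch below. The finger names and command format are assumptions for illustration, not the paper's Arduino protocol.

```python
def fingertip_cues(contacts, temperature_c=None):
    """Map per-finger virtual contact events to actuator commands:
    vibrate on contact, add a heat cue when a temperature is requested.
    'contacts' maps finger name -> bool contact state."""
    cmds = {}
    for finger in ("thumb", "index"):
        touching = bool(contacts.get(finger, False))
        cmds[finger] = {
            "vibrate": touching,
            "heat": touching and temperature_c is not None,
        }
    return cmds
```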

  3. Perceptual Grouping in Haptic Search: The Influence of Proximity, Similarity, and Good Continuation

    ERIC Educational Resources Information Center

    Overvliet, Krista E.; Krampe, Ralf Th.; Wagemans, Johan

    2012-01-01

    We conducted a haptic search experiment to investigate the influence of the Gestalt principles of proximity, similarity, and good continuation. We expected faster search when the distractors could be grouped. We chose edges at different orientations as stimuli because they are processed similarly in the haptic and visual modality. We therefore…

  4. Immediate Memory for Haptically-Examined Braille Symbols by Blind and Sighted Subjects.

    ERIC Educational Resources Information Center

    Newman, Slater E.; And Others

    The paper reports on two experiments in Braille learning which compared blind and sighted subjects on the immediate recall of haptically-examined Braille symbols. In the first study, sighted subjects (N=64) haptically examined each of a set of Braille symbols with their preferred or nonpreferred hand and immediately recalled the symbol by drawing…

  5. Haptic Cues Used for Outdoor Wayfinding by Individuals with Visual Impairments

    ERIC Educational Resources Information Center

    Koutsoklenis, Athanasios; Papadopoulos, Konstantinos

    2014-01-01

    Introduction: The study presented here examines which haptic cues individuals with visual impairments use more frequently and determines which of these cues are deemed by these individuals to be the most important for way-finding in urban environments. It also investigates the ways in which these haptic cues are used by individuals with visual…

  6. Cortical Activation Patterns during Long-Term Memory Retrieval of Visually or Haptically Encoded Objects and Locations

    ERIC Educational Resources Information Center

    Stock, Oliver; Roder, Brigitte; Burke, Michael; Bien, Siegfried; Rosler, Frank

    2009-01-01

    The present study used functional magnetic resonance imaging to delineate cortical networks that are activated when objects or spatial locations encoded either visually (visual encoding group, n = 10) or haptically (haptic encoding group, n = 10) had to be retrieved from long-term memory. Participants learned associations between auditorily…

  7. Physical Student-Robot Interaction with the ETHZ Haptic Paddle

    ERIC Educational Resources Information Center

    Gassert, R.; Metzger, J.; Leuenberger, K.; Popp, W. L.; Tucker, M. R.; Vigaru, B.; Zimmermann, R.; Lambercy, O.

    2013-01-01

    Haptic paddles--low-cost one-degree-of-freedom force feedback devices--have been used with great success at several universities throughout the US to teach the basic concepts of dynamic systems and physical human-robot interaction (pHRI) to students. The ETHZ haptic paddle was developed for a new pHRI course offered in the undergraduate…

  8. Haptic perception and body representation in lateral and medial occipito-temporal cortices.

    PubMed

    Costantini, Marcello; Urgesi, Cosimo; Galati, Gaspare; Romani, Gian Luca; Aglioti, Salvatore M

    2011-04-01

    Although vision is the primary sensory modality that humans and other primates use to identify objects in the environment, we can recognize crucial object features (e.g., shape, size) using the somatic modality. Previous studies have shown that the occipito-temporal areas dedicated to the visual processing of object forms, faces and bodies also show category-selective responses when the preferred stimuli are haptically explored out of view. Visual processing of human bodies engages specific areas in lateral (extrastriate body area, EBA) and medial (fusiform body area, FBA) occipito-temporal cortex. This study aimed at exploring the relative involvement of EBA and FBA in the haptic exploration of body parts. During fMRI scanning, participants were asked to haptically explore either real-size fake body parts or objects. We found a selective activation of right and left EBA, but not of right FBA, while participants haptically explored body parts as compared to real objects. This suggests that EBA may integrate visual body representations with somatosensory information regarding body parts and form a multimodal representation of the body. Furthermore, both left and right EBA showed a comparable level of body selectivity during haptic perception and visual imagery. However, right but not left EBA was more activated during haptic exploration than visual imagery of body parts, ruling out that the response to haptic body exploration was entirely due to the use of visual imagery. Overall, the results point to the existence of different multimodal body representations in the occipito-temporal cortex which are activated during perception and imagery of human body parts. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Perceptual grouping determines haptic contextual modulation.

    PubMed

    Overvliet, K E; Sayim, B

    2016-09-01

    Since the early phenomenological demonstrations of Gestalt principles, one of the major challenges of Gestalt psychology has been to quantify these principles. Here, we show that contextual modulation, i.e. the influence of context on target perception, can be used as a tool to quantify perceptual grouping in the haptic domain, similar to the visual domain. We investigated the influence of target-flanker grouping on performance in haptic vernier offset discrimination. We hypothesized that when, despite the apparent differences between vision and haptics, similar grouping principles are operational, a similar pattern of flanker interference would be observed in the haptic as in the visual domain. Participants discriminated the offset of a haptic vernier. The vernier was flanked by different flanker configurations: no flankers, single flanking lines, 10 flanking lines, rectangles and single perpendicular lines, varying the degree to which the vernier grouped with the flankers. Additionally, we used two different flanker widths (same width as and narrower than the target), again to vary target-flanker grouping. Our results show a clear effect of flankers: performance was much better when the vernier was presented alone compared to when it was presented with flankers. In the majority of flanker configurations, grouping between the target and the flankers determined the strength of interference, similar to the visual domain. However, in the same width rectangular flanker condition we found aberrant results. We discuss the results of our study in light of similarities and differences between vision and haptics and the interaction between different grouping principles. We conclude that in haptics, similar organization principles apply as in visual perception and argue that grouping and Gestalt are key organization principles not only of vision, but of the perceptual system in general. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Haptic biofeedback for improving compliance with lower-extremity partial weight bearing.

    PubMed

    Fu, Michael C; DeLuke, Levi; Buerba, Rafael A; Fan, Richard E; Zheng, Ying Jean; Leslie, Michael P; Baumgaertner, Michael R; Grauer, Jonathan N

    2014-11-01

    After lower-extremity orthopedic trauma and surgery, patients are often advised to restrict weight bearing on the affected limb. Conventional training methods are not effective at enabling patients to comply with recommendations for partial weight bearing. The current study assessed a novel method of using real-time haptic (vibratory/vibrotactile) biofeedback to improve compliance with instructions for partial weight bearing. Thirty healthy, asymptomatic participants were randomized into 1 of 3 groups: verbal instruction, bathroom scale training, and haptic biofeedback. Participants were instructed to restrict lower-extremity weight bearing in a walking boot with crutches to 25 lb, with an acceptable range of 15 to 35 lb. A custom weight bearing sensor and biofeedback system was attached to all participants, but only those in the haptic biofeedback group were given a vibrotactile signal if they exceeded the acceptable range. Weight bearing in all groups was measured with a separate validated commercial system. The verbal instruction group bore an average of 60.3±30.5 lb (mean±standard deviation). The bathroom scale group averaged 43.8±17.2 lb, whereas the haptic biofeedback group averaged 22.4±9.1 lb (P<.05). As a percentage of body weight, the verbal instruction group averaged 40.2±19.3%, the bathroom scale group averaged 32.5±16.9%, and the haptic biofeedback group averaged 14.5±6.3% (P<.05). In this initial evaluation of the use of haptic biofeedback to improve compliance with lower-extremity partial weight bearing, haptic biofeedback was superior to conventional physical therapy methods. Further studies in patients with clinical orthopedic trauma are warranted. Copyright 2014, SLACK Incorporated.
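    The biofeedback rule used in the study reduces to a window check on the measured limb load: cue the patient whenever the load leaves the acceptable 15 to 35 lb range around the 25 lb target. A minimal sketch of that logic:

```python
def biofeedback(load_lb, target_lb=25.0, tolerance_lb=10.0):
    """Vibrotactile cue logic for partial weight bearing: return True
    (vibrate) whenever the measured limb load falls outside the
    target +/- tolerance window (15-35 lb in the study)."""
    return abs(load_lb - target_lb) > tolerance_lb
```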

  11. Enhanced operator perception through 3D vision and haptic feedback

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Light, Kenneth; Bodenhamer, Andrew; Bosscher, Paul; Wilkinson, Loren

    2012-06-01

    Polaris Sensor Technologies (PST) has developed a stereo vision upgrade kit for TALON® robot systems comprising a replacement gripper camera and a replacement mast zoom camera on the robot, and a replacement display in the Operator Control Unit (OCU). Harris Corporation has developed a haptic manipulation upgrade for TALON® robot systems comprising a replacement arm and gripper and an OCU that provides haptic (force) feedback. PST and Harris have recently collaborated to integrate the 3D vision system with the haptic manipulation system. In multiple studies conducted at Fort Leonard Wood, Missouri, it has been shown that 3D vision and haptics provide more intuitive perception of complicated scenery and improved robot arm control, allowing for improved mission performance and the potential for reduced time on target. This paper discusses the potential benefits of these enhancements to robotic systems used for the domestic homeland security mission.
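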

  12. [Haptic tracking control for minimally invasive robotic surgery].

    PubMed

    Xu, Zhaohong; Song, Chengli; Wu, Wenwu

    2012-06-01

    Haptic feedback plays a significant role in minimally invasive robotic surgery (MIRS). A major deficiency of current MIRS, including the commercially available da Vinci surgical system, is the lack of haptic perception for the surgeon. In this paper, a dynamics model of a haptic robot is established based on the Newton-Euler method. Because solving the exact dynamics takes considerable time, a digital PID algorithm informed by the robot dynamics was used to ensure real-time bilateral control and to improve tracking precision and control efficiency. To validate the proposed method, an experimental system was developed in which two Novint Falcon haptic devices act as a master-slave pair. Simulations and experiments showed that the proposed methods can provide instrument force feedback to the operator and that the bilateral control strategy is an effective approach to master-slave MIRS. The proposed methods could also be applied to tele-robotic systems.
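
    A minimal sketch of a discrete PID loop for master-slave position tracking of the kind this abstract describes. The gains, the 1 kHz rate, and the pure-integrator slave model are hypothetical choices for illustration, not values from the paper:

```python
# Illustrative sketch only, not the paper's controller: a discrete (digital)
# PID loop driving a slave device toward the master position.
class DigitalPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

DT = 0.001                        # 1 kHz haptic control loop
pid = DigitalPID(kp=4.0, ki=0.5, kd=0.1, dt=DT)
master_pos, slave_pos = 1.0, 0.0
for _ in range(5000):             # 5 s: slave converges toward the master
    slave_pos += pid.update(master_pos, slave_pos) * DT  # integrator plant
```

    A real bilateral controller would also feed the slave's contact forces back to the master; this sketch covers only the position-tracking half.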

  13. Human-computer interface including haptically controlled interactions

    DOEpatents

    Anderson, Thomas G.

    2005-10-11

    The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
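
    The force-to-scroll-rate mapping the patent describes can be sketched in a few lines; the threshold and gain below are hypothetical values, not numbers from the patent:

```python
# Hypothetical sketch: once the cursor presses past a haptic boundary with
# enough force, the scroll rate grows with the magnitude of the applied force.
def scroll_rate(force_n, threshold_n=0.5, gain=120.0):
    """Lines per second; zero until the user pushes past the boundary force."""
    excess = force_n - threshold_n
    return gain * excess if excess > 0 else 0.0
```

    Below the threshold the user merely feels the boundary; above it, harder pressing scrolls faster, which is what makes the control usable without visual attention.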

  14. Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation

    PubMed Central

    Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee

    2018-01-01

    This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self- and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the soundness of the proposed model, while the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
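
    The data-driven pipeline (fit a radial basis function network to recorded contact data, then evaluate it at render time) can be illustrated with a toy 1-D version. The paper's input is six-dimensional; this sketch, including the Gaussian kernel choice and all sample numbers, is purely illustrative:

```python
import math

# Toy RBF-network fit: interpolate recorded (displacement -> force) samples,
# solving the interpolation system with naive Gaussian elimination.
def gaussian(r, eps=1.0):
    return math.exp(-(eps * r) ** 2)

def fit_rbf(xs, ys, eps=1.0):
    """Solve Phi @ w = y for the RBF weights w."""
    n = len(xs)
    a = [[gaussian(abs(xs[i] - xs[j]), eps) for j in range(n)] + [ys[i]]
         for i in range(n)]
    for col in range(n):                       # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r_: abs(a[r_][col]))
        a[col], a[piv] = a[piv], a[col]
        for row in range(col + 1, n):
            f = a[row][col] / a[col][col]
            for k in range(col, n + 1):
                a[row][k] -= f * a[col][k]
    w = [0.0] * n
    for row in reversed(range(n)):             # back substitution
        w[row] = (a[row][n] - sum(a[row][k] * w[k]
                                  for k in range(row + 1, n))) / a[row][row]
    return w

def rbf_force(x, xs, w, eps=1.0):
    """Rendered force at displacement x."""
    return sum(wi * gaussian(abs(x - xi), eps) for wi, xi in zip(w, xs))

# Hypothetical training data: tool displacement (mm) vs. measured force (N).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.8, 2.1, 4.5]
w = fit_rbf(xs, ys)
```

    An exact interpolant reproduces the training samples; the paper's network is instead optimized over many noisy recordings, but the evaluation step at render time has this same weighted-kernel-sum form.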

  15. All Things Considered: The Bridge of Trois-Sautets

    ERIC Educational Resources Information Center

    School Arts: The Art Education Magazine for Teachers, 2004

    2004-01-01

    The painting by Paul Cezanne, The Bridge of Trois-Sautets, was painted near the time of his death. In this rich watercolor, the artist dispensed with notions of traditional painting goals. Rather than solely focus on a realistic rendering of this scene, Cezanne turned instead to the task of recording his sensorial experience and exploring the…

  16. Haptic Guidance Improves the Visuo-Manual Tracking of Trajectories

    PubMed Central

    Bluteau, Jérémy; Coquillart, Sabine; Payan, Yohan; Gentaz, Edouard

    2008-01-01

    Background Learning to perform new movements is usually achieved by following visual demonstrations. Haptic guidance by a force feedback device is a recent and original technology which provides additional proprioceptive cues during visuo-motor learning tasks. The effects of two types of haptic guidance, control in position (HGP) or in force (HGF), on visuo-manual tracking (“following”) of trajectories are still under debate. Methodology/Principal Findings Three training techniques of haptic guidance (HGP, HGF, or a control condition, NHG, without haptic guidance) were evaluated in two experiments. Movements produced by adults were assessed in terms of shape (dynamic time warping) and kinematic criteria (number of velocity peaks and mean velocity) before and after the training sessions. Trajectories consisted of two Arabic and two Japanese-inspired letters in Experiment 1 and ellipses in Experiment 2. We observed that the use of HGF globally improves the fluency of the visuo-manual tracking of trajectories, while no significant improvement was found for HGP or NHG. Conclusion/Significance These results show that the addition of haptic information, probably encoded in force coordinates, plays a crucial role in the visuo-manual tracking of new trajectories. PMID:18335049
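
    The shape criterion mentioned above, dynamic time warping (DTW), is a standard algorithm; the textbook form for two 1-D trajectories is sketched below (a generic implementation, not the authors' exact one):

```python
# Textbook DTW distance between two 1-D trajectories: small values mean the
# shapes match well after elastic alignment in time.
def dtw(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

    Because DTW aligns samples elastically, a produced trajectory that traces the right shape at a different speed still scores close to zero against the target.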

  17. The effect of perceptual grouping on haptic numerosity perception.

    PubMed

    Verlaers, K; Wagemans, J; Overvliet, K E

    2015-01-01

    We used a haptic enumeration task to investigate whether enumeration can be facilitated by perceptual grouping in the haptic modality. Eight participants were asked to count tangible dots as quickly and accurately as possible, while moving their finger pad over a tactile display. In Experiment 1, we manipulated the number and organization of the dots, while keeping the total exploration area constant. The dots were either evenly distributed on a horizontal line (baseline condition) or organized into groups based on either proximity (dots placed in closer proximity to each other) or configural cues (dots placed in a geometric configuration). In Experiment 2, we varied the distance between the subsets of dots. We hypothesized that when subsets of dots can be grouped together, the enumeration time will be shorter and accuracy will be higher than in the baseline condition. The results of both experiments showed faster enumeration for the configural condition than for the baseline condition, indicating that configural grouping also facilitates haptic enumeration. In Experiment 2, faster enumeration was also observed for the proximity condition than for the baseline condition. Thus, perceptual grouping speeds up haptic enumeration by both configural and proximity cues, suggesting that similar mechanisms underlie perceptual grouping in both visual and haptic enumeration.

  18. Acquisition and Visualization Techniques of Human Motion Using Master-Slave System and Haptograph

    NASA Astrophysics Data System (ADS)

    Katsura, Seiichiro; Ohishi, Kiyoshi

    Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means. Furthermore, a video camera and a television make it possible to transmit visual sensation by broadcasting. On the contrary, since tactile or haptic information is subject to Newton's “law of action and reaction” in the real world, a device which acquires, transmits, and reproduces this information has not been established. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel acquisition method for haptic information named the “haptograph”. The haptograph visualizes haptic information in the way a photograph visualizes a scene. Since temporal and spatial analyses are conducted to represent haptic information as a haptograph, it can be recognized and evaluated intuitively. In this paper, the proposed haptograph is applied to the visualization of human motion. It can represent motion characteristics such as an expert's skill or a personal habit; in other words, a personal encyclopedia is obtained. Once such a personal encyclopedia is stored in a ubiquitous environment, future human-support technology can build on it.
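
    One loose way to picture the haptograph idea, arranging a sampled force stream on a temporal/spatial grid so it can be inspected like an image, is sketched below; the folding scheme is an invention for illustration, not the paper's method:

```python
# Hypothetical sketch: fold a 1-D force stream into rows (one row per sweep of
# a repeated motion) so the values form a 2-D, image-like "haptograph".
def haptograph(samples, width):
    """Return a list of rows, each `width` samples long."""
    return [samples[i:i + width] for i in range(0, len(samples), width)]

# Two sweeps of three force samples each (N), invented values.
grid = haptograph([0.1, 0.4, 0.2, 0.1, 0.5, 0.2], 3)
```

    Row-to-row differences in such a grid are what would expose the repeatable structure of a motion, such as a habit or a skill signature.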

  19. Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display

    PubMed Central

    Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok

    2008-01-01

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module, together with empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance, indicating that it can satisfactorily represent Braille numbers for both sighted and blind users. To demonstrate the haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button has been conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of the pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides a combination of force, tactile, and impact feedback, three haptic representation methods for texture display have been compared on surfaces with three texture groups that differ in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor. PMID:18317520

  20. Haptic stylus and empirical studies on braille, button, and texture display.

    PubMed

    Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok

    2008-01-01

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module, together with empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance, indicating that it can satisfactorily represent Braille numbers for both sighted and blind users. To demonstrate the haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button has been conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of the pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides a combination of force, tactile, and impact feedback, three haptic representation methods for texture display have been compared on surfaces with three texture groups that differ in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor.

  1. Haptics-based dynamic implicit solid modeling.

    PubMed

    Hua, Jing; Qin, Hong

    2004-01-01

    This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.

  2. Lightwave: An interactive estimation of indirect illumination using waves of light

    NASA Astrophysics Data System (ADS)

    Robertson, Michael

    With the growth of computers and technology, so too has grown the desire to accurately recreate our world using computer graphics. However, our world is complex and in many ways beyond our comprehension. To perform this task, we must therefore draw on multiple disciplines and areas of research, including physics, mathematics, optics, geology, and many more, to at the very least approximate the world around us. The applications are plentiful as well, including the use of graphics in entertainment such as movies and games, in science such as weather forecasts and simulations, in medicine with body scans, and in architecture, design, and many other areas. In order to recreate the world around us, an important task is to accurately recreate the way light travels and affects the objects we see. Rendering lighting has been a heavily researched area since the 1970s and has grown more sophisticated over the years. Until recently, realistic lighting of scenes was achievable only offline, taking seconds to hours or more to create a single image; due to advances in graphics technology, however, realistic lighting can now be done in real time. An important aspect of realistic lighting involves the inclusion of indirect illumination. To achieve real-time rendering with indirect illumination, we must make trade-offs between scientific accuracy and performance, but as will be discussed later, scientific accuracy may not be necessary after all.

  3. Sharing control with haptics: seamless driver support from manual to automatic control.

    PubMed

    Mulder, Mark; Abbink, David A; Boer, Erwin R

    2012-10-01

    Haptic shared control was investigated as a human-machine interface that can intuitively share control between drivers and an automatic controller for curve negotiation. As long as automation systems are not fully reliable, a role remains for the driver to be vigilant to the system and the environment to catch any automation errors. The conventional binary switch between supervisory and manual control has many known issues, and haptic shared control is a promising alternative. A total of 42 respondents of varying age and driving experience participated in a driving experiment in a fixed-base simulator, in which curve negotiation behavior during shared control was compared to manual control, as well as to three haptic tunings of an automatic controller without driver intervention. Under the experimental conditions studied, the main beneficial effect of haptic shared control compared to manual control was that less control activity (16% in steering wheel reversal rate, 15% in standard deviation of steering wheel angle) was needed for realizing an improved safety performance (e.g., 11% in peak lateral error). Full automation removed the need for any human control activity and improved safety performance (e.g., 35% in peak lateral error) but put the human in a supervisory position. Haptic shared control kept the driver in the loop, with enhanced performance at reduced control activity, mitigating the known issues that plague full automation. Haptic support for vehicular control ultimately seeks to intuitively combine human intelligence and creativity with the benefits of automation systems.
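
    A common way to formalize the spectrum from manual to automatic control is a weighted blend of driver and automation steering torques. The sketch below is a generic illustration of that idea, not the controller or tunings used in the study:

```python
# Generic illustration: blend driver and automation steering torques.
# support_level = 0.0 is fully manual; 1.0 defers entirely to the automation.
def shared_steering_torque(driver_nm, automation_nm, support_level):
    assert 0.0 <= support_level <= 1.0
    return (1.0 - support_level) * driver_nm + support_level * automation_nm
```

    Intermediate support levels correspond to the haptic tunings the study compares: the driver always feels and can override the automation's torque, staying in the loop.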

  4. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind

    PubMed Central

    Kitada, Ryo; Okamoto, Yuko; Sasaki, Akihiro T.; Kochiyama, Takanori; Miyahara, Motohide; Lederman, Susan J.; Sadato, Norihiro

    2012-01-01

    Face perception is critical for social communication. Given its fundamental importance in the course of evolution, the innate neural mechanisms can anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in the sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic-resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience. PMID:23372547

  5. Design of a 7-DOF slave robot integrated with a magneto-rheological haptic master

    NASA Astrophysics Data System (ADS)

    Hwang, Yong-Hoon; Cha, Seung-Woo; Kang, Seok-Rae; Choi, Seung-Bok

    2017-04-01

    In this study, a 7-DOF slave robot integrated with a haptic master is designed and its dynamic motion is controlled. The haptic master is built around a controllable magneto-rheological (MR) clutch and brake and provides the surgeon with a sense of touch through both kinetic and kinesthetic information. Due to the size constraints of the slave robot, wire actuation is adopted instead of a conventional direct-drive motor to produce the desired motion of the 3-DOF end-effector; the link parts, which have 4-DOF, use direct-drive motors. For the overall system to work as a haptic device, the haptic master needs to receive information about the repulsive forces applied to the slave robot. Therefore, repulsive forces on the end-effector are sensed by three uniaxial torque transducers inserted in the wire actuation system, and repulsive forces applied to the link parts are sensed by a 6-axis transducer that measures forces and torques. A further 6-axis transducer verifies the reliability of the force information at the distal end of the slave robot. Lastly, with the MR haptic master integrated, a psychophysical test is conducted in which different operators feel the repulsive force or torque generated by the haptic master, equivalent to the force or torque occurring at the end-effector, to demonstrate the effectiveness of the proposed system.

  6. Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor

    NASA Astrophysics Data System (ADS)

    Ponce Wong, Ruben D.; Hellman, Randall B.; Santos, Veronica J.

    2014-06-01

    Haptic perception remains a grand challenge for artificial hands. Dexterous manipulators could be enhanced by "haptic intelligence" that enables identification of objects and their features via touch alone. Haptic perception of local shape would be useful when vision is obstructed or when proprioceptive feedback is inadequate, as observed in this study. In this work, a robot hand outfitted with a deformable, bladder-type, multimodal tactile sensor was used to replay four human-inspired haptic "exploratory procedures" on fingertip-sized geometric features. The geometric features varied by type (bump, pit), curvature (planar, conical, spherical), and footprint dimension (1.25 - 20 mm). Tactile signals generated by active fingertip motions were used to extract key parameters for use as inputs to supervised learning models. A support vector classifier estimated order of curvature while support vector regression models estimated footprint dimension once curvature had been estimated. A distal-proximal stroke (along the long axis of the finger) enabled estimation of order of curvature with an accuracy of 97%. Best-performing, curvature-specific, support vector regression models yielded R2 values of at least 0.95. While a radial-ulnar stroke (along the short axis of the finger) was most helpful for estimating feature type and size for planar features, a rolling motion was most helpful for conical and spherical features. The ability to haptically perceive local shape could be used to advance robot autonomy and provide haptic feedback to human teleoperators of devices ranging from bomb defusal robots to neuroprostheses.
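
    The two-stage estimate (classify curvature first, then apply a curvature-specific size regressor) can be sketched with stand-in models. The paper uses a support vector classifier and support vector regression; the nearest-centroid classifier, per-class linear fits, and all feature values below are substitutes chosen to keep the example self-contained:

```python
# Stand-in two-stage pipeline: stage 1 classifies curvature from tactile
# features; stage 2 dispatches to a curvature-specific size model.
def nearest_centroid(features, centroids):
    return min(centroids, key=lambda label: sum(
        (f - c) ** 2 for f, c in zip(features, centroids[label])))

# Hypothetical tactile features per curvature class (e.g. peak pressure, slope).
centroids = {"planar": (0.2, 0.1), "conical": (0.6, 0.4),
             "spherical": (0.9, 0.8)}
# Curvature-specific linear size models: size_mm = a * sum(features) + b.
size_models = {"planar": (10.0, 1.0), "conical": (8.0, 0.5),
               "spherical": (6.0, 0.2)}

def estimate_feature(features):
    curvature = nearest_centroid(features, centroids)   # stage 1
    a, b = size_models[curvature]
    size_mm = a * sum(features) + b                     # stage 2
    return curvature, size_mm
```

    The key design point carried over from the paper is the dispatch: the size regressor is only trusted once the curvature class is known, which is why its accuracy was reported per curvature.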

  7. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind.

    PubMed

    Kitada, Ryo; Okamoto, Yuko; Sasaki, Akihiro T; Kochiyama, Takanori; Miyahara, Motohide; Lederman, Susan J; Sadato, Norihiro

    2013-01-01

    Face perception is critical for social communication. Given its fundamental importance in the course of evolution, the innate neural mechanisms can anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in the sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic-resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience.

  8. Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.

    PubMed

    Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka

    2018-04-01

    This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea associated with the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that the participants had a feeling of presence and felt a steep stair gradient under the condition of haptic feedback and viewpoint manipulation based on the characteristics of actual stair walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We then redesigned the shape of the small bumps, and evaluated the design in a second experiment. The results indicated that the best shape to present haptic stimuli is a right triangle cross section in both the ascending and descending cases. 
Although it is necessary to install the small protrusions in a specific direction, using this optimized shape enhanced the users' feeling of presence of the stairs and the sensation of walking up and down.
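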

  9. Plenoptic layer-based modeling for image based rendering.

    PubMed

    Pearson, James; Brookes, Mike; Dragotti, Pier Luigi

    2013-09-01

    Image based rendering is an attractive alternative to model based rendering for generating novel views because of its lower complexity and potential for photo-realistic results. To reduce the number of images necessary for alias-free rendering, some geometric information for the 3D scene is normally necessary. In this paper, we present a fast automatic layer-based method for synthesizing an arbitrary new view of a scene from a set of existing views. Our algorithm takes advantage of the knowledge of the typical structure of multiview data to perform occlusion-aware layer extraction. In addition, the number of depth layers used to approximate the geometry of the scene is chosen based on plenoptic sampling theory with the layers placed non-uniformly to account for the scene distribution. The rendering is achieved using a probabilistic interpolation approach and by extracting the depth layer information on a small number of key images. Numerical results demonstrate that the algorithm is fast and yet is only 0.25 dB away from the ideal performance achieved with the ground-truth knowledge of the 3D geometry of the scene of interest. This indicates that there are measurable benefits from following the predictions of plenoptic theory and that they remain true when translated into a practical system for real world data.
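
    Non-uniform placement of depth layers typically means spacing them uniformly in inverse depth (disparity), which plenoptic sampling arguments favor. The sketch below illustrates that spacing under stated assumptions; the layer count and depth range are invented, not taken from the paper:

```python
# Illustrative sketch: place depth layers uniformly in inverse depth, so layers
# are dense near the camera (where parallax changes fast) and sparse far away.
def depth_layers(z_min, z_max, n_layers):
    """Return n_layers depths spaced uniformly in 1/z between z_min and z_max."""
    inv = [1.0 / z_min + (1.0 / z_max - 1.0 / z_min) * i / (n_layers - 1)
           for i in range(n_layers)]
    return [1.0 / v for v in inv]

layers = depth_layers(1.0, 10.0, 4)   # hypothetical scene depth range (m)
```

    Uniform-in-disparity spacing keeps the per-layer rendering error roughly balanced, which is the intuition behind choosing the layer count from plenoptic sampling theory.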

  10. A survey of telerobotic surface finishing

    NASA Astrophysics Data System (ADS)

    Höglund, Thomas; Alander, Jarmo; Mantere, Timo

    2018-05-01

    This is a survey of research published on the subjects of telerobotics, haptic feedback, and mixed reality applied to surface finishing. The survey especially focuses on how visuo-haptic feedback can be used to improve a grinding process using a remote manipulator or robot. The benefits of teleoperation and reasons for using haptic feedback are presented. The use of genetic algorithms for optimizing haptic sensing is briefly discussed. Ways of augmenting the operator's vision are described. Visual feedback can be used to find defects and analyze the quality of the surface resulting from the surface finishing process. Visual cues can also be used to aid a human operator in manipulating a robot precisely and avoiding collisions.

  11. Differences between early-blind, late-blind, and blindfolded-sighted people in haptic spatial-configuration learning and resulting memory traces.

    PubMed

    Postma, Albert; Zuidhoek, Sander; Noordzij, Matthijs L; Kappers, Astrid M L

    2007-01-01

    The roles of visual and haptic experience in different aspects of haptic processing of objects in peripersonal space are examined. In three trials, early-blind, late-blind, and blindfolded-sighted individuals had to match ten shapes haptically to the cut-outs in a board as fast as possible. Both blind groups were much faster than the sighted in all three trials. All three groups improved considerably from trial to trial. In particular, the sighted group showed a strong improvement from the first to the second trial. While superiority of the blind remained for speeded matching after rotation of the stimulus frame, coordinate positional-memory scores in a non-speeded free-recall trial showed no significant differences between the groups. Moreover, when assessed with a verbal response, categorical spatial-memory appeared strongest in the late-blind group. The role of haptic and visual experience thus appears to depend on the task aspect tested.

  12. Forces on intraocular lens haptics induced by capsular fibrosis. An experimental study.

    PubMed

    Guthoff, R; Abramo, F; Draeger, J; Chumbley, L C; Lang, G K; Neumann, W

    1990-01-01

    Electronic dynamometry measurements, performed upon intraocular lens (IOL) haptics of prototype one-piece three-loop silicone lenses, accurately defined the relationships between elastic force and haptic displacement. Lens implantations in the capsular bag of dogs (loop span equal to capsular bag diameter, loops undeformed immediately after the operation) were evaluated macrophotographically 5-8 months postoperatively. The highly constant elastic property of silicone rubber permitted quantitative correlation of subsequent in vivo haptic displacement with the resultant force vectors responsible for tissue contraction. The lens optics were well centered in 17 (85%) and slightly off-center in 3 (15%) of 20 implanted eyes. Of the 60 supporting loops, 28 could be visualized sufficiently well to permit reliable haptic measurement. Of these 28, 20 (71%) were clearly displaced, ranging from 0.45 mm away from to 1.4 mm towards the lens' optic center. These extremes represented resultant vector forces of 0.20 and 1.23 mN respectively. Quantitative vector analysis permits better understanding of IOL-capsular interactions.
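
    The quantitative vector analysis described above, summing the elastic force vectors of the loops to get the net force on the optic, can be sketched as follows. The 120° geometry matches a three-loop lens, but the magnitudes used in the example are hypothetical, not the paper's measurements:

```python
import math

# Illustrative vector sum: three loop forces directed radially inward at 120
# degree spacing; an imbalance between them decenters the optic.
def resultant_force(magnitudes_mn):
    angles = [math.radians(a) for a in (90, 210, 330)]  # three-loop IOL
    fx = sum(m * math.cos(a) for m, a in zip(magnitudes_mn, angles))
    fy = sum(m * math.sin(a) for m, a in zip(magnitudes_mn, angles))
    return math.hypot(fx, fy)
```

    Equal loop forces cancel (a well-centered optic); any asymmetry leaves a net force that pushes the optic off-center.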

  13. A haptic-inspired audio approach for structural health monitoring decision-making

    NASA Astrophysics Data System (ADS)

    Mao, Zhu; Todd, Michael; Mascareñas, David

    2015-03-01

    Haptics is the field at the interface of human touch (tactile sensation) and classification, whereby tactile feedback is used to train and inform a decision-making process. In structural health monitoring (SHM) applications, haptic devices have been introduced and applied in a simplified laboratory-scale scenario, in which nonlinearity, representing the presence of damage, was encoded into a vibratory manual interface. In this paper, the "spirit" of haptics is adopted, but here ultrasonic guided wave scattering information is transformed into audio (rather than tactile) range signals. After sufficient training, the structural damage condition, including occurrence and location, can be identified through the encoded audio waveforms. Different algorithms are employed to generate the transformed audio signals, and the performance of each encoding algorithm is compared, both against the others and against standard machine learning classifiers. In the long run, this haptic-inspired decision-making approach aims to detect and classify structural damage in more rigorous environments, moving toward a baseline-free implementation with embedded temperature compensation.

  14. Learning to perceive haptic distance-to-break in the presence of friction.

    PubMed

    Altenhoff, Bliss M; Pagano, Christopher C; Kil, Irfan; Burg, Timothy C

    2017-02-01

    Two experiments employed attunement and calibration training to investigate whether observers are able to identify material break points in compliant materials through haptic force application. The task required participants to attune to a recently identified haptic invariant, distance-to-break (DTB), rather than haptic stimulation not related to the invariant, including friction. In the first experiment participants probed simulated force-displacement relationships (materials) under 3 levels of friction with the aim of pushing as far as possible into the materials without breaking them. In a second experiment a different set of participants pulled on the materials. Results revealed that participants are sensitive to DTB for both pushing and pulling, even in the presence of varying levels of friction, and this sensitivity can be improved through training. The results suggest that the simultaneous presence of friction may assist participants in perceiving DTB. Potential applications include the development of haptic training programs for minimally invasive (laparoscopic) surgery to reduce accidental tissue damage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Mnemonic neuronal activity in somatosensory cortex.

    PubMed Central

    Zhou, Y D; Fuster, J M

    1996-01-01

    Single-unit activity was recorded from the hand areas of the somatosensory cortex of monkeys trained to perform a haptic delayed matching to sample task with objects of identical dimensions but different surface features. During the memory retention period of the task (delay), many units showed sustained firing frequency change, either excitation or inhibition. In some cases, firing during that period was significantly higher after one sample object than after another. These observations indicate the participation of somatosensory neurons not only in the perception but in the short-term memory of tactile stimuli. Neurons most directly implicated in tactile memory are (i) those with object-selective delay activity, (ii) those with nondifferential delay activity but without activity related to preparation for movement, and (iii) those with delay activity in the haptic-haptic delayed matching task but no such activity in a control visuo-haptic delayed matching task. The results indicate that cells in early stages of cortical somatosensory processing participate in haptic short-term memory. PMID:8927629

  16. Ergonomic evaluation of 3D plane positioning using a mouse and a haptic device.

    PubMed

    Paul, Laurent; Cartiaux, Olivier; Docquier, Pierre-Louis; Banse, Xavier

    2009-12-01

    Preoperative planning and intraoperative assistance are needed to improve accuracy in tumour surgery. To be accepted, these processes must be efficient. An experiment was conducted to compare a mouse and a haptic device, with and without force feedback, for performing plane positioning in a 3D space. Ergonomics and performance factors were investigated during the experiment. Positioning strategies were observed. The task completion time, number of 3D orientations and failure rate were analysed. A questionnaire on ergonomics was filled out by each participant. The haptic device showed a significantly lower failure rate and was quicker and more ergonomic than the mouse. The force feedback was not beneficial to the accomplishment of the task. The haptic device is intuitive, ergonomic and more efficient than the mouse for positioning a 3D plane in a 3D space. Useful observations regarding positioning strategies will improve the integration of haptic devices into medical applications. Copyright (c) 2009 John Wiley & Sons, Ltd.

  17. Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle.

    PubMed

    Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L; Cutkosky, Mark R

    2014-09-01

    This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024).

  19. The role of haptic versus visual volume cues in the size-weight illusion.

    PubMed

    Ellis, R R; Lederman, S J

    1993-03-01

    Three experiments establish the size-weight illusion as a primarily haptic phenomenon, despite its having been more traditionally considered an example of vision influencing haptic processing. Experiment 1 documents, across a broad range of stimulus weights and volumes, the existence of a purely haptic size-weight illusion, equal in strength to the traditional illusion. Experiment 2 demonstrates that haptic volume cues are both sufficient and necessary for a full-strength illusion. In contrast, visual volume cues are merely sufficient, and produce a relatively weaker effect. Experiment 3 establishes that congenitally blind subjects experience an effect as powerful as that of blindfolded sighted observers, thus demonstrating that visual imagery is also unnecessary for a robust size-weight illusion. The results are discussed in terms of their implications for both sensory and cognitive theories of the size-weight illusion. Applications of this work to human factors design and to sensor-based systems for robotic manipulation are also briefly considered.

  20. Visual and visually mediated haptic illusions with Titchener's ⊥.

    PubMed

    Landwehr, Klaus

    2014-05-01

    For a replication and expansion of a previous experiment of mine, 14 newly recruited participants provided haptic and verbal estimates of the lengths of the two lines that make up Titchener's ⊥. The stimulus was presented at two different orientations (frontoparallel vs. horizontal) and rotated in steps of 45 deg around 2π. Haptically, the divided line of the ⊥ was generally underestimated, especially at a horizontal orientation. Verbal judgments also differed according to presentation condition and to which line was the target, with the overestimation of the undivided line ranging between 6.2 % and 15.3 %. The results are discussed with reference to the two-visual-systems theory of perception and action, neuroscientific accounts, and also recent historical developments (the use of handheld touchscreens, in particular), because the previously reported "haptic induction effect" (the scaling of haptic responses to the divided line of the ⊥, depending on the length of the undivided one) did not replicate.

  1. Potential Cost Savings of Contrast-Enhanced Digital Mammography.

    PubMed

    Patel, Bhavika K; Gray, Richard J; Pockaj, Barbara A

    2017-06-01

    The purpose of this article is to discuss whether the sensitivity and specificity of contrast-enhanced digital mammography (CEDM) render it a viable diagnostic alternative to breast MRI. Because CEDM couples low-energy images (comparable in diagnostic quality to standard mammography) with subtracted contrast-enhanced mammograms, it is a cost-effective modality and a realistic substitute for the more costly breast MRI.

  2. Black holes with surrounding matter in scalar-tensor theories.

    PubMed

    Cardoso, Vitor; Carucci, Isabella P; Pani, Paolo; Sotiriou, Thomas P

    2013-09-13

    We uncover two mechanisms that can render Kerr black holes unstable in scalar-tensor gravity, both associated with the presence of matter in the vicinity of the black hole and the fact that this introduces an effective mass for the scalar. Our results highlight the importance of understanding the structure of spacetime in realistic, astrophysical black holes in scalar-tensor theories.

  3. Refining Windows and Frames: Visions toward Integration in the Discipline(s) of Communication. Part II.

    ERIC Educational Resources Information Center

    Burke, Ken

    1998-01-01

    Detailed analyses are made of the concepts of window (seemingly deep spatial renderings) and frame (flatter, more technique-conscious structures) as they apply to a wide variety of visual media and communicative purposes. Special cases of each of these are detailed, along with their applications in cinema history to a range of realist, formalist,…

  4. Command Recognition of Robot with Low Dimension Whole-Body Haptic Sensor

    NASA Astrophysics Data System (ADS)

    Ito, Tatsuya; Tsuji, Toshiaki

    The authors have developed "haptic armor", a whole-body haptic sensor that can estimate contact position. Although it was developed for the safety assurance of robots in human environments, it can also be used as an interface. This paper proposes a command recognition method based on finger trace information. This paper also discusses some technical issues for improving the recognition accuracy of this system.

  5. Persistent Neuronal Firing in Primary Somatosensory Cortex in the Absence of Working Memory of Trial-Specific Features of the Sample Stimuli in a Haptic Working Memory Task

    ERIC Educational Resources Information Center

    Wang, Liping; Li, Xianchun; Hsiao, Steven S.; Bodner, Mark; Lenz, Fred; Zhou, Yong-Di

    2012-01-01

    Previous studies suggested that primary somatosensory (SI) neurons in well-trained monkeys participated in the haptic-haptic unimodal delayed matching-to-sample (DMS) task. In this study, 585 SI neurons were recorded in monkeys performing a task that was identical to that in the previous studies but without requiring discrimination and active…

  6. Exploring laterality and memory effects in the haptic discrimination of verbal and non-verbal shapes.

    PubMed

    Stoycheva, Polina; Tiippana, Kaisa

    2018-03-14

    The brain's left hemisphere often displays advantages in processing verbal information, while the right hemisphere favours processing non-verbal information. In the haptic domain, due to contralateral innervation, this functional lateralization is reflected in a hand advantage for certain functions. Findings regarding the hand-hemisphere advantage for haptic information remain contradictory, however. This study addressed these laterality effects and their interaction with memory retention times in the haptic modality. Participants performed haptic discrimination of letters, geometric shapes and nonsense shapes at memory retention times of 5, 15 and 30 s with the left and right hand separately, and we measured the discriminability index d'. The d' values were significantly higher for letters and geometric shapes than for nonsense shapes. This might result from dual coding (naming + spatial) and/or from low stimulus complexity. There was no stimulus-specific laterality effect. However, we found a time-dependent laterality effect, which revealed that the performance of the left hand-right hemisphere was sustained up to 15 s, while the performance of the right hand-left hemisphere decreased progressively throughout all retention times. This suggests that haptic memory traces are more robust to decay when they are processed by the left hand-right hemisphere.

  7. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    PubMed

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  8. A method of emotion contagion for crowd evacuation

    NASA Astrophysics Data System (ADS)

    Cao, Mengxiao; Zhang, Guijuan; Wang, Mengsi; Lu, Dianjie; Liu, Hong

    2017-10-01

    Current evacuation models do not consider the impact of emotion and personality on crowd evacuation. Thus, there is a large difference between evacuation results and the real-life behavior of a crowd. In order to generate more realistic crowd evacuation results, we present a method of emotion contagion for crowd evacuation. First, we combine the OCEAN (Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism) model and the SIS (Susceptible Infected Susceptible) model to construct the P-SIS (Personalized SIS) emotional contagion model. The P-SIS model captures the diversity of individuals in a crowd effectively. Second, we couple the P-SIS model with the social force model to simulate emotional contagion during crowd evacuation. Finally, a photo-realistic rendering method is employed to obtain the animation of the crowd evacuation. Experimental results show that our method can simulate crowd evacuation realistically and has guiding significance for crowd evacuation in emergency circumstances.
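
    The abstract describes coupling a personalized SIS contagion model with the social force model but gives no equations. As a hedged illustration only (not the authors' P-SIS model), a generic SIS-style emotion update over a fully mixed crowd, with susceptibility scaled by a hypothetical neuroticism trait from the OCEAN model, might look like this:

```python
import random

def step_sis(panicked, neuroticism, beta=0.3, gamma=0.1, rng=random):
    """One synchronous SIS update over a fully mixed crowd.

    panicked:    list[bool], current emotional state of each agent
    neuroticism: list[float] in [0, 1], scales susceptibility (OCEAN 'N')
    beta, gamma: baseline infection / recovery rates (hypothetical values)
    """
    n = len(panicked)
    frac_infected = sum(panicked) / n
    nxt = []
    for state, nu in zip(panicked, neuroticism):
        if state:
            # panicked agents calm down with probability gamma
            nxt.append(rng.random() >= gamma)
        else:
            # susceptible agents catch panic in proportion to the panicked
            # fraction, amplified by their neuroticism trait
            nxt.append(rng.random() < beta * nu * frac_infected)
    return nxt

rng = random.Random(42)
panicked = [True] * 5 + [False] * 95
neuroticism = [rng.random() for _ in range(100)]
for _ in range(50):
    panicked = step_sis(panicked, neuroticism, rng=rng)
```

    In the paper's setting, the resulting per-agent emotional state would then modulate the desired velocity term of the social force model, but that coupling is not specified in the abstract.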

  9. Challenges to the development of complex virtual reality surgical simulations.

    PubMed

    Seymour, N E; Røtnes, J S

    2006-11-01

    Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.

  10. Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.

    PubMed

    Kling-Petersen, T; Rydmark, M

    2000-01-01

    The Computer Laboratory of the medical faculty in Göteborg (Mednet) has, since the end of 1998, been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at SIGGRAPH in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scrape etc. via a SensAble Technologies PHANToM haptic arm. The model deforms in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is proportional to the force applied. With instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bump maps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm's tools enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves etc. By using FreeForm's different tools, these defects are easily corrected and gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply any given amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: the modification and creation of non-geometric 3D objects.

  11. Force modeling for incision surgery into tissue with haptic application

    NASA Astrophysics Data System (ADS)

    Kim, Pyunghwa; Kim, Soomin; Choi, Seung-Hyun; Oh, Jong-Seok; Choi, Seung-Bok

    2015-04-01

    This paper presents a novel force model for incision surgery into tissue and its haptic application for surgeons. In robot-assisted incision surgery, a haptic system that restores the sense of touch in the surgical area is urgently needed, because surgeons cannot feel the tissue directly. To achieve this goal, a model of the reaction force of biological tissue is proposed from an energy perspective. The model describes the reaction force arising from the elastic behavior of tissue during incision. Furthermore, the force computed from the model is rendered by a haptic device based on magnetorheological fluid (MRF). The performance of the rendered force, controlled by a PID controller combined with open-loop control, is evaluated.

  12. A Model for Steering with Haptic-Force Guidance

    NASA Astrophysics Data System (ADS)

    Yang, Xing-Dong; Irani, Pourang; Boulanger, Pierre; Bischof, Walter F.

    Trajectory-based tasks are common in many applications and have been widely studied. Recently, researchers have shown that even very simple tasks, such as selecting items from cascading menus, can benefit from haptic-force guidance. Haptic guidance is also of significant value in applications such as medical training, handwriting learning, and tasks requiring precise manipulation. There are, however, only a few guiding principles for selecting the parameters best suited to force guidance. In this paper, we present a model, derived from the steering law, that relates movement time to the essential components of a tunneling task in the presence of haptic-force guidance. Results of an experiment show that our model is highly accurate in predicting performance times for force-enhanced tunneling tasks.
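
    The steering law referenced in this abstract predicts movement time from a tunnel's geometry; the haptic-guidance extension the authors derive is not given in the abstract, so the following is a minimal sketch of the baseline law only, with made-up constants a and b:

```python
def steering_id(widths, ds):
    """Steering-law index of difficulty, ID = integral of ds / W(s),
    approximated as a Riemann sum over sampled tunnel widths."""
    return sum(ds / w for w in widths)

def movement_time(a, b, widths, ds):
    """Steering law: MT = a + b * ID. The constants a and b are fitted
    empirically per device; the values used below are hypothetical."""
    return a + b * steering_id(widths, ds)

# Straight tunnel: 200 samples of 1 px each, constant width 20 px,
# so ID = 200 / 20 = 10
tunnel = [20.0] * 200
mt = movement_time(0.5, 0.12, tunnel, 1.0)  # hypothetical a, b (seconds)
```

    A narrowing tunnel simply yields a larger Riemann sum, which is why precise sections dominate the predicted time; a force-guidance model would modify the fitted constants or the effective width W(s).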

  13. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. The system must handle multiprocessing across distributed applications built with advanced technologies, using remote ship scenarios and the automation of ship operations.

  14. Practice on an augmented reality/haptic simulator and library of virtual brains improves residents' ability to perform a ventriculostomy.

    PubMed

    Yudkowsky, Rachel; Luciano, Cristian; Banerjee, Pat; Schwartz, Alan; Alaraj, Ali; Lemole, G Michael; Charbel, Fady; Smith, Kelly; Rizzi, Silvio; Byrne, Richard; Bendok, Bernard; Frim, David

    2013-02-01

    Ventriculostomy is a neurosurgical procedure for providing therapeutic cerebrospinal fluid drainage. Complications may arise during repeated attempts at placing the catheter in the ventricle. We studied the impact of simulation-based practice with a library of virtual brains on neurosurgery residents' performance in simulated and live surgical ventriculostomies. Using computed tomographic scans of actual patients, we developed a library of 15 virtual brains for the ImmersiveTouch system, a head- and hand-tracked augmented reality and haptic simulator. The virtual brains represent a range of anatomies including normal, shifted, and compressed ventricles. Neurosurgery residents participated in individual simulator practice on the library of brains including visualizing the 3-dimensional location of the catheter within the brain immediately after each insertion. Performance of participants on novel brains in the simulator and during actual surgery before and after intervention was analyzed using generalized linear mixed models. Simulator cannulation success rates increased after intervention, and live procedure outcomes showed improvement in the rate of successful cannulation on the first pass. However, the incidence of deeper, contralateral (simulator) and third-ventricle (live) placements increased after intervention. Residents reported that simulations were realistic and helpful in improving procedural skills such as aiming the probe, sensing the pressure change when entering the ventricle, and estimating how far the catheter should be advanced within the ventricle. Simulator practice with a library of virtual brains representing a range of anatomies and difficulty levels may improve performance, potentially decreasing complications due to inexpert technique.

  15. Color analysis and image rendering of woodblock prints with oil-based ink

    NASA Astrophysics Data System (ADS)

    Horiuchi, Takahiko; Tanimoto, Tetsushi; Tominaga, Shoji

    2012-01-01

    This paper proposes a method for analyzing the color characteristics of woodblock prints having oil-based ink and rendering realistic images based on camera data. The analysis results of woodblock prints show some characteristic features in comparison with oil paintings: 1) A woodblock print can be divided into several cluster areas, each with similar surface spectral reflectance; and 2) strong specular reflection from the influence of overlapping paints arises only in specific cluster areas. By considering these properties, we develop an effective rendering algorithm by modifying our previous algorithm for oil paintings. A set of surface spectral reflectances of a woodblock print is represented by using only a small number of average surface spectral reflectances and the registered scaling coefficients, whereas the previous algorithm for oil paintings required surface spectral reflectances of high dimension at all pixels. In the rendering process, in order to reproduce the strong specular reflection in specific cluster areas, we use two sets of parameters in the Torrance-Sparrow model for cluster areas with or without strong specular reflection. An experiment on a woodblock print with oil-based ink was performed to demonstrate the feasibility of the proposed method.
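
    The abstract mentions fitting two parameter sets of the Torrance-Sparrow model for cluster areas with and without strong specular reflection. As a rough illustration (not the authors' implementation), a simplified Torrance-Sparrow specular lobe with a Gaussian microfacet distribution and Schlick's Fresnel approximation can be sketched as follows; the geometric attenuation term is set to 1, and all parameter values are assumptions:

```python
import math

def torrance_sparrow_specular(theta_i, theta_r, alpha, sigma, f0=0.04):
    """Simplified Torrance-Sparrow specular term.

    theta_i, theta_r: incident / viewing angles from the surface normal (rad)
    alpha:            angle between the half-vector and the normal (rad)
    sigma:            roughness of the Gaussian facet distribution
    f0:               Fresnel reflectance at normal incidence (assumed)
    """
    # Gaussian microfacet distribution: rougher surfaces spread the highlight
    d = math.exp(-(alpha ** 2) / (2.0 * sigma ** 2))
    # Schlick's approximation to the Fresnel term
    f = f0 + (1.0 - f0) * (1.0 - math.cos(theta_i)) ** 5
    # geometric attenuation omitted (set to 1) in this sketch
    return d * f / max(math.cos(theta_r), 1e-6)
```

    Handling the two cluster types then amounts to evaluating this term with two different (sigma, f0) pairs, one fitted to areas with strong specular reflection and one to areas without.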

  16. Volumetric Visualization of Human Skin

    NASA Astrophysics Data System (ADS)

    Kawai, Toshiyuki; Kurioka, Yoshihiro

    We propose a modeling and rendering technique for human skin, which can provide realistic color, gloss and translucency for various applications in computer graphics. Our method is based on a volumetric representation of the structure inside the skin. Our model consists of the stratum corneum and three layers of pigments. The stratum corneum also has a layered structure in which the incident light is reflected, refracted and diffused. Each pigment layer contains carotene, melanin or hemoglobin. The density distributions of pigments, which define the color of each layer, can be supplied as one of the voxel values. Surface normals of upper-side voxels are perturbed to produce bumps and lines on the skin. We apply a ray tracing approach to this model to obtain the rendered image. Multiple scattering in the stratum corneum and the reflective and absorptive spectra of the pigments are considered. We also consider the Fresnel term to calculate the specular component for the glossy surface of skin. Some examples of rendered images are shown, which successfully visualize human skin.

  17. INCREASING SAVING BEHAVIOR THROUGH AGE-PROGRESSED RENDERINGS OF THE FUTURE SELF.

    PubMed

    Hershfield, Hal E; Goldstein, Daniel G; Sharpe, William F; Fox, Jesse; Yeykelis, Leo; Carstensen, Laura L; Bailenson, Jeremy N

    2011-11-01

    Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones.

  18. Haptic over visual information in the distribution of visual attention after tool-use in near and far space.

    PubMed

    Park, George D; Reed, Catherine L

    2015-10-01

    Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both open- and hidden-tool groups showed a tool-end, attentional bias (faster RTs toward tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. Lack of short-tool group results suggested that hidden-tool group results were specific to haptic inputs. In conclusion, (1) allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when visually obscured, suggesting haptic input provides sufficient information for directing attention along the tool.

  19. Haptic spatial matching in near peripersonal space.

    PubMed

    Kaas, Amanda L; Mier, Hanneke I van

    2006-04-01

    Research has shown that haptic spatial matching at intermanual distances over 60 cm is prone to large systematic errors. The error pattern has been explained by the use of reference frames intermediate between egocentric and allocentric coding. This study investigated haptic performance in near peripersonal space, i.e. at intermanual distances of 60 cm and less. Twelve blindfolded participants (six males and six females) were presented with two turn bars at equal distances from the midsagittal plane, 30 or 60 cm apart. Different orientations (vertical/horizontal or oblique) of the left bar had to be matched by adjusting the right bar to either a mirror symmetric (/ \\) or parallel (/ /) position. The mirror symmetry task can in principle be performed accurately in both an egocentric and an allocentric reference frame, whereas the parallel task requires an allocentric representation. Results showed that parallel matching induced large systematic errors which increased with distance. Overall error was significantly smaller in the mirror task. The task difference also held for the vertical orientation at 60 cm distance, even though this orientation required the same response in both tasks, showing a marked effect of task instruction. In addition, men outperformed women on the parallel task. Finally, contrary to our expectations, systematic errors were found in the mirror task, predominantly at 30 cm distance. Based on these findings, we suggest that haptic performance in near peripersonal space might be dominated by different mechanisms than those which come into play at distances over 60 cm. Moreover, our results indicate that both inter-individual differences and task demands affect task performance in haptic spatial matching. Therefore, we conclude that the study of haptic spatial matching in near peripersonal space might reveal important additional constraints for the specification of adequate models of haptic spatial performance.
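
    The geometric difference between the two tasks can be made explicit: with the bars placed symmetrically about the midsagittal plane, mirror matching reflects the left bar's orientation about that plane, while parallel matching preserves it, which is why the vertical orientation demands the same response in both tasks. A small sketch (the angle convention, with 90° = vertical, is an assumption for illustration):

```python
def matched_orientation(theta_left_deg, task):
    """Required right-bar orientation (degrees, 0-180) for each matching task.

    Bars sit symmetrically about the midsagittal plane, so mirror symmetry
    reflects the orientation about that plane; parallel keeps it unchanged.
    """
    theta = theta_left_deg % 180.0
    if task == "mirror":
        return (180.0 - theta) % 180.0
    if task == "parallel":
        return theta
    raise ValueError("task must be 'mirror' or 'parallel'")
```

    For a vertical bar (90°) both tasks require 90°, so any performance difference there reflects task instruction rather than geometry, exactly the comparison the study exploits.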

  20. Subthalamic nucleus deep brain stimulation improves somatosensory function in Parkinson's disease.

    PubMed

    Aman, Joshua E; Abosch, Aviva; Bebler, Maggie; Lu, Chia-Hao; Konczak, Jürgen

    2014-02-01

    An established treatment for the motor symptoms of Parkinson's disease (PD) is deep brain stimulation (DBS) of the subthalamic nucleus (STN). Mounting evidence suggests that PD is also associated with somatosensory deficits, yet the effect of STN-DBS on somatosensory processing is largely unknown. This study investigated whether STN-DBS affects somatosensory processing, specifically the processing of tactile and proprioceptive cues, by systematically examining the accuracy of haptic perception of object size. (Haptic perception refers to one's ability to extract object features such as shape and size by active touch.) Without vision, 13 PD patients with implanted STN-DBS and 13 healthy controls haptically explored the heights of 2 successively presented 3-dimensional (3D) blocks using a precision grip. Participants verbally indicated which block was taller and then used their nonprobing hand to motorically match the perceived size of the comparison block. Patients were tested during ON and OFF stimulation, following a 12-hour medication washout period. First, when compared to controls, the PD group's haptic discrimination threshold during OFF stimulation was elevated by 192% and mean hand aperture error was increased by 105%. Second, DBS lowered the haptic discrimination threshold by 26% and aperture error decreased by 20%. Third, during DBS ON, probing with the motorically more affected hand decreased haptic precision compared to probing with the less affected hand. This study offers the first evidence that STN-DBS improves haptic precision, further indicating that somatosensory function is improved by STN-DBS. We conclude that DBS-related improvements are not explained by improvements in motor function alone, but rather by enhanced somatosensory processing. © 2013 Movement Disorder Society.

  1. Web-based Three-dimensional Virtual Body Structures: W3D-VBS

    PubMed Central

    Temkin, Bharti; Acosta, Eric; Hatfield, Paul; Onal, Erhan; Tong, Alex

    2002-01-01

    Major efforts are being made to improve the teaching of human anatomy to foster cognition of visuospatial relationships. The Visible Human Project of the National Library of Medicine makes it possible to create virtual reality-based applications for teaching anatomy. Integration of traditional cadaver and illustration-based methods with Internet-based simulations brings us closer to this goal. Web-based three-dimensional Virtual Body Structures (W3D-VBS) is a next-generation immersive anatomical training system for teaching human anatomy over the Internet. It uses Visible Human data to dynamically explore, select, extract, visualize, manipulate, and stereoscopically palpate realistic virtual body structures with a haptic device. Tracking user’s progress through evaluation tools helps customize lesson plans. A self-guided “virtual tour” of the whole body allows investigation of labeled virtual dissections repetitively, at any time and place a user requires it. PMID:12223495

  3. Endotracheal intubation: application of virtual reality to emergency medical services education.

    PubMed

    Mayrose, James; Myers, Jeffrey W

    2007-01-01

    Virtual reality simulation has been identified as an emerging educational tool with significant potential to enhance teaching of residents and students in emergency clinical encounters and procedures. Endotracheal intubation represents a critical procedure for emergency care providers. Current methods of training include working with cadavers and mannequins, which have limitations in their representation of reality, ethical concerns, and overall availability with access, cost, and location of models. This paper will present a human airway simulation model designed for tracheal intubation and discuss the aspects that lend itself to use as an educational tool. This realistic and dynamic model is used to teach routine intubations, while future models will include more difficult airway management scenarios. This work provides a solid foundation for future versions of the intubation simulator, which will incorporate two haptic devices to allow for simultaneous control of the laryngoscope blade and endotracheal tube.

  4. A haptic interface for virtual simulation of endoscopic surgery.

    PubMed

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. Final hardware design is presented.

  5. Sensorimotor Interactions in the Haptic Perception of Virtual Objects

    DTIC Science & Technology

    1997-01-01

    (Abstract recoverable only in fragments from OCR.) Compared to our understanding of vision and audition, our knowledge of human haptic perception is very limited. Some preliminary work has examined the influence of other modalities, such as vision and audition, on the haptic perception of properties such as viscosity or mass. The remainder of this record is unrecoverable source-code debris from the report's appendix.

  6. Towards open-source, low-cost haptics for surgery simulation.

    PubMed

    Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie

    2014-01-01

    In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising education tool. However, the adoption of these systems in research and clinical settings is still limited by the high costs of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We demonstrate the basic mechanical design of the device, the sensor setup as well as its software integration.

  7. Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms

    DTIC Science & Technology

    1998-01-01

    (Abstract recoverable only in fragments from OCR.) The report notes a known limitation of haptic rendering studied at Northwestern University [Colgate, 1994]: a user can touch one side of a thin virtual object and be propelled out the opposite side. It also discusses visual-haptic interactions involving a high correlation in motion and force between the visual and haptic realms, and concludes with an evaluation of the application. The remainder of this record is table-of-contents fragments.

  8. Vision-Based Haptic Feedback for Remote Micromanipulation in-SEM Environment

    NASA Astrophysics Data System (ADS)

    Bolopion, Aude; Dahmen, Christian; Stolle, Christian; Haliyo, Sinan; Régnier, Stéphane; Fatikow, Sergej

    2012-07-01

    This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable nonexpert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive the object positions from scanning electron microscope images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specific equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation involving melamine microspheres with a diameter of less than 2 μm between Paris, France, and Oldenburg, Germany.
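
    The template-matching step can be pictured as a plain normalized cross-correlation search over the image; this is a generic NumPy sketch with toy arrays, not the authors' modified algorithm:

```python
import numpy as np

def ncc_match(image, template):
    """Return (row, col) of the best normalized cross-correlation match."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            # Correlation coefficient in [-1, 1]; flat patches score 0.
            score = float((p * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy SEM-like frame: a bright 4x4 "microsphere"; the template carries a dark border.
frame = np.zeros((20, 20))
frame[6:10, 9:13] = 1.0
template = np.zeros((6, 6))
template[1:5, 1:5] = 1.0
```

    Normalizing by each patch's own mean and energy makes the score insensitive to brightness and contrast changes, which matters for noisy SEM frames; production systems add template generation and coarse-to-fine search on top of this core.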

  9. NEDE: an open-source scripting suite for developing experiments in 3D virtual environments.

    PubMed

    Jangraw, David C; Johri, Ansh; Gribetz, Meron; Sajda, Paul

    2014-09-30

    As neuroscientists endeavor to understand the brain's response to ecologically valid scenarios, many are leaving behind hyper-controlled paradigms in favor of more realistic ones. This movement has made the use of 3D rendering software an increasingly compelling option. However, mastering such software and scripting rigorous experiments requires a daunting amount of time and effort. To reduce these startup costs and make virtual environment studies more accessible to researchers, we demonstrate a naturalistic experimental design environment (NEDE) that allows experimenters to present realistic virtual stimuli while still providing tight control over the subject's experience. NEDE is a suite of open-source scripts built on the widely used Unity3D game development software, giving experimenters access to powerful rendering tools while interfacing with eye tracking and EEG, randomizing stimuli, and providing custom task prompts. Researchers using NEDE can present a dynamic 3D virtual environment in which randomized stimulus objects can be placed, allowing subjects to explore in search of these objects. NEDE interfaces with a research-grade eye tracker in real-time to maintain precise timing records and sync with EEG or other recording modalities. Python offers an alternative for experienced programmers who feel comfortable mastering and integrating the various toolboxes available. NEDE combines many of these capabilities with an easy-to-use interface and, through Unity's extensive user base, a much more substantial body of assets and tutorials. Our flexible, open-source experimental design system lowers the barrier to entry for neuroscientists interested in developing experiments in realistic virtual environments. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. The mere exposure effect in the domain of haptics.

    PubMed

    Jakesch, Martina; Carbon, Claus-Christian

    2012-01-01

    Zajonc showed that the attitude towards stimuli that one had been previously exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only and haptics & vision). Effects of exposure frequency were found for highly complex stimuli, with liking increasing significantly from F0 to F2 and F10, but only for the stone category. Analysis of "Need for Touch" data showed the MEE in participants with high need for touch, which suggests different sensitivity or saturation levels of the MEE. These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis.

  11. Aging and the haptic perception of 3D surface shape.

    PubMed

    Norman, J Farley; Kappers, Astrid M L; Beers, Amanda M; Scott, A Kate; Norman, Hideko F; Koenderink, Jan J

    2011-04-01

    Two experiments evaluated the ability of older and younger adults to perceive the three-dimensional (3D) shape of object surfaces from active touch (haptics). The ages of the older adults ranged from 64 to 84 years, while those of the younger adults ranged from 18 to 27 years. In Experiment 1, the participants haptically judged the shape of large (20 cm diameter) surfaces with an entire hand. In contrast, in Experiment 2, the participants explored the shape of small (5 cm diameter) surfaces with a single finger. The haptic surfaces varied in shape index (Koenderink, Solid shape, 1990; Koenderink, Image and Vision Computing, 10, 557-564, 1992) from -1.0 to +1.0 in steps of 0.25. For both types of surfaces (large and small), the participants were able to judge surface shape reliably. The older participants' judgments of surface shape were just as accurate and precise as those of the younger participants. The results of the current study demonstrate that while older adults do possess reductions in tactile sensitivity and acuity, they nevertheless can effectively perceive 3D surface shape from haptic exploration.
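
    The shape index used to parameterize these surfaces (Koenderink, 1990) collapses the two principal curvatures into a single value running from −1 (spherical cup) through 0 (saddle) to +1 (spherical cap); a minimal implementation of that definition:

```python
import math

def shape_index(k1, k2):
    """Koenderink shape index S = (2/pi) * atan((k1 + k2) / (k1 - k2)), k1 >= k2."""
    if k1 < k2:
        k1, k2 = k2, k1
    if k1 == k2:
        # Umbilic point: S is +/-1 by the sign of the curvature
        # (undefined for a plane, k1 = k2 = 0; +1 is returned here by convention).
        return math.copysign(1.0, k1)
    return (2.0 / math.pi) * math.atan2(k1 + k2, k1 - k2)

# A cylinder ridge (k1 = 1, k2 = 0) sits midway between cap and saddle: S = 0.5.
```

    Stepping S in increments of 0.25, as the stimuli here did, sweeps smoothly from concave through saddle-shaped to convex surfaces while leaving overall curvedness unchanged.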

  12. A visual graphic/haptic rendering model for hysteroscopic procedures.

    PubMed

    Lim, Fabian; Brown, Ian; McColl, Ryan; Seligman, Cory; Alsaraira, Amer

    2006-03-01

    Hysteroscopy is an extensively popular option in evaluating and treating women with infertility. The procedure utilises an endoscope, inserted through the vagina and cervix, to examine the intra-uterine cavity via a monitor. The difficulty of hysteroscopy from the surgeon's perspective is the visual-spatial challenge of interpreting 3D images on a 2D monitor, and the associated psychomotor skill of overcoming the fulcrum effect. Despite the widespread use of this procedure, currently qualified hysteroscopy surgeons have not been trained in the fundamentals through an organised curriculum. The emergence of virtual reality as an educational tool for this procedure, and for other endoscopic procedures, has undoubtedly raised interest. The ultimate objective is the inclusion of virtual reality training as a mandatory component of gynaecologic endoscopy training. Part of this process involves the design of a simulator encompassing the technical difficulties and complications associated with the procedure. The proposed research examines fundamental hysteroscopy factors, current training and accreditation, and proposes a hysteroscopic simulator design that is suitable for education and training.

  13. Haptic-STM: a human-in-the-loop interface to a scanning tunneling microscope.

    PubMed

    Perdigão, Luís M A; Saywell, Alex

    2011-07-01

    The operation of a haptic device interfaced with a scanning tunneling microscope (STM) is presented here. The user moves the STM tip in three dimensions by means of a stylus attached to the haptic instrument. The tunneling current measured by the STM is converted to a vertical force, applied to the stylus and felt by the user, with the user being incorporated into the feedback loop that controls the tip-surface distance. A haptic-STM interface of this nature allows the user to feel atomic features on the surface and facilitates the tactile manipulation of the adsorbate/substrate system. The operation of this device is demonstrated via the room temperature STM imaging of C(60) molecules adsorbed on an Au(111) surface in ultra-high vacuum.
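
    Because tunneling current falls off roughly exponentially with tip-surface distance (I ≈ I₀·exp(−2κz)), a natural way to turn measured current into a felt vertical force is a logarithmic mapping: log(I/I_setpoint) is then proportional to the height error. This is a hypothetical sketch of such a law; the setpoint, gain, and clipping values are invented for illustration, not taken from the paper:

```python
import math

def current_to_force(i_na, setpoint_na=1.0, gain=0.5, f_max=3.0):
    """Map tunneling current (nA) to a vertical stylus force (arbitrary units).

    log(I / I_setpoint) tracks the tip-height error because I ~ exp(-2*kappa*z);
    the output is clipped to the haptic device's force range.
    """
    force = gain * math.log(i_na / setpoint_na)
    return max(-f_max, min(f_max, force))
```

    At the setpoint current the stylus feels no force; currents above it (tip too close, e.g. over an adsorbed molecule) push the stylus up, currents below it pull it down.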

  14. Haptics-based immersive telerobotic system for improvised explosive device disposal: Are two hands better than one?

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lambert, Jason Michel; Mantegh, Iraj; Crymble, Derry; Daly, John; Zhao, Yan

    2012-06-01

    State-of-the-art explosive ordnance disposal robots have not, in general, adopted recent advances in control technology and man-machine interfaces, and lag many years behind academia. This paper describes the Haptics-based Immersive Telerobotic System project, which is investigating an immersive telepresence environment incorporating advanced vehicle control systems, augmented immersive sensory feedback, dynamic 3D visual information, and haptic feedback for explosive ordnance disposal operators. The project aim is to provide operators a more sophisticated interface and expanded sensory input to perform the complex tasks needed to defeat improvised explosive devices successfully. The introduction of haptics and immersive telepresence has the potential to shift the way telepresence systems work for explosive ordnance disposal tasks, or more widely for first-responder scenarios involving remote unmanned ground vehicles.

  15. Recruitment of Foveal Retinotopic Cortex During Haptic Exploration of Shapes and Actions in the Dark.

    PubMed

    Monaco, Simona; Gallivan, Jason P; Figley, Teresa D; Singhal, Anthony; Culham, Jody C

    2017-11-29

    The role of the early visual cortex and higher-order occipitotemporal cortex has been studied extensively for visual recognition and to a lesser degree for haptic recognition and visually guided actions. Using a slow event-related fMRI experiment, we investigated whether tactile and visual exploration of objects recruit the same "visual" areas (and in the case of visual cortex, the same retinotopic zones) and if these areas show reactivation during delayed actions in the dark toward haptically explored objects (and if so, whether this reactivation might be due to imagery). We examined activation during visual or haptic exploration of objects and action execution (grasping or reaching) separated by an 18 s delay. Twenty-nine human volunteers (13 females) participated in this study. Participants had their eyes open and fixated on a point in the dark. The objects were placed below the fixation point and accordingly visual exploration activated the cuneus, which processes retinotopic locations in the lower visual field. Strikingly, the occipital pole (OP), representing foveal locations, showed higher activation for tactile than visual exploration, although the stimulus was unseen and location in the visual field was peripheral. Moreover, the lateral occipital tactile-visual area (LOtv) showed comparable activation for tactile and visual exploration. Psychophysiological interaction analysis indicated that the OP showed stronger functional connectivity with anterior intraparietal sulcus and LOtv during the haptic than visual exploration of shapes in the dark. After the delay, the cuneus, OP, and LOtv showed reactivation that was independent of the sensory modality used to explore the object. These results show that haptic actions not only activate "visual" areas during object touch, but also that this information appears to be used in guiding grasping actions toward targets after a delay. 
SIGNIFICANCE STATEMENT Visual presentation of an object activates shape-processing areas and retinotopic locations in early visual areas. Moreover, if the object is grasped in the dark after a delay, these areas show "reactivation." Here, we show that these areas are also activated and reactivated for haptic object exploration and haptically guided grasping. Touch-related activity occurs not only in the retinotopic location of the visual stimulus, but also at the occipital pole (OP), corresponding to the foveal representation, even though the stimulus was unseen and located peripherally. That is, the same "visual" regions are implicated in both visual and haptic exploration; however, touch also recruits high-acuity central representation within early visual areas during both haptic exploration of objects and subsequent actions toward them. Functional connectivity analysis shows that the OP is more strongly connected with ventral and dorsal stream areas when participants explore an object in the dark than when they view it. Copyright © 2017 the authors.

  16. Freely-available, true-color volume rendering software and cryohistology data sets for virtual exploration of the temporal bone anatomy.

    PubMed

    Kahrs, Lüder Alexander; Labadie, Robert Frederick

    2013-01-01

    Cadaveric dissection of temporal bone anatomy is not always possible or feasible in certain educational environments. Volume rendering of CT and/or MRI data helps in understanding spatial relationships, but such renderings suffer from unrealistic depiction, especially of the color of anatomical structures. Freely available, nonstained histological data sets, together with software able to render such data sets in realistic color, could overcome this limitation and be a very effective teaching tool. With the recent availability of specialized public-domain software, volume rendering of true-color histological data sets is now possible. We present both feasibility and step-by-step instructions for processing publicly available data sets (Visible Female Human and Visible Ear) into easily navigable 3-dimensional models using free software. Example renderings are shown to demonstrate the utility of these free methods in virtual exploration of the complex anatomy of the temporal bone. After exploring the data sets, the Visible Ear appears more natural than the Visible Human. We provide directions for easy-to-use, open-source software in conjunction with freely available histological data sets. This work facilitates self-education in the spatial relationships of anatomical structures inside the human temporal bone, and allows exploration of surgical approaches prior to cadaveric testing and/or clinical implementation. Copyright © 2013 S. Karger AG, Basel.
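
    Rendering a true-color histology volume typically means front-to-back alpha compositing of the RGB samples along each viewing ray; a generic sketch of that accumulation (not the cited software's actual code):

```python
import numpy as np

def composite_ray(colors, alphas):
    """Front-to-back alpha compositing of RGB samples along one ray.

    colors: sequence of (r, g, b) samples; alphas: opacities in [0, 1].
    Accumulation stops early once the ray is effectively opaque.
    """
    out = np.zeros(3)
    transmittance = 1.0  # fraction of light still passing through
    for rgb, a in zip(colors, alphas):
        out += transmittance * a * np.asarray(rgb, dtype=float)
        transmittance *= (1.0 - a)
        if transmittance < 1e-3:  # early ray termination
            break
    return out

# A fully opaque front sample hides everything behind it.
opaque_red = composite_ray([(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], [1.0, 1.0])
```

    With true-color histology, the sample colors come straight from the cryosection photographs, so only the opacity transfer function needs to be designed.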

  17. Haptic Technologies for MEMS Design

    NASA Astrophysics Data System (ADS)

    Calis, Mustafa; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototype. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include gains in productivity and the capability to include manufacturing costs within the design cycle.

  18. 3D imaging, 3D printing and 3D virtual planning in endodontics.

    PubMed

    Shah, Pratik; Chong, B S

    2018-03-01

    The adoption and adaptation of recent advances in digital technology, such as three-dimensional (3D) printed objects and haptic simulators, in dentistry have influenced the teaching and/or management of cases involving implant, craniofacial, maxillofacial, orthognathic and periodontal treatments. 3D printed models and guides may help operators plan and tackle complicated non-surgical and surgical endodontic treatment and may aid skill acquisition. Haptic simulators may assist in the development of competency in endodontic procedures through the acquisition of psycho-motor skills. This review explores and discusses the potential applications of 3D printed models and guides, and haptic simulators, in the teaching and management of endodontic procedures. An understanding of the pertinent technology related to the production of 3D printed objects and the operation of haptic simulators is also presented.

  19. A haptic device for guide wire in interventional radiology procedures.

    PubMed

    Moix, Thomas; Ilic, Dejan; Bleuler, Hannes; Zoethout, Jurjen

    2006-01-01

    Interventional Radiology (IR) is a minimally invasive procedure where thin tubular instruments, guide wires and catheters, are steered through the patient's vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be trained to master hand-eye coordination, instrument manipulation and procedure protocols. The existing simulation systems all have major drawbacks: the use of modified instruments, unrealistic insertion lengths, high inertia of the haptic device that creates a noticeably degraded dynamic behavior or excessive friction that is not properly compensated for. In this paper we propose a quality training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the patient's anatomy linked to a robotic interface providing haptic force feedback. This paper focuses on the requirements, design and prototyping of a specific haptic interface for guide wires.

  20. An Enhanced Soft Vibrotactile Actuator Based on ePVC Gel with Silicon Dioxide Nanoparticles.

    PubMed

    Park, Won-Hyeong; Shin, Eun-Jae; Yun, Sungryul; Kim, Sang-Youn

    2018-01-01

    In this paper, we propose a soft vibrotactile actuator made by mixing silicon dioxide nanoparticles into plasticized PVC gel. The effect of the silicon dioxide nanoparticles in the plasticized PVC gel on haptic performance is investigated in terms of electric, dielectric, and mechanical properties. Furthermore, eight soft vibrotactile actuators are prepared as a function of the nanoparticle content. Experiments are conducted to examine the haptic performance of the eight prepared actuators and to find the best weight ratio of plasticized PVC gel to nanoparticles. The experiments show that silicon dioxide nanoparticles improve the haptic performance of the plasticized PVC gel-based vibrotactile actuator, and that the proposed actuator can create a variety of haptic sensations over a wide frequency range.

  1. Morphologic compatibility of intraocular lens haptics and the lens capsule.

    PubMed

    Nagamoto, T; Eguchi, G

    1997-10-01

    To evaluate the mechanical relationship between the intraocular lens (IOL) haptic and the capsular bag by quantitatively analyzing the fit of the haptic with the capsule equator and the capsular bag deformity induced by the implanted lens haptics. Division of Morphogenesis, Department of Developmental Biology, National Institute for Basic Biology, Okazaki, Japan. Following implantation of a poly(methyl methacrylate)(PMMA) ring in three excised human capsular bags with continuous curvilinear capsulorhexis (CCC), IOLs with different overall lengths or haptic designs were implanted in the bags and photographed. The straight length of the area of contact between the haptic and the capsule equator on the photographs was measured to provide a quantitative index of in-the-bag fixation and the length from the external margin of the PMMA ring to the external margin of the loop along the maximal diameter of the capsular bag, to indicate the quantitative degree of capsular deformity induced by an IOL. An IOL with modified-C loops produced better fit along the capsule equator and less deformity than an IOL with modified-J loops, and an IOL with an overall length of 12.0 or 12.5 mm produced a sufficiently good fit and less distortion of the capsular bag than an IOL with an overall length over 13.0 mm. An IOL with modified-C loops and an overall length of 12.0 or 12.5 mm is adequate for in-the-bag implantation following CCC.

  2. Fragility of haptic memory in human full-term newborns.

    PubMed

    Lejeune, Fleur; Borradori Tolsa, Cristina; Gentaz, Edouard; Barisnikov, Koviljka

    2018-05-31

    Numerous studies have established that newborns can memorize tactile information about the specific features of an object with their hands and detect differences with another object. However, the robustness of haptic memory abilities has been examined in preterm newborns and in full-term infants, but not yet in full-term newborns. This research aimed to better understand the robustness of haptic memory abilities at birth by examining the effects of a change in the objects' temperature and of haptic interference. Sixty-eight full-term newborns (mean postnatal age: 2.5 days) were included. The two experiments were conducted in three phases: habituation (repeated presentation of the same object, a prism or cylinder, in the newborn's hand), discrimination (presentation of a novel object), and recognition (presentation of the familiar object). In Experiment 1, the change in the objects' temperature was controlled during the three phases. Results reveal that newborns can memorize specific features that differentiate prism and cylinder shapes by touch, and discriminate between them, but surprisingly they did not show evidence of recognizing them after interference. As no significant effect of the temperature condition was observed on habituation, discrimination, or recognition abilities, these findings suggest that discrimination abilities in newborns may be determined by the detection of shape differences. Overall, it seems that the ontogenesis of haptic recognition memory is not linear. The developmental schedule is likely crucial for haptic development between 34 and 40 GW. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Design and implementation of visual-haptic assistive control system for virtual rehabilitation exercise and teleoperation manipulation.

    PubMed

    Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv

    2008-01-01

    This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback to the user through a haptic controller (Omni Phantom). A sensor-based assistive function and velocity-scaling program provides force feedback that helps the user complete trajectory-following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation, equipped with a camera and a laser range finder and controlled in real time by a PC, which help the user define the intended path to the selected target. Real-time force feedback from the remote robot to the haptic controller is made possible by effective multithreading strategies in the control system design and by novel sensor integration. The sensor-based assistive function concept, applied to teleoperation as well as shared control, enhances the motion range and manipulation capabilities of users executing rehabilitation exercises such as trajectory following along a sensor-defined path. The system is modularly designed to allow integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, the haptic device can be used for rehabilitation in cases where assistance is needed to perform tasks (e.g., stroke rehabilitation) and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.
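
    The multithreaded strategy mentioned above (a fast haptic loop that always renders the most recent force computed by slower sensor processing) can be sketched as follows; this is a minimal illustration, with all names, counts, and the force values invented for the example rather than taken from the paper's implementation.

    ```python
    import threading

    class SharedForce:
        """Latest assistive force vector, written by the sensor thread and
        read by the haptic thread, each holding the lock only briefly."""
        def __init__(self):
            self._lock = threading.Lock()
            self._force = (0.0, 0.0, 0.0)

        def write(self, force):
            with self._lock:
                self._force = force

        def read(self):
            with self._lock:
                return self._force

    def sensor_loop(shared, n_updates):
        # Slow loop: stands in for camera / laser range finder processing
        for i in range(n_updates):
            shared.write((0.1 * i, 0.0, 0.0))

    def haptic_loop(shared, n_ticks, out):
        # Fast loop: renders whatever force is currently available
        for _ in range(n_ticks):
            out.append(shared.read())

    shared = SharedForce()
    forces = []
    t1 = threading.Thread(target=sensor_loop, args=(shared, 100))
    t2 = threading.Thread(target=haptic_loop, args=(shared, 1000, forces))
    t1.start(); t2.start(); t1.join(); t2.join()
    ```

    The point of the shared, lock-protected slot is that the fast loop never waits for a full sensor-processing cycle; it simply reuses the last computed force.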

  4. Optimal visual-haptic integration with articulated tools.

    PubMed

    Takahashi, Chie; Watt, Simon J

    2017-05-01

    When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc.) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye), and therefore expressed in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world (seeing and feeling the same thing) and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
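
    The maximum-likelihood integrator used as the optimal benchmark above reduces, for independent Gaussian noise, to an inverse-variance weighted average of the single-cue estimates, with a combined variance lower than either cue alone. A minimal numerical sketch (the size estimates and variances below are illustrative, not the study's data):

    ```python
    def mle_integrate(s_v, var_v, s_h, var_h):
        """Combine visual and haptic size estimates by inverse-variance
        weighting: each cue is weighted by its reliability (1/variance),
        and the combined variance is lower than either single-cue variance."""
        w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
        w_h = 1.0 - w_v
        s_combined = w_v * s_v + w_h * s_h
        var_combined = 1.0 / (1.0 / var_v + 1.0 / var_h)
        return s_combined, var_combined

    # Equally reliable cues: simple average, halved variance
    s_eq, v_eq = mle_integrate(5.0, 1.0, 6.0, 1.0)      # -> (5.5, 0.5)
    # More reliable vision: combined estimate pulled toward the visual cue
    s_vis, v_vis = mle_integrate(5.0, 0.5, 6.0, 2.0)    # -> (5.2, 0.4)
    ```

    In the study's terms, "near-optimal integration" means the empirically measured precision of bimodal size estimates approaches this predicted `var_combined`.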

  5. GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.

    PubMed

    Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu

    2010-12-01

    In actual surgery, smoke and bleeding due to cauterization provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic because realism is sacrificed to meet real-time performance requirements: to be interactive, visual updates must be performed at no less than 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Smoke and bleeding are, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available central processing unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed with 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulations were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original shader-based implementation did not incur noticeable overhead. For smoke generation, however, an input/output (I/O) bottleneck was observed, and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). 
Based on the performance results and the subject study, both the bleeding and smoke simulations were concluded to be efficient, highly realistic, and well suited to VR-based surgical simulators. Copyright © 2010 John Wiley & Sons, Ltd.
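
    The update-rate requirements quoted above (visuals at no less than 30 Hz, haptics at 1 kHz) imply a multi-rate simulation loop in which roughly every 33rd haptic tick also triggers a frame. A fixed-step scheduling sketch, with all names ours:

    ```python
    HAPTIC_HZ = 1000   # haptic (force) updates per second
    VISUAL_HZ = 30     # visual frame updates per second

    def run(seconds):
        """Count haptic and visual updates in a fixed-step loop.

        The loop advances in 1 ms haptic steps; a visual frame is emitted
        whenever at least 1/VISUAL_HZ seconds have elapsed since the last
        frame, so expensive rendering runs far less often than the force loop.
        """
        haptic_ticks = 0
        visual_frames = 0
        next_frame_t = 0.0
        for i in range(int(seconds * HAPTIC_HZ)):
            t = i / HAPTIC_HZ
            haptic_ticks += 1          # 1 kHz: update forces here
            if t >= next_frame_t:      # ~30 Hz: redraw, smoke/bleeding update
                visual_frames += 1
                next_frame_t += 1.0 / VISUAL_HZ
        return haptic_ticks, visual_frames
    ```

    Over one simulated second this yields 1000 haptic ticks and 30 frames, which is why per-frame work such as smoke or bleeding rendering is the natural candidate to offload to the GPU.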

  6. GPU-based Efficient Realistic Techniques for Bleeding and Smoke Generation in Surgical Simulators

    PubMed Central

    Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu

    2010-01-01

    Background In actual surgery, smoke and bleeding due to cautery provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated effects of bleeding and smoke generation, they are not realistic because of real-time performance requirements: to be interactive, visual updates must be performed at no less than 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Smoke and bleeding are, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available CPU resources. Methods In this work, we develop a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed with 20 subjects to determine the visual quality of the simulations compared to real surgical videos. Results The smoke and bleeding simulations were implemented as part of a Laparoscopic Adjustable Gastric Banding (LAGB) simulator. For the bleeding simulation, the original shader-based implementation did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). 
Conclusions Based on the performance results and the subject study, both the bleeding and smoke simulations were concluded to be efficient, highly realistic, and well suited to VR-based surgical simulators. PMID:20878651

  7. The Mere Exposure Effect in the Domain of Haptics

    PubMed Central

    Jakesch, Martina; Carbon, Claus-Christian

    2012-01-01

    Background Zajonc showed that the attitude towards stimuli one has previously been exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. Methodology/Principal Findings We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only, and haptics and vision). Effects of exposure frequency were found for highly complex stimuli, with liking increasing significantly from F0 to F2 and F10, but only for the stone category. Analysis of “Need for Touch” data showed the MEE in participants with a high need for touch, which suggests different sensitivity or saturation levels of the MEE. Conclusions/Significance These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis. PMID:22347451

  8. Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm.

    PubMed

    Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae; Kim, Tae Il; Yi, Byung Ju

    2017-01-01

    Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We propose a master-slave robotic colonoscopy system that is controllable from a remote site using a conventional colonoscope. The master and slave robots were developed around a conventional flexible colonoscope. The robotic colonoscopic procedure was performed on a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. The slave robot was developed to hold the colonoscope and its knob, and to perform insertion, rotation, and two tilting motions of the colonoscope. The master robot was designed to teach motions to the slave robot. The measured force and torque were scaled down to one tenth to provide the operator with reflected force and torque at the haptic device. The haptic sensation and feedback system was successful and helped the operator feel the constraint force and torque in the colon. The insertion time using the robotic system decreased with repeated procedures. This work proposed a robotic approach to colonoscopy using a haptic feedback algorithm; this robotic device could effectively perform colonoscopy from a remote site with reduced burden and comparable safety for patients.

  9. Haptic shape discrimination and interhemispheric communication.

    PubMed

    Dowell, Catherine J; Norman, J Farley; Moment, Jackie R; Shain, Lindsey M; Norman, Hideko F; Phillips, Flip; Kappers, Astrid M L

    2018-01-10

    In three experiments, participants haptically discriminated object shape using unimanual exploration (a single hand explored two objects) and bimanual exploration (both hands were used, but each hand, left or right, explored a separate object). Such haptic exploration (one versus two hands) requires somatosensory processing in either only one or both cerebral hemispheres; previous studies related to the perception of shape/curvature found superior performance for unimanual exploration, indicating that shape comparison is more effective when only one hemisphere is utilized. The current results, obtained for naturally shaped solid objects (bell peppers, Capsicum annuum) and simple cylindrical surfaces, demonstrate otherwise: bimanual haptic exploration can be as effective as unimanual exploration, showing that there is no necessary reduction in ability when haptic shape comparison requires interhemispheric communication. We found that while successive bimanual exploration produced high shape discriminability, the participants' bimanual performance deteriorated for simultaneous shape comparisons. This outcome suggests that either interhemispheric interference or the need to attend to multiple objects simultaneously reduces shape discrimination ability. The current results also reveal a significant effect of age: older adults' shape discrimination abilities are moderately reduced relative to younger adults', regardless of how objects are manipulated (left hand only, right hand only, or bimanual exploration).

  10. A Review of Simulators with Haptic Devices for Medical Training.

    PubMed

    Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich

    2016-04-01

    Medical procedures often involve the use of the tactile sense to manipulate organs or tissues with special tools. Doctors require extensive preparation to perform them successfully; for example, research shows that a minimum of 750 operations is needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative and are considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (cadavers, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state-of-the-art review of recent medical simulators that use haptic devices. In particular, we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of the technology used, the number of degrees of freedom, the degrees of force feedback, perceived realism, immersion, and the feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.

  11. Differential effects of delay upon visually and haptically guided grasping and perceptual judgments.

    PubMed

    Pettypiece, Charles E; Culham, Jody C; Goodale, Melvyn A

    2009-05-01

    Experiments with visual illusions have revealed a dissociation between the systems that mediate object perception and those responsible for object-directed action. More recently, an experiment on a haptic version of the visual size-contrast illusion provided evidence that the haptic modality shows a similar dissociation when grasping and estimating the size of objects in real time. Here we present evidence suggesting that the similarities between the two modalities begin to break down once a delay is introduced between when people feel the target object and when they perform the grasp or estimation. In particular, when grasping after a delay in a haptic paradigm, people scale their grasps differently when the target is presented with a flanking object of a different size (although the difference does not reflect a size-contrast effect). When estimating after a delay, however, people appear to ignore the size of the flanking objects entirely. This does not fit well with the results commonly found in visual experiments. Thus, introducing a delay reveals important differences in the way haptic and visual memories are stored and accessed.

  12. Adaptive displays and controllers using alternative feedback.

    PubMed

    Repperger, D W

    2004-12-01

    Investigations of the design of haptic (force-reflecting joystick or force display) controllers were conducted by viewing the display of force information within the context of several different paradigms. First, using analogies from electrical and mechanical systems, certain schemes for the haptic interface were hypothesized that may improve the human-machine interaction with respect to various criteria, and a discussion is given of how this interaction benefits the electrical and mechanical system. To generalize this concept to the design of human-machine interfaces, three studies with haptic mechanisms were then synthesized and analyzed.

  13. Does haptic steering guidance instigate speeding? A driving simulator study into causes and remedies.

    PubMed

    Melman, T; de Winter, J C F; Abbink, D A

    2017-01-01

    An important issue in road traffic safety is that drivers show adverse behavioral adaptation (BA) to driver assistance systems. Haptic steering guidance is an upcoming assistance system which facilitates lane-keeping performance while keeping drivers in the loop, and which may be particularly prone to BA. Thus far, experiments on haptic steering guidance have measured driver performance while the vehicle speed was kept constant. The aim of the present driving simulator study was to examine whether haptic steering guidance causes BA in the form of speeding, and to evaluate two types of haptic steering guidance designed not to suffer from BA. Twenty-four participants drove a 1.8 m wide car for 13.9 km on a curved road, with cones demarcating a single 2.2 m narrow lane. Participants completed four conditions in a counterbalanced design: no guidance (Manual), continuous haptic guidance (Cont), continuous guidance that linearly reduced feedback gains from full guidance at 125 km/h towards manual control at 130 km/h and above (ContRF), and haptic guidance provided only when the predicted lateral position was outside a lateral bandwidth (Band). Participants were familiarized with each condition prior to the experimental runs and were instructed to drive as they normally would while minimizing the number of cone hits. Compared to Manual, the Cont condition yielded a significantly higher driving speed (on average by 7 km/h), whereas ContRF and Band did not. All three guidance conditions yielded better lane-keeping performance than Manual, whereas Cont and ContRF yielded lower self-reported workload than Manual. In conclusion, continuous steering guidance entices drivers to increase their speed, thereby diminishing its potential safety benefits. It is possible to prevent BA while retaining safety benefits by making a design adjustment either in the lateral (Band) or in the longitudinal (ContRF) direction. Copyright © 2016. Published by Elsevier Ltd.
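
    The ContRF condition's fade-out is a linear ramp on the haptic feedback gain between two speed thresholds (full guidance up to 125 km/h, manual control at 130 km/h and above, per the abstract). A sketch of that ramp, with the function name and parameterization ours:

    ```python
    def contrf_gain(speed_kmh, full_until=125.0, off_at=130.0):
        """Feedback-gain multiplier for speed-dependent haptic guidance.

        Returns 1.0 (full guidance) up to `full_until` km/h, fades linearly
        to 0.0 (manual control) at `off_at` km/h and above.
        """
        if speed_kmh <= full_until:
            return 1.0
        if speed_kmh >= off_at:
            return 0.0
        return (off_at - speed_kmh) / (off_at - full_until)
    ```

    The design intent is that speeding itself removes the assistance, so drivers have no incentive to exploit the guidance by driving faster.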

  14. Feel, imagine and learn! - Haptic augmented simulation and embodied instruction in physics learning

    NASA Astrophysics Data System (ADS)

    Han, In Sook

    The purpose of this study was to investigate the potential and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included haptic augmented educational simulation as an instructional tool to provide perceptual experiences, as well as further instruction to activate those experiences through perceptual simulation. To verify the effectiveness of this instructional model, haptic augmented simulations with three different haptic levels (force and kinesthetic, kinesthetic, and non-haptic) and instructional materials of two types (narrative and expository) were developed and their effectiveness tested. A total of 220 fifth-grade students were recruited from three elementary schools located in lower-SES neighborhoods in the Bronx, New York. The study was conducted over three consecutive weeks during regular class periods, and the data were analyzed using ANCOVA, ANOVA, and MANOVA. The results indicate that the haptic augmented simulations, both the force-and-kinesthetic and the kinesthetic versions, were more effective than the non-haptic simulation in providing perceptual experiences and helping elementary students create multimodal representations of machines' movements. However, in most cases, force feedback was needed to construct a fully loaded multimodal representation that could be activated when instruction with fewer sensory modalities was being given. In addition, the force-and-kinesthetic simulation was effective in providing cognitive grounding for comprehending new learning content based on the multimodal representation created with enhanced force feedback. Regarding instruction type, the narrative and expository instructions did not differ in activating previous perceptual experiences. These findings suggest that it is important to help students build solid cognitive ground with a perceptual anchor. Also, a sequential abstraction process could deepen students' understanding by giving them an opportunity to practice mental simulation, removing the sensory modalities one by one until they gradually reach an abstract level of understanding at which they can imagine the machine's movements and working mechanisms from language alone, without any perceptual support.

  15. Metal Sounds Stiffer than Drums for Ears, but Not Always for Hands: Low-Level Auditory Features Affect Multisensory Stiffness Perception More than High-Level Categorical Information

    PubMed Central

    Liu, Juan; Ando, Hiroshi

    2016-01-01

    Most real-world events stimulate multiple sensory modalities simultaneously. Usually, the stiffness of an object is perceived haptically. However, auditory signals also contain stiffness-related information, and people can form impressions of stiffness from the different impact sounds of metal, wood, or glass. To understand whether there is any interaction between auditory and haptic stiffness perception, and if so, whether the inferred material category is the most relevant auditory information, we conducted experiments using a force-feedback device and the modal synthesis method to present haptic stimuli and impact sound in accordance with participants’ actions, and to modulate low-level acoustic parameters, i.e., frequency and damping, without changing the inferred material categories of sound sources. We found that metal sounds consistently induced an impression of stiffer surfaces than did drum sounds in the audio-only condition, but participants haptically perceived surfaces with modulated metal sounds as significantly softer than the same surfaces with modulated drum sounds, which directly opposes the impression induced by these sounds alone. This result indicates that, although the inferred material category is strongly associated with audio-only stiffness perception, low-level acoustic parameters, especially damping, are more tightly integrated with haptic signals than the material category is. Frequency played an important role in both audio-only and audio-haptic conditions. Our study provides evidence that auditory information influences stiffness perception differently in unisensory and multisensory tasks. Furthermore, the data demonstrated that sounds with higher frequency and/or shorter decay time tended to be judged as stiffer, and contact sounds of stiff objects had no effect on the haptic perception of soft surfaces. 
We argue that the intrinsic physical relationship between object stiffness and acoustic parameters may be applied as prior knowledge to achieve robust estimation of stiffness in multisensory perception. PMID:27902718
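
    Modal synthesis, the method used above to generate impact sounds, renders an impact as a sum of exponentially damped sinusoids, which is why frequency and damping are directly available as low-level parameters to modulate. A minimal sketch (the mode frequencies, dampings, and amplitudes below are illustrative, not the study's values):

    ```python
    import math

    def modal_impact(modes, sample_rate=44100, duration=0.5):
        """Synthesize an impact sound as a sum of damped sinusoids.

        Each mode is (frequency_hz, damping_per_s, amplitude). Higher
        damping gives a short, dull, drum-like sound; lower damping gives
        a long, ringing, metal-like one.
        """
        n = int(sample_rate * duration)
        samples = []
        for i in range(n):
            t = i / sample_rate
            s = sum(a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                    for f, d, a in modes)
            samples.append(s)
        return samples

    # "Metal-like": low damping, rings; "drum-like": high damping, dies fast
    metal = modal_impact([(800.0, 8.0, 1.0), (2100.0, 10.0, 0.5)])
    drum = modal_impact([(180.0, 60.0, 1.0), (420.0, 80.0, 0.5)])
    ```

    Modulating the damping values while keeping the mode structure is exactly the kind of manipulation that changes perceived stiffness without changing the inferred material category.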

  16. A pseudo-haptic knot diagram interface

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Weng, Jianguang; Hanson, Andrew J.

    2011-01-01

    To make progress in understanding knot theory, we need to interact with the projected representations of mathematical knots, which are of course continuous in 3D but significantly interrupted in their projected images. One way to achieve this goal is to design an interactive system that allows us to sketch 2D knot diagrams by taking advantage of a collision-sensing controller and to explore their underlying smooth structures through continuous motion. Recent advances in interaction techniques allow progress in this direction: pseudo-haptics, which simulates haptic effects using purely visual feedback, can be used to develop such an interactive system. This paper outlines one such pseudo-haptic knot diagram interface. Our interface derives from the familiar pencil-and-paper process of drawing 2D knot diagrams and provides haptic-like sensations to facilitate the creation and exploration of knot diagrams. A centerpiece of the interaction model simulates a "physically" reactive mouse cursor, which is exploited to resolve the apparent conflict between the continuous structure of the actual smooth knot and the visual discontinuities in the knot diagram representation. Another value of exploiting pseudo-haptics is that an acceleration (or deceleration) of the mouse cursor (or surface locator) can be used to indicate the slope of the curve (or surface) whose projective image is being explored. By exploiting these additional visual cues, we proceed to a full-featured extension to a pseudo-haptic 4D visualization system that simulates continuous navigation on 4D objects and allows us to sense bumps and holes in the fourth dimension. Preliminary tests of the software show that the main features of the interface overcome some expected perceptual limitations in our interaction with 2D knot diagrams of 3D knots and 3D projective images of 4D mathematical objects.
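
    A common way to implement pseudo-haptic effects like the "physically reactive" cursor described above is to modulate the control-display gain: the on-screen cursor is slowed relative to the physical mouse motion wherever the interface wants the user to feel resistance. A minimal sketch of that general technique (names and values are ours, not the paper's implementation):

    ```python
    def pseudo_haptic_step(cursor, mouse_delta, resistance):
        """Move a 2-D cursor by mouse_delta, attenuated by local resistance.

        resistance in [0, 1): 0 means free motion; values near 1 slow the
        cursor, which users perceive as stiffness or friction even though
        the feedback is purely visual.
        """
        gain = 1.0 - resistance
        return (cursor[0] + gain * mouse_delta[0],
                cursor[1] + gain * mouse_delta[1])
    ```

    In a knot-diagram setting, `resistance` would be driven by proximity to strand crossings or to the curve being traced, so the cursor "sticks" where the diagram demands care.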

  17. Perceiving Object Shape from Specular Highlight Deformation, Boundary Contour Deformation, and Active Haptic Manipulation.

    PubMed

    Norman, J Farley; Phillips, Flip; Cheeseman, Jacob R; Thomason, Kelsey E; Ronning, Cecilia; Behari, Kriti; Kleinman, Kayla; Calloway, Autum B; Lamirande, Davora

    2016-01-01

    It is well known that motion facilitates the visual perception of solid object shape, particularly when surface texture or other identifiable features (e.g., corners) are present. Conventional models of structure-from-motion require the presence of texture or identifiable object features in order to recover 3-D structure. Is the facilitation in 3-D shape perception similar in magnitude when surface texture is absent? On any given trial in the current experiments, participants were presented with a single randomly-selected solid object (bell pepper or randomly-shaped "glaven") for 12 seconds and were required to indicate which of 12 (for bell peppers) or 8 (for glavens) simultaneously visible objects possessed the same shape. The initial single object's shape was defined either by boundary contours alone (i.e., presented as a silhouette), specular highlights alone, specular highlights combined with boundary contours, or texture. In addition, there was a haptic condition: in this condition, the participants haptically explored with both hands (but could not see) the initial single object for 12 seconds; they then performed the same shape-matching task used in the visual conditions. For both the visual and haptic conditions, motion (rotation in depth or active object manipulation) was present in half of the trials and was not present for the remaining trials. The effect of motion was quantitatively similar for all of the visual and haptic conditions; e.g., the participants' performance in Experiment 1 was 93.5 percent higher in the motion or active haptic manipulation conditions (when compared to the static conditions). The current results demonstrate that deforming specular highlights or boundary contours facilitate 3-D shape perception as much as the motion of objects that possess texture. The current results also indicate that the improvement with motion that occurs for haptics is similar in magnitude to that which occurs for vision.

  18. Perceiving Object Shape from Specular Highlight Deformation, Boundary Contour Deformation, and Active Haptic Manipulation

    PubMed Central

    Cheeseman, Jacob R.; Thomason, Kelsey E.; Ronning, Cecilia; Behari, Kriti; Kleinman, Kayla; Calloway, Autum B.; Lamirande, Davora

    2016-01-01

    It is well known that motion facilitates the visual perception of solid object shape, particularly when surface texture or other identifiable features (e.g., corners) are present. Conventional models of structure-from-motion require the presence of texture or identifiable object features in order to recover 3-D structure. Is the facilitation in 3-D shape perception similar in magnitude when surface texture is absent? On any given trial in the current experiments, participants were presented with a single randomly-selected solid object (bell pepper or randomly-shaped “glaven”) for 12 seconds and were required to indicate which of 12 (for bell peppers) or 8 (for glavens) simultaneously visible objects possessed the same shape. The initial single object’s shape was defined either by boundary contours alone (i.e., presented as a silhouette), specular highlights alone, specular highlights combined with boundary contours, or texture. In addition, there was a haptic condition: in this condition, the participants haptically explored with both hands (but could not see) the initial single object for 12 seconds; they then performed the same shape-matching task used in the visual conditions. For both the visual and haptic conditions, motion (rotation in depth or active object manipulation) was present in half of the trials and was not present for the remaining trials. The effect of motion was quantitatively similar for all of the visual and haptic conditions–e.g., the participants’ performance in Experiment 1 was 93.5 percent higher in the motion or active haptic manipulation conditions (when compared to the static conditions). The current results demonstrate that deforming specular highlights or boundary contours facilitate 3-D shape perception as much as the motion of objects that possess texture. The current results also indicate that the improvement with motion that occurs for haptics is similar in magnitude to that which occurs for vision. PMID:26863531

  19. Haptics in forensics: the possibilities and advantages in using the haptic device for reconstruction approaches in forensic science.

    PubMed

    Buck, Ursula; Naether, Silvio; Braun, Marcel; Thali, Michael

    2008-09-18

    Non-invasive documentation methods such as surface scanning and radiological imaging are gaining importance in the forensic field. These three-dimensional technologies provide digital 3D data, which are processed and handled in the computer. However, the sense of touch is lost in the virtual approach. A haptic device enables the use of the sense of touch to handle and feel digital 3D data. The multifunctional application of a haptic device for forensic approaches is evaluated and illustrated in three different cases: the non-invasive representation of bone fractures of the lower extremities caused by traffic accidents; the comparison of bone injuries with the presumed injury-inflicting instrument; and, in a gunshot case, the identification of the gun by the muzzle imprint and the reconstruction of the holding position of the gun. The 3D models of the bones are generated from the computed tomography (CT) images. The 3D models of the exterior injuries, the injury-inflicting tools, and the bone injuries, where a higher resolution is necessary, are created by optical surface scanning. The haptic device is used in combination with the software FreeForm Modelling Plus to touch the surface of the 3D models to feel minute injuries and the surface of tools, to reposition displaced bone parts, and to compare an injury-causing instrument with an injury. The repositioning of 3D models in a reconstruction is executed more easily, quickly, and precisely by means of the sense of touch and the user-friendly movement in 3D space. For representation purposes, the fracture lines of bones are coloured. This work demonstrates that the haptic device is a suitable and efficient tool in forensic science, and it offers a new way of handling digital data in virtual 3D space.

  20. Selective attention modulates visual and haptic repetition priming: effects in aging and Alzheimer's disease.

    PubMed

    Ballesteros, Soledad; Reales, José M; Mayas, Julia; Heller, Morton A

    2008-08-01

    In two experiments, we examined the effect of selective attention at encoding on repetition priming in normal aging and Alzheimer's disease (AD) patients for objects presented visually (experiment 1) or haptically (experiment 2). We used a repetition priming paradigm combined with a selective attention procedure at encoding. Reliable priming was found for both young adults and healthy older participants for visually presented pictures (experiment 1) as well as for haptically presented objects (experiment 2). However, this was only found for attended and not for unattended stimuli. The results suggest that independently of the perceptual modality, repetition priming requires attention at encoding and that perceptual facilitation is maintained in normal aging. However, AD patients did not show priming for attended stimuli, or for unattended visual or haptic objects. These findings suggest an early deficit of selective attention in AD. Results are discussed from a cognitive neuroscience approach.

  1. Haptic interfaces using dielectric electroactive polymers

    NASA Astrophysics Data System (ADS)

    Ozsecen, Muzaffer Y.; Sivak, Mark; Mavroidis, Constantinos

    2010-04-01

    Quality, amplitude and frequency of the interaction forces between a human and an actuator are essential traits for haptic applications. A variety of Electro-Active Polymer (EAP) based actuators can provide these characteristics simultaneously with quiet operation, low weight, high power density and fast response. This paper demonstrates a rolled Dielectric Elastomer Actuator (DEA) being used as a telepresence device in a heart-beat measurement application. In this test, heart signals were acquired from a remote location using a wireless heart rate sensor and sent through a network, and the DEA was used to haptically reproduce the heart beats at the medical expert's location. A series of preliminary human subject tests demonstrated that a) DEA-based haptic feedback can be used in heart-beat measurement tests and b) through subjective testing, the stiffness and actuator properties of the EAP can be tuned for a variety of applications.
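
    The remote heartbeat pipeline described above (wireless heart-rate sensor, network link, DEA actuation at the expert's site) can be sketched as a simple timing schedule; the 50 ms pulse width below is an illustrative assumption, not a parameter from the paper.

```python
def beat_schedule(bpm, duration_s, pulse_width_s=0.05):
    """Return (on_time, off_time) pairs for pulsing a haptic actuator
    once per heartbeat. pulse_width_s is an assumed actuation pulse
    length, not a value reported in the paper."""
    if bpm <= 0:
        raise ValueError("heart rate must be positive")
    interval = 60.0 / bpm          # seconds between beats
    schedule = []
    t = 0.0
    while t < duration_s:
        schedule.append((round(t, 4), round(t + pulse_width_s, 4)))
        t += interval
    return schedule
```

    At 60 bpm this yields one actuation pulse per second, which a driver loop would turn into voltage commands for the rolled DEA.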

  2. A three-axis force sensor for dual finger haptic interfaces.

    PubMed

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-10-10

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.
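
    The "characteristic matrix" mentioned above maps raw gauge signals to the three force components. A minimal sketch of that mapping follows; the 3×3 matrix here is invented for illustration, whereas the real one comes from the sensor's analytic model and experimental characterization.

```python
def forces_from_gauges(C, v):
    """Apply a 3x3 characteristic (calibration) matrix C to three raw
    gauge readings v, returning (Fx, Fy, Fz). C below is hypothetical;
    in practice it is identified during sensor characterization."""
    return tuple(sum(C[i][j] * v[j] for j in range(3)) for i in range(3))

# Illustrative, nearly diagonal matrix: the small off-diagonal terms
# model mechanical cross-coupling between axes.
C = [[2.0, 0.1, 0.0],
     [0.05, 1.8, 0.1],
     [0.0, 0.0, 2.5]]
```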

  3. Blindness enhances tactile acuity and haptic 3-D shape discrimination.

    PubMed

    Norman, J Farley; Bartholomew, Ashley N

    2011-10-01

    This study compared the sensory and perceptual abilities of the blind and sighted. The 32 participants were required to perform two tasks: tactile grating orientation discrimination (to determine tactile acuity) and haptic three-dimensional (3-D) shape discrimination. The results indicated that the blind outperformed their sighted counterparts (individually matched for both age and sex) on both tactile tasks. The improvements in tactile acuity that accompanied blindness occurred for all blind groups (congenital, early, and late). However, the improvements in haptic 3-D shape discrimination only occurred for the early-onset and late-onset blindness groups; the performance of the congenitally blind was no better than that of the sighted controls. The results of the present study demonstrate that blindness does lead to an enhancement of tactile abilities, but they also suggest that early visual experience may play a role in facilitating haptic 3-D shape discrimination.

  4. Verticality perception during and after galvanic vestibular stimulation.

    PubMed

    Volkening, Katharina; Bergmann, Jeannine; Keller, Ingo; Wuehr, Max; Müller, Friedemann; Jahn, Klaus

    2014-10-03

    The human brain constructs verticality perception by integrating vestibular, somatosensory, and visual information. Here we investigated whether galvanic vestibular stimulation (GVS) has an effect on verticality perception both during and after application, by assessing the subjective verticals (visual, haptic and postural) in healthy subjects at those times. During stimulation the subjective visual vertical and the subjective haptic vertical shifted towards the anode, whereas this shift was reversed towards the cathode in all modalities once stimulation was turned off. Overall, the effects were strongest for the haptic modality. Additional investigation of the time course of GVS-induced changes in the haptic vertical revealed that anodal shifts persisted for the entire 20-min stimulation interval in the majority of subjects. Aftereffects exhibited different types of decay, with a preponderance for an exponential decay. The existence of such reverse effects after stimulation could have implications for GVS-based therapy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. Spatio-temporal visualization of air-sea CO2 flux and carbon budget using volume rendering

    NASA Astrophysics Data System (ADS)

    Du, Zhenhong; Fang, Lei; Bai, Yan; Zhang, Feng; Liu, Renyi

    2015-04-01

    This paper presents a novel visualization method to show the spatio-temporal dynamics of carbon sinks and sources, and carbon fluxes in the ocean carbon cycle. The air-sea carbon budget and its process of accumulation are demonstrated in the spatial dimension, while the distribution pattern and variation of CO2 flux are expressed by color changes. In this way, we unite spatial and temporal characteristics of satellite data through visualization. A GPU-based direct volume rendering technique using half-angle slicing is adopted to dynamically visualize the released or absorbed CO2 gas with shadow effects. A data model is designed to generate four-dimensional (4D) data from satellite-derived air-sea CO2 flux products, and an out-of-core scheduling strategy is also proposed for on-the-fly rendering of time series of satellite data. The presented 4D visualization method is implemented on graphics cards with vertex, geometry and fragment shaders. It provides a visually realistic simulation and user interaction for real-time rendering. This approach has been integrated into the Information System of Ocean Satellite Monitoring for Air-sea CO2 Flux (IssCO2) for the research and assessment of air-sea CO2 flux in the China Seas.
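
    Half-angle slicing, as used above, orients the volume's slicing axis halfway between the view and light directions so each slice can serve both the eye pass and the light (shadow) pass. A minimal sketch of the axis computation, with an assumed sign convention for opposing lights:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def half_angle_axis(view_dir, light_dir):
    """Slicing axis for half-angle slicing: the normalized halfway
    vector between view and light directions. When the light roughly
    opposes the view, the light direction is flipped first (a common
    convention; renderers differ in the details)."""
    v, l = normalize(view_dir), normalize(light_dir)
    if sum(a * b for a, b in zip(v, l)) < 0:   # light facing the viewer
        l = tuple(-x for x in l)
    return normalize(tuple(a + b for a, b in zip(v, l)))
```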

  6. Real-Time Compositing of Procedural Facade Textures on the GPU

    NASA Astrophysics Data System (ADS)

    Krecklau, L.; Kobbelt, L.

    2011-09-01

    The real-time rendering of complex virtual city models has become more important in the last few years for many practical applications like realistic navigation or urban planning. For maximum rendering performance, the complexity of the geometry or textures can be reduced by decreasing the resolution until the data set fully resides in the memory of the graphics card. This typically results in a low-quality virtual city model. Alternatively, a streaming algorithm can load the high-quality data set from the hard drive. However, this approach requires a large amount of persistent storage providing several gigabytes of static data. We present a system that uses a texture atlas containing atomic tiles like windows, doors or wall patterns, and that combines those elements on-the-fly directly on the graphics card. The presented approach benefits from a sophisticated randomization scheme that produces many different facades while the grammar description itself remains small. By using a ray casting approach, we are able to trace through transparent windows, revealing procedurally generated rooms, which further contributes to the realism of the rendering. The presented method enables real-time rendering of city models with a high level of detail for facades while still relying on a small memory footprint.
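
    The per-facade randomization described above can be made deterministic by hashing each grid cell's coordinates, so the same facade is recomposited identically every frame with no stored texture. A small sketch (the tile names are placeholders, not the paper's grammar symbols):

```python
import hashlib

def tile_for_cell(facade_id, row, col, tiles):
    """Deterministically pick one atlas tile for a facade grid cell.
    Hashing (facade_id, row, col) gives stable per-cell randomness,
    so revisiting the facade reproduces the exact same composition."""
    key = f"{facade_id}:{row}:{col}".encode()
    h = int.from_bytes(hashlib.sha256(key).digest()[:4], "big")
    return tiles[h % len(tiles)]

# Placeholder tile identifiers; a real atlas would reference
# texture coordinates instead of names.
WINDOW_TILES = ["window_plain", "window_shutters", "window_balcony"]
```

    On the GPU the same idea is usually implemented with a cheap integer hash in the fragment shader rather than SHA-256.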

  7. Effects of 3D virtual haptics force feedback on brand personality perception: the mediating role of physical presence in advergames.

    PubMed

    Jin, Seung-A Annie

    2010-06-01

    This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.

  8. Patient DF's visual brain in action: Visual feedforward control in visual form agnosia.

    PubMed

    Whitwell, Robert L; Milner, A David; Cavina-Pratesi, Cristiana; Barat, Masihullah; Goodale, Melvyn A

    2015-05-01

    Patient DF, who developed visual form agnosia following ventral-stream damage, is unable to discriminate the width of objects, performing at chance, for example, when asked to open her thumb and forefinger a matching amount. Remarkably, however, DF adjusts her hand aperture to accommodate the width of objects when reaching out to pick them up (grip scaling). While this spared ability to grasp objects is presumed to be mediated by visuomotor modules in her relatively intact dorsal stream, it is possible that it may rely abnormally on online visual or haptic feedback. We report here that DF's grip scaling remained intact when her vision was completely suppressed during grasp movements, and it still dissociated sharply from her poor perceptual estimates of target size. We then tested whether providing trial-by-trial haptic feedback after making such perceptual estimates might improve DF's performance, but found that they remained significantly impaired. In a final experiment, we re-examined whether DF's grip scaling depends on receiving veridical haptic feedback during grasping. In one condition, the haptic feedback was identical to the visual targets. In a second condition, the haptic feedback was of a constant intermediate width while the visual target varied trial by trial. Despite this incongruent feedback, DF still scaled her grip aperture to the visual widths of the target blocks, showing only normal adaptation to the false haptically-experienced width. Taken together, these results strengthen the view that DF's spared grasping relies on a normal mode of dorsal-stream functioning, based chiefly on visual feedforward processing. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual system capable of placing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  10. Motor skills, haptic perception and social abilities in children with mild speech disorders.

    PubMed

    Müürsepp, Iti; Aibast, Herje; Gapeyeva, Helena; Pääsuke, Mati

    2012-02-01

    The aim of the study was to evaluate motor skills, haptic object recognition and social interaction in 5-year-old children with mild specific expressive language impairment (expressive-SLI) and articulation disorder (AD) in comparison with age- and gender-matched healthy children. Twenty-nine children (23 boys and 6 girls) with expressive-SLI, 27 children (20 boys and 7 girls) with AD and 30 children (23 boys and 7 girls) with typically developing language as controls participated in our study. The children were examined for manual dexterity, ball skills, and static and dynamic balance using the M-ABC test, for haptic object recognition, and for social interaction using a questionnaire completed by teachers. Children with mild expressive-SLI demonstrated significantly poorer results in all subtests of motor skills (p<0.05), and in haptic object recognition and social interaction (p<0.01), compared to controls. There were no statistically significant differences (p>0.05) in measured parameters between children with AD and controls. Children with expressive-SLI performed considerably poorer than the AD group in the balance subtest (p<0.05) and in the overall M-ABC test (p<0.01). In children with mild expressive-SLI, functional motor performance, haptic perception and social interaction are considerably more affected than in children with AD. Although motor difficulties in speech production are prevalent in AD, the impairment is localised and does not involve children's general motor skills, haptic perception or social interaction. Copyright © 2011 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  11. High-fidelity bilateral teleoperation systems and the effect of multimodal haptics.

    PubMed

    Tavakoli, Mahdi; Aziminejad, Arash; Patel, Rajni V; Moallem, Mehrdad

    2007-12-01

    In master-slave teleoperation applications that deal with a delicate and sensitive environment, it is important to provide haptic feedback of slave/environment interactions to the user's hand as it improves task performance and teleoperation transparency (fidelity), which is the extent of telepresence of the remote environment available to the user through the master-slave system. For haptic teleoperation, in addition to a haptics-capable master interface, often one or more force sensors are also used, which warrant new bilateral control architectures while increasing the cost and the complexity of the teleoperation system. In this paper, we investigate the added benefits of using force sensors that measure hand/master and slave/environment interactions and of utilizing local feedback loops on the teleoperation transparency. We compare the two-channel and the four-channel bilateral control systems in terms of stability and transparency, and study the stability and performance robustness of the four-channel method against nonidealities that arise during bilateral control implementation, which include master-slave communication latency and changes in the environment dynamics. The next issue addressed in the paper deals with the case where the master interface is not haptics capable, but the slave is equipped with a force sensor. In the context of robotics-assisted soft-tissue surgical applications, we explore through human factors experiments whether slave/environment force measurements can be of any help with regard to improving task performance. The last problem we study is whether slave/environment force information, with and without haptic capability in the master interface, can help improve outcomes under degraded visual conditions.
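
    As a toy illustration of the position-force architectures compared above, the loop below has the slave track the master's position (the position channel) while the measured slave/environment contact force is reflected back for display on the master (the force channel). The gains, time step, and spring-wall environment are invented for the sketch and are not the paper's experimental values.

```python
def simulate_teleop(x_master, steps=2000, kp=50.0, dt=0.001):
    """Toy two-channel bilateral loop: the slave follows a held master
    position via proportional control; a linear spring past a 1 cm
    'wall' models the environment, and its force is what a haptic
    master would render to the operator's hand."""
    xs = 0.0                          # slave position (m)
    k_env = 300.0                     # assumed environment stiffness (N/m)
    wall = 0.01                       # contact begins past 1 cm
    f_env = 0.0
    for _ in range(steps):
        xs += kp * (x_master - xs) * dt          # position channel
        f_env = k_env * max(xs - wall, 0.0)      # force channel (reflected)
    return xs, f_env
```

    Commanding the master 1 cm into the wall yields a steady reflected force of about 3 N in this model; a four-channel scheme would additionally transmit operator and environment forces feedforward to improve transparency.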

  12. The impact of haptic feedback on students' conceptions of the cell

    NASA Astrophysics Data System (ADS)

    Minogue, James

    2005-07-01

    The purpose of this study was to investigate the efficacy of adding haptic (sense of touch) feedback to computer generated visualizations for use in middle school science instruction. Current technology allows for the simulation of tactile and kinesthetic sensations via haptic devices and a computer interface. This study, conducted with middle school students (n = 80), explored the cognitive and affective impacts of this innovative technology on students' conceptions of the cell and the process of passive transport. A pretest-posttest control group design was used and participants were randomly assigned to one of two treatment groups (n = 40 for each). Both groups experienced the same core computer-mediated instructional program. This Cell Exploration program engaged students in a 3-D immersive environment that allowed them to actively investigate the form and function of a typical animal cell including its major organelles. The program also engaged students in a study of the structure and function of the cell membrane as it pertains to the process of passive transport and the mechanisms behind the membrane's selective permeability. As they conducted their investigations, students in the experimental group received bi-modal visual and haptic (simulated tactile and kinesthetic) feedback whereas the control group students experienced the program with only visual stimuli. A battery of assessments, including objective and open-ended written response items as well as a haptic performance assessment, was used to gather quantitative and qualitative data regarding changes in students' understandings of the cell concepts prior to and following their completion of the instructional program. Additionally, the impact of haptics on the affective domain of students' learning was assessed using a post-experience semi-structured interview and an attitudinal survey. Results showed that students from both conditions (Visual-Only and Visual + Haptic) found the instructional program interesting and engaging. Additionally, the vast majority of the students reported that they learned a lot about and were more interested in the topic due to their participation. Moreover, students who received the bi-modal (Visual + Haptic) feedback indicated that they experienced lower levels of frustration and spatial disorientation as they conducted their investigations when compared to individuals who relied solely on vision. There were no significant differences measured across the treatment groups on the cognitive assessment items. Despite this finding, the study provided valuable insight into the theoretical and practical considerations involved in the development of multimodal instructional programs.

  13. Increasing Saving Behavior through Age-Progressed Renderings of the Future Self

    PubMed Central

    Hershfield, Hal E.; Goldstein, Daniel G.; Sharpe, William F.; Fox, Jesse; Yeykelis, Leo; Carstensen, Laura L.; Bailenson, Jeremy N.

    2014-01-01

    Many people fail to save what they need to for retirement (Munnell, Webb, and Golub-Sass 2009). Research on excessive discounting of the future suggests that removing the lure of immediate rewards by pre-committing to decisions, or elaborating the value of future rewards can both make decisions more future-oriented. In this article, we explore a third and complementary route, one that deals not with present and future rewards, but with present and future selves. In line with thinkers who have suggested that people may fail, through a lack of belief or imagination, to identify with their future selves (Parfit 1971; Schelling 1984), we propose that allowing people to interact with age-progressed renderings of themselves will cause them to allocate more resources toward the future. In four studies, participants interacted with realistic computer renderings of their future selves using immersive virtual reality hardware and interactive decision aids. In all cases, those who interacted with virtual future selves exhibited an increased tendency to accept later monetary rewards over immediate ones. PMID:24634544

  14. Tangible display systems: direct interfaces for computer-based studies of surface appearance

    NASA Astrophysics Data System (ADS)

    Darling, Benjamin A.; Ferwerda, James A.

    2010-02-01

    When evaluating the surface appearance of real objects, observers engage in complex behaviors involving active manipulation and dynamic viewpoint changes that allow them to observe the changing patterns of surface reflections. We are developing a class of tangible display systems to provide these natural modes of interaction in computer-based studies of material perception. A first-generation tangible display was created from an off-the-shelf laptop computer containing an accelerometer and webcam as standard components. Using these devices, custom software estimated the orientation of the display and the user's viewing position. This information was integrated with a 3D rendering module so that rotating the display or moving in front of the screen would produce realistic changes in the appearance of virtual objects. In this paper, we consider the design of a second-generation system to improve the fidelity of the virtual surfaces rendered to the screen. With a high-quality display screen and enhanced tracking and rendering capabilities, a second-generation system will be better able to support a range of appearance perception applications.
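
    Estimating display orientation from the laptop's accelerometer, as described above, reduces to recovering pitch and roll from the gravity vector when the device is held still. A standard sketch follows; axis conventions vary between devices, so the signs here are an assumption.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (radians) from a static accelerometer reading,
    treating the measured acceleration as the gravity vector.
    Axis signs follow one common convention; real devices differ,
    and yaw is unobservable from gravity alone."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return pitch, roll
```

    A rendering loop would feed these angles into the view transform so tilting the screen changes the simulated surface reflections.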

  15. Patient-specific bronchoscopy visualization through BRDF estimation and disocclusion correction.

    PubMed

    Chung, Adrian J; Deligianni, Fani; Shah, Pallav; Wells, Athol; Yang, Guang-Zhong

    2006-04-01

    This paper presents an image-based method for virtual bronchoscopy with photo-realistic rendering. The technique is based on recovering bidirectional reflectance distribution function (BRDF) parameters in an environment where the choice of viewing positions, directions, and illumination conditions are restricted. Video images of bronchoscopy examinations are combined with patient-specific three-dimensional (3-D) computed tomography data through two-dimensional (2-D)/3-D registration and shading model parameters are then recovered by exploiting the restricted lighting configurations imposed by the bronchoscope. With the proposed technique, the recovered BRDF is used to predict the expected shading intensity, allowing a texture map independent of lighting conditions to be extracted from each video frame. To correct for disocclusion artefacts, statistical texture synthesis was used to recreate the missing areas. New views not present in the original bronchoscopy video are rendered by evaluating the BRDF with different viewing and illumination parameters. This allows free navigation of the acquired 3-D model with enhanced photo-realism. To assess the practical value of the proposed technique, a detailed visual scoring that involves both real and rendered bronchoscope images is conducted.
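
    The texture extraction step above divides each observed frame by the shading intensity predicted from the recovered BRDF, leaving a map that is (ideally) independent of lighting. Schematically, with the predicted shading supplied by whatever fitted model is in use:

```python
def lighting_free_texture(frame, predicted_shading, eps=1e-6):
    """Divide observed pixel intensities by the BRDF-predicted shading
    to factor illumination out of a frame. Both inputs are 2D lists of
    floats on the same grid; eps guards against division by near-zero
    predicted shading in dark regions."""
    return [[obs / max(pred, eps)
             for obs, pred in zip(row_o, row_p)]
            for row_o, row_p in zip(frame, predicted_shading)]
```

    Re-rendering then runs the other way: the stored texture is multiplied by shading evaluated under new viewing and illumination parameters.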

  16. Sharing control between humans and automation using haptic interface: primary and secondary task performance benefits.

    PubMed

    Griffiths, Paul G; Gillespie, R Brent

    2005-01-01

    This paper describes a paradigm for human/automation control sharing in which the automation acts through a motor coupled to a machine's manual control interface. The manual interface becomes a haptic display, continually informing the human about automation actions. While monitoring by feel, users may choose either to conform to the automation or override it and express their own control intentions. This paper's objective is to demonstrate that adding automation through haptic display can be used not only to improve performance on a primary task but also to reduce perceptual demands or free attention for a secondary task. Results are presented from three experiments in which 11 participants completed a lane-following task using a motorized steering wheel on a fixed-base driving simulator. The automation behaved like a copilot, assisting with lane following by applying torques to the steering wheel. Results indicate that haptic assist improves lane following by at least 30%, p < .0001, while reducing visual demand by 29%, p < .0001, or improving reaction time in a secondary tone localization task by 18 ms, p = .0009. Potential applications of this research include the design of automation interfaces based on haptics that support human/automation control sharing better than traditional push-button automation interfaces.
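
    The copilot behavior above amounts to rendering a corrective torque on the wheel that the driver feels and can override by pushing harder against it. A toy version, with gains and a saturation limit invented for illustration:

```python
def assist_torque(lane_error_m, heading_error_rad,
                  k_lat=8.0, k_head=15.0, t_max=4.0):
    """Automation torque (N·m) applied to the steering wheel, pushing
    toward the lane center. The gains and saturation limit are
    illustrative, not the paper's tuned values. Saturation keeps the
    assist weak enough for the driver to override it."""
    t = -(k_lat * lane_error_m + k_head * heading_error_rad)
    return max(-t_max, min(t_max, t))
```

    The net torque at the wheel is the sum of this assist torque and whatever the driver applies, which is exactly what makes the interface a haptic display of the automation's intent.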

  17. Figure/Ground Segmentation via a Haptic Glance: Attributing Initial Finger Contacts to Objects or Their Supporting Surfaces.

    PubMed

    Pawluk, D; Kitada, R; Abramowicz, A; Hamilton, C; Lederman, S J

    2011-01-01

    The current study addresses the well-known "figure/ground" problem in human perception, a fundamental topic that has received surprisingly little attention from touch scientists to date. Our approach is grounded in, and directly guided by, current knowledge concerning the nature of haptic processing. Given inherent figure/ground ambiguity in natural scenes and limited sensory inputs from first contact (a "haptic glance"), we consider first whether people are even capable of differentiating figure from ground (Experiments 1 and 2). Participants were required to estimate the strength of their subjective impression that they were feeling an object (i.e., figure) as opposed to just the supporting structure (i.e., ground). Second, we propose a tripartite factor classification scheme to further assess the influence of kinetic, geometric (Experiments 1 and 2), and material (Experiment 2) factors on haptic figure/ground segmentation, complemented by more open-ended subjective responses obtained at the end of the experiment. Collectively, the results indicate that under certain conditions it is possible to segment figure from ground via a single haptic glance with a reasonable degree of certainty, and that all three factor classes influence the estimated likelihood that brief, spatially distributed fingertip contacts represent contact with an object and/or its background supporting structure.

  18. Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm

    PubMed Central

    Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae

    2017-01-01

    Purpose: Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We propose a master-slave robotic colonoscopy system that is controllable from a remote site using a conventional colonoscope. Materials and Methods: The master and slave robots were developed to use a conventional flexible colonoscope. The robotic colonoscopic procedure was performed on a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. Results: A slave robot was developed to hold the colonoscope and its knob, and to perform insertion, rotation, and the two tilting motions of the colonoscope. A master robot was designed to teach motions to the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with reflected force and torque at the haptic device. The haptic sensation and feedback system was successful and helped the operator feel the constrained force or torque in the colon. The insertion time using the robotic system decreased with repeated procedures. Conclusion: This work proposed a robotic approach to colonoscopy using a haptic feedback algorithm; this robotic device could effectively perform colonoscopy with reduced burden and comparable safety for patients from a remote site. PMID:27873506
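
    The one-tenth scaling of measured insertion force and rotating torque before haptic reflection can be written directly; the 0.1 factor is from the abstract, while the clipping limit is an added safety assumption of this sketch.

```python
def haptic_reflection(force_n, torque_nm, scale=0.1, limit=5.0):
    """Scale slave-side force and torque down (by one tenth, per the
    paper) before rendering them on the master haptic device.
    'limit' is an assumed safety clip to protect the operator and
    device, not a value from the paper."""
    f = max(-limit, min(limit, force_n * scale))
    t = max(-limit, min(limit, torque_nm * scale))
    return f, t
```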

  19. Improved PMMA single-piece haptic materials

    NASA Astrophysics Data System (ADS)

    Healy, Donald D.; Wilcox, Christopher D.

    1991-12-01

    During the past fifteen years, intraocular lens (IOL) haptic preferences have shifted from a variety of multi-piece haptic materials to single-piece PMMA. This is due in part to the research of David Apple, M.D., and others who have suggested that all-PMMA implants result in reduced cell and flare and better centration. Consequently, single-piece IOLs now represent 45% of all IOL implants. However, many surgeons regard single-piece IOL designs as nonflexible and more difficult to implant than multi-piece IOLs. These handling characteristics have slowed the shift from multi-piece to single-piece IOLs. As a result of these handling characteristics, single-piece lenses experience relatively high breakage rates from handling before and during insertion. To improve these characteristics, manufacturers have refined single-piece IOL haptic designs by pushing the limits of PMMA's physical properties. Furthermore, IOL manufacturers have begun to alter the material itself to change its physical properties. In particular, two new PMMA materials have emerged in the marketplace: Flexeon™, a crosslinked polymer, and CM™, a material with molecularly realigned PMMA. This paper examines three specific measurements of a haptic's strength and flexibility: tensile strength, plastic memory and material plasticity/elasticity. The paper compares Flexeon™ and CM™ lenses to noncrosslinked one-piece lenses and standard polypropylene multi-piece lenses.

  20. Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart.

    PubMed

    Kesner, Samuel B; Howe, Robert D

    2011-01-01

    Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body, due to tissue motion and the transmission limitations of catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% relative to the manual catheter system.

  1. Palpation imaging using a haptic system for virtual reality applications in medicine.

    PubMed

    Khaled, W; Reichling, S; Bruhns, O T; Boese, H; Baumann, M; Monkman, G; Egersdoerfer, S; Klein, D; Tunayar, A; Freimuth, H; Lorenz, A; Pessavento, A; Ermert, H

    2004-01-01

    In the field of medical diagnosis, there is a strong need to determine the mechanical properties of biological tissue, which are of histological and pathological relevance; malignant tumors are significantly stiffer than the surrounding healthy tissue. One established diagnostic procedure is the palpation of body organs and tissue. Palpation is used to measure swelling, detect bone fracture, find and measure the pulse, or locate changes in the pathological state of tissue and organs. Current medical practice routinely uses sophisticated diagnostic tests such as magnetic resonance imaging (MRI), computed tomography (CT) and ultrasound (US) imaging; however, these cannot provide a direct measure of tissue elasticity. Last year we presented the concept of the first haptic sensor-actuator system to visualize and reconstruct the mechanical properties of tissue using ultrasonic elastography and a haptic display with electrorheological fluids. We developed a real-time strain imaging system for tumor diagnosis, which allows biopsies to be performed simultaneously with conventional ultrasound B-mode and strain imaging investigations. We deduce the relative mechanical properties using finite element simulations and numerical models that solve the inverse problem. Various modifications of the haptic sensor-actuator system have been investigated. This haptic system has the potential to render substantial forces in real time, using a compact, lightweight mechanism that can be applied to numerous areas including intraoperative navigation, telemedicine, teaching and telecommunication.

  2. An Interactive Virtual 3D Tool for Scientific Exploration of Planetary Surfaces

    NASA Astrophysics Data System (ADS)

    Traxler, Christoph; Hesina, Gerd; Gupta, Sanjeev; Paar, Gerhard

    2014-05-01

    In this paper we present an interactive 3D visualization tool for scientific analysis and planning of planetary missions. At the moment scientists have to look at individual camera images separately; there is no tool to combine them in three dimensions and examine them seamlessly as a geologist would do in the field (by walking backwards and forwards, resulting in different scales). For this reason a virtual 3D reconstruction of the terrain that can be interactively explored is necessary. Such a reconstruction has to consider multiple scales, ranging from orbital image data to close-up surface image data from rover cameras. The 3D viewer allows seamless zooming between these various scales, giving scientists the possibility to relate small surface features (e.g. rock outcrops) to larger geological contexts. For a reliable geologic assessment a realistic surface rendering is important, so the material properties of the rock surfaces are considered in real-time rendering. This is achieved by an appropriate Bidirectional Reflectance Distribution Function (BRDF) estimated from the image data. The BRDF is implemented to run on the Graphics Processing Unit (GPU) to enable realistic real-time rendering, which allows a naturalistic perception for scientific analysis. Another important aspect for realism is the consideration of natural lighting conditions, which means skylight illuminating the reconstructed scene. In our case we provide skylights from Mars and Earth and allow switching between these two modes of illumination. This gives geologists the opportunity to perceive rock outcrops from Mars as they would appear on Earth, facilitating scientific assessment. Besides viewing the virtual reconstruction at multiple scales, scientists can also perform various measurements, e.g. the geo-coordinates of a selected point or the distance between two surface points. Rover or other models can be placed into the scene and snapped onto a chosen location of the terrain.
    These are important features for planning rover paths. In addition, annotations can be placed directly into the 3D scene, where they also serve as landmarks to aid navigation. The presented visualization and planning tool is a valuable asset for scientific analysis of planetary mission data. It complements traditional methods by giving access to an interactive, realistically rendered virtual 3D reconstruction. Representative examples and further information about the interactive 3D visualization tool can be found on the FP7-SPACE Project PRoViDE web page http://www.provide-space.eu/interactive-virtual-3d-tool/. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 312377 'PRoViDE'.
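    The material-property step above evaluates a BRDF per pixel on the GPU. As a rough CPU-side illustration of what such a per-pixel evaluation computes, the sketch below evaluates a simple Phong-style reflectance model in plain Python; the model form and the parameter values (kd, ks, shininess) are stand-ins for illustration, not the BRDF actually estimated from the rover imagery by the project.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong_brdf_radiance(normal, light_dir, view_dir,
                        kd=0.7, ks=0.2, shininess=16.0, light_intensity=1.0):
    """Reflected radiance at one surface point under a Phong-style model.

    kd, ks and shininess are the per-material parameters that, in the real
    system, would be estimated from image data rather than chosen by hand.
    """
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    ndotl = max(dot(n, l), 0.0)
    # Mirror reflection of the light direction about the surface normal.
    r = tuple(2.0 * ndotl * nc - lc for nc, lc in zip(n, l))
    rdotv = max(dot(r, v), 0.0)
    diffuse = kd * ndotl
    specular = ks * (rdotv ** shininess) if ndotl > 0.0 else 0.0
    return light_intensity * (diffuse + specular)
```

    In a shader, the same arithmetic runs per fragment; the estimated BRDF parameters would be looked up per material instead of being passed as defaults.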

  3. 77 FR 15390 - Certain Mobile Electronic Devices Incorporating Haptics; Receipt of Amended Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-15

    ... INTERNATIONAL TRADE COMMISSION [DN 2875] Certain Mobile Electronic Devices Incorporating Haptics.... International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the U.S. International Trade Commission has received an amended complaint entitled Certain Mobile Electronic Devices...

  4. Research on techniques for computer three-dimensional simulation of satellites and night sky

    NASA Astrophysics Data System (ADS)

    Yan, Guangwei; Hu, Haitao

    2007-11-01

    To study space attack-defense technology, a simulation of satellites is needed. We designed and implemented a 3D simulation system for satellites, rendered against a night-sky background. The system structure is as follows: one computer simulates the satellites' orbits, while the other computers render the 3D scene. To achieve a realistic effect, a three-channel multi-projector display system was constructed. We used MultiGen Creator to build the satellite and star models and MultiGen Distributed Vega to render the three-channel scene, with one master and three slaves. The master controls the three slaves, which render the three channels separately. To obtain the satellites' positions and attitudes, the master communicates with the satellite orbit simulator over TCP/IP. It then calculates the observer's position and the positions of the satellites, the moon and the sun, and transmits the data to the slaves. To obtain a smooth orbit for the target satellites, an orbit prediction method is used. Because the target satellite data packets and the attack satellite data packets cannot stay synchronized over the network, the target satellite dithers when the scene is rendered; an anti-dithering algorithm was designed to resolve this. To render the night-sky background, a file storing the stars' position and brightness data is used. According to its brightness, each star is assigned a magnitude class, and the star model is scaled according to the magnitude. All the stars are distributed on a celestial sphere. Experiments show that the whole system runs correctly and the frame rate reaches 30 Hz. The system can be used in space attack-defense simulation.
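    The star-rendering step classifies stars by magnitude and scales each star model accordingly. The abstract does not give the exact mapping, but a plausible sketch follows from the standard definition that a five-magnitude difference corresponds to a factor of 100 in flux; the choice of making model area proportional to flux (so linear size goes as the square root) and the minimum-scale floor are assumptions for illustration.

```python
import math

def brightness_ratio(mag, ref_mag=0.0):
    """Relative flux of a star of magnitude `mag` vs a reference magnitude.

    By definition, 5 magnitudes correspond to a factor of 100 in flux.
    """
    return 100.0 ** ((ref_mag - mag) / 5.0)

def sprite_scale(mag, ref_mag=0.0, min_scale=0.2):
    """Map a magnitude to a star-model scale factor (hypothetical mapping:
    model area proportional to flux, clamped to a minimum visible size)."""
    return max(math.sqrt(brightness_ratio(mag, ref_mag)), min_scale)
```

    Under this mapping a magnitude-5 star is 100 times fainter than a magnitude-0 star and is drawn at one tenth its linear size, unless the floor applies.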

  5. Effects of visual information regarding allocentric processing in haptic parallelity matching.

    PubMed

    Van Mier, Hanneke I

    2013-10-01

    Research has revealed that haptic perception of parallelity deviates from physical reality. Large and systematic deviations have been found in haptic parallelity matching, most likely due to the influence of the hand-centered egocentric reference frame. Providing information that increases the influence of allocentric processing has been shown to improve performance on haptic matching. In this study, allocentric processing was stimulated by providing informative vision in haptic matching tasks that were performed using hand- and arm-centered reference frames. Twenty blindfolded participants (ten men, ten women) explored the orientation of a reference bar with the non-dominant hand and subsequently matched (task HP) or mirrored (task HM) its orientation on a test bar with the dominant hand. Visual information was provided by means of informative vision, with participants having full view of the test bar while the reference bar was blocked from their view (task VHP). To decrease the egocentric bias of the hands, participants also performed a visual haptic parallelity drawing task (task VHPD) using an arm-centered reference frame, by drawing the orientation of the reference bar. In all tasks, the distance between and orientation of the bars were manipulated. A significant effect of task was found: performance improved from task HP to VHP to VHPD and HM. Significant effects of distance were found in the first three tasks, whereas orientation and gender effects were only significant in tasks HP and VHP. The results showed that stimulating allocentric processing by means of informative vision and reducing the egocentric bias by using an arm-centered reference frame led to the most accurate performance on parallelity matching. © 2013 Elsevier B.V. All rights reserved.

  6. Robot-Assisted Proprioceptive Training with Added Vibro-Tactile Feedback Enhances Somatosensory and Motor Performance.

    PubMed

    Cuppone, Anna Vera; Squeri, Valentina; Semprini, Marianna; Masia, Lorenzo; Konczak, Jürgen

    2016-01-01

    This study examined the trainability of the proprioceptive sense and explored the relationship between proprioception and motor learning. With vision blocked, human learners had to perform goal-directed wrist movements relying solely on proprioceptive/haptic cues to reach several haptically specified targets. One group received additional somatosensory movement-error feedback in the form of vibro-tactile cues applied to the skin of the forearm. We used a haptic robotic device for the wrist and implemented a 3-day training regimen that required learners to make spatially precise goal-directed wrist reaching movements without vision. We assessed whether training improved the acuity of the wrist joint position sense. In addition, we checked whether sensory learning generalized to the motor domain and improved the spatial precision of untrained wrist tracking movements. The main findings of the study are: First, proprioceptive acuity of the wrist joint position sense improved after training for the group that received the combined proprioceptive/haptic and vibro-tactile feedback (VTF). Second, training had no impact on the spatial accuracy of the untrained tracking task; however, learners who had received VTF significantly reduced their reliance on haptic guidance feedback when performing the untrained motor task. That is, concurrent VTF was highly salient movement feedback and obviated the need for haptic feedback. Third, VTF can also be provided to the limb not involved in the task: learners who received VTF to the contralateral limb benefited equally. In conclusion, somatosensory training can significantly enhance proprioceptive acuity within days when learning is coupled with vibro-tactile sensory cues that provide feedback about movement errors. The observed sensory improvements in proprioception facilitate motor learning, and such learning may generalize to the sensorimotor control of untrained motor tasks.
The implications of these findings for neurorehabilitation are discussed.
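    The VTF in this study encodes movement error as vibro-tactile stimulation on the forearm. A minimal sketch of one way such an error-to-amplitude mapping could look; the deadband, gain and saturation values here are illustrative assumptions, not the study's actual parameters.

```python
def vibration_command(error_deg, deadband_deg=1.0, gain=0.2, max_amp=1.0):
    """Map a wrist movement error (degrees) to a vibration motor amplitude
    in [0, 1]: silent inside a small deadband, then linear in the error,
    saturating at the motor's maximum drive."""
    magnitude = abs(error_deg)
    if magnitude <= deadband_deg:
        return 0.0
    return min(gain * (magnitude - deadband_deg), max_amp)
```

    The deadband avoids constant buzzing for negligible errors, while saturation keeps large errors from exceeding the motor's safe drive level.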

  7. Using haptic feedback to increase seat belt use of service vehicle drivers.

    DOT National Transportation Integrated Search

    2011-01-01

    This study pilot-tested a new application of a technology-based intervention to increase seat belt use. The technology was based on a contingency in which unbelted drivers experienced sustained haptic feedback to the gas pedal when they exceeded 25...

  8. Haptic identification of objects and their depictions.

    PubMed

    Klatzky, R L; Loomis, J M; Lederman, S J; Wake, H; Fujita, N

    1993-08-01

    Haptic identification of real objects is superior to that of raised two-dimensional (2-D) depictions. Three explanations of real-object superiority were investigated: contribution of material information, contribution of 3-D shape and size, and greater potential for integration across the fingers. In Experiment 1, subjects, while wearing gloves that gently attenuated material information, haptically identified real objects that provided reduced cues to compliance, mass, and part motion. The gloves permitted exploration with free hand movement, a single outstretched finger, or five outstretched fingers. Performance decreased over these three conditions but was superior to identification of pictures of the same objects in all cases, indicating the contribution of 3-D structure and integration across the fingers. Picture performance was also better with five fingers than with one. In Experiment 2, the subjects wore open-fingered gloves, which provided them with material information. Consequently, the effect of type of exploration was substantially reduced but not eliminated. Material compensates somewhat for limited access to object structure but is not the primary basis for haptic object identification.

  9. When vision is not an option: children's integration of auditory and haptic information is suboptimal.

    PubMed

    Petrini, Karin; Remark, Alicia; Smith, Louise; Nardini, Marko

    2014-05-01

    When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g., an extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory information from childhood. In the first experiment, adults and children aged 5 to 11 years judged the relative sizes of two objects in auditory, haptic, and non-conflicting bimodal conditions. In the second experiment, different groups of adults and children were tested in non-conflicting and conflicting bimodal conditions. In the first experiment, adults reduced sensory uncertainty by integrating the cues optimally, while children did not. In the second, adults and children used similar weighting strategies to solve audio-haptic conflict. These results suggest that, in the absence of visual information, optimal integration of cues for discrimination of object size develops late in childhood. © 2014 The Authors. Developmental Science Published by John Wiley & Sons Ltd.
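    "Optimal" integration in this literature means weighting each cue by its inverse variance, which also makes the combined estimate less variable than either cue alone. A minimal sketch under these standard assumptions, for an auditory and a haptic size estimate:

```python
def mle_combine(est_a, var_a, est_h, var_h):
    """Minimum-variance (maximum-likelihood) fusion of two size estimates.

    Each cue is weighted by its inverse variance; the combined variance
    1 / (1/var_a + 1/var_h) is smaller than either single-cue variance.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_h)
    w_h = 1.0 - w_a
    combined = w_a * est_a + w_h * est_h
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_h)
    return combined, combined_var
```

    With equally reliable cues the result is a simple average with half the variance; as one cue becomes noisier, the estimate shifts toward the more reliable cue, which is the adult pattern the children in this study did not yet show.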

  10. Functional Equivalence of Spatial Images from Touch and Vision: Evidence from Spatial Updating in Blind and Sighted Individuals

    PubMed Central

    Giudice, Nicholas A.; Betty, Maryann R.; Loomis, Jack M.

    2012-01-01

    This research examines whether visual and haptic map learning yield functionally equivalent spatial images in working memory, as evidenced by similar encoding bias and updating performance. In three experiments, participants learned four-point routes either by seeing or feeling the maps. At test, blindfolded participants made spatial judgments about the maps from imagined perspectives that were either aligned or misaligned with the maps as represented in working memory. Results from Experiments 1 and 2 revealed a highly similar pattern of latencies and errors between visual and haptic conditions. These findings extend the well-known alignment biases for visual map learning to haptic map learning, provide further evidence of haptic updating, and, most importantly, show that learning from the two modalities yields very similar performance across all conditions. Experiment 3 found the same encoding biases and updating performance with blind individuals, demonstrating that functional equivalence cannot be due to visual recoding and is consistent with an amodal hypothesis of spatial images. PMID:21299331

  11. Design and Calibration of a New 6 DOF Haptic Device

    PubMed Central

    Qin, Huanhuan; Song, Aiguo; Liu, Yuqing; Jiang, Guohua; Zhou, Bohe

    2015-01-01

    For many applications, such as tele-operated robots and interactions with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces: they render forces to the operator while simultaneously tracking the operator's hand position. A new 6-DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly consists of a double parallel linkage, a rhombus linkage, a rotating mechanical structure and a grasping interface. Benefiting from this design, it is a hybrid-structure device with a large workspace and high output capability, and is therefore capable of multi-finger interactions. Moreover, with an adjustable base, operators can change posture without interrupting haptic tasks. To investigate position tracking accuracy and static output force, we conducted experiments on a three-dimensional electric sliding platform and with a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To demonstrate the capability and potential of the device, four application examples were programmed. PMID:26690449

  12. A haptic pedal for surgery assistance.

    PubMed

    Díaz, Iñaki; Gil, Jorge Juan; Louredo, Marcos

    2014-09-01

    The research and development of mechatronic aids for surgery is a persistent challenge in the field of robotic surgery. This paper presents a new haptic pedal conceived to assist surgeons in the operating room by transmitting real-time surgical information through the foot. An effective human-robot interaction system for medical practice must exchange appropriate information with the operator as quickly and accurately as possible. Moreover, information must flow through the appropriate sensory modalities for a natural and simple interaction. However, users of current robotic systems might experience cognitive overload and be increasingly overwhelmed by data streams from multiple modalities. A new haptic channel is thus explored to complement and improve existing systems. A preliminary set of experiments has been carried out to evaluate the performance of the proposed system in a virtual surgical drilling task. The results of the experiments show the effectiveness of the haptic pedal in providing surgical information through the foot. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Sensory subtraction in robot-assisted surgery: fingertip skin deformation feedback to ensure safety and improve transparency in bimanual haptic interaction.

    PubMed

    Meli, Leonardo; Pacchierotti, Claudio; Prattichizzo, Domenico

    2014-04-01

    This study presents a novel approach to force feedback in robot-assisted surgery. It consists of substituting haptic stimuli, composed of a kinesthetic component and a skin deformation, with cutaneous stimuli only. The force generated can then be thought of as the subtraction between the complete haptic interaction (cutaneous plus kinesthetic) and its kinesthetic part; for this reason, we refer to this approach as sensory subtraction. Sensory subtraction aims to outperform other nonkinesthetic feedback techniques in teleoperation (e.g., sensory substitution) while guaranteeing the stability and safety of the system. We tested the proposed approach in a challenging 7-DoF bimanual teleoperation task, similar to the Pegboard experiment of the da Vinci Skills Simulator. Sensory subtraction showed improved performance in terms of completion time, force exerted, and total displacement of the rings with respect to two popular sensory substitution techniques. Moreover, it guaranteed a stable interaction in the presence of a communication delay in the haptic loop.

  14. Influence of surgical gloves on haptic perception thresholds.

    PubMed

    Hatzfeld, Christian; Dorsch, Sarah; Neupert, Carsten; Kupnik, Mario

    2018-02-01

    Impairment of haptic perception by surgical gloves could reduce the requirements on haptic systems for surgery. While grip forces and manipulation capabilities were not impaired in previous studies, no data are available for perception thresholds. Absolute and differential thresholds (the latter at 20 dB above threshold) of 24 subjects were measured for frequencies of 25 and 250 Hz with a Ψ-method. The effects of wearing a surgical glove, moisture on the contact surface and the subject's experience with gloves were incorporated in a full-factorial experimental design. Absolute thresholds of 12.8 dB and -29.6 dB (means for 25 and 250 Hz, respectively) and differential thresholds of -12.6 dB and -9.5 dB agree with previous studies. A relevant effect of frequency on absolute thresholds was found. Comparisons of the glove and no-glove conditions did not reveal a significant mean difference: wearing a single surgical glove does not affect absolute or differential haptic perception thresholds. Copyright © 2017 John Wiley & Sons, Ltd.
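    The thresholds above are reported in dB, with differential thresholds measured at a 20 dB sensation level. A small sketch of the underlying amplitude-level conversion; the reference amplitude of 1 µm is a common convention in the vibrotactile literature but an assumption here, since the abstract does not state the reference used.

```python
import math

def amplitude_to_db(amplitude_um, ref_um=1.0):
    """Express a vibration displacement amplitude as a level in dB
    relative to a reference amplitude (assumed dB re 1 micrometre)."""
    return 20.0 * math.log10(amplitude_um / ref_um)

def db_to_amplitude(level_db, ref_um=1.0):
    """Inverse conversion: level in dB back to a displacement amplitude."""
    return ref_um * 10.0 ** (level_db / 20.0)
```

    Under this convention, the reported 250 Hz absolute threshold of -29.6 dB corresponds to a displacement amplitude of roughly 0.03 µm, consistent with the high sensitivity of the Pacinian channel at that frequency.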

  15. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    PubMed Central

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-01-01

    In this work we present the design process, characterization and testing of a novel three-axis mechanical force sensor. The sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular, it has been conceived for integration with a dual-finger haptic interface that aims at simulating the forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed to match the force and layout specifications of the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and an integrated test on a haptic hand exoskeleton, showing the improvement in controller performance provided by the inclusion of the force sensor. PMID:23202012
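    A characteristic matrix as mentioned above maps the applied force vector to the transducer outputs; recovering forces at run time amounts to inverting that map. A minimal sketch with a hypothetical 3x3 characteristic matrix (the values are invented for illustration and are not the paper's calibration data):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(c, v):
    """Solve the 3x3 linear system c @ f = v for f by Cramer's rule."""
    d = det3(c)
    f = []
    for j in range(3):
        cj = [[v[i] if k == j else c[i][k] for k in range(3)]
              for i in range(3)]
        f.append(det3(cj) / d)
    return f

# Hypothetical characteristic matrix: mostly diagonal, with small
# cross-axis coupling terms as a calibration would typically reveal.
C = [[2.0, 0.1, 0.0],
     [0.0, 1.8, 0.1],
     [0.1, 0.0, 2.2]]

def forces_from_voltages(v):
    """Recover the applied force vector from the three sensor outputs."""
    return solve3(C, v)
```

    In practice the matrix would be identified by least squares from many known loads, and the off-diagonal terms quantify the cross-axis coupling of the spring structure.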

  16. Development of a novel haptic glove for improving finger dexterity in poststroke rehabilitation.

    PubMed

    Lin, Chi-Ying; Tsai, Chia-Min; Shih, Pei-Cheng; Wu, Hsiao-Ching

    2015-01-01

    Almost all stroke patients experience some degree of fine motor impairment, and impeded finger movement may limit activities of daily life. Thus, to improve the quality of life of stroke patients, designing an efficient training device for fine motor rehabilitation is crucial. This study aimed to develop a novel fine motor training glove that integrates a virtual-reality-based interactive environment with vibrotactile feedback for more effective poststroke hand rehabilitation. The proposed haptic rehabilitation device is equipped with small DC vibration motors for vibrotactile feedback and piezoresistive thin-film force sensors for motor function evaluation. Two virtual-reality-based games, "gopher hitting" and "musical note hitting", were developed as the haptic interface. Following the designed rehabilitation program, patients intuitively press and exercise their fingers to improve finger isolation. Preliminary tests were conducted to assess the feasibility of the developed haptic rehabilitation system and to identify design concerns for practical use in future clinical testing.

  17. The Effect of Visual Experience on Perceived Haptic Verticality When Tilted in the Roll Plane

    PubMed Central

    Cuturi, Luigi F.; Gori, Monica

    2017-01-01

    The orientation of the body in space can influence perception of verticality leading sometimes to biases consistent with priors peaked at the most common head and body orientation, that is upright. In this study, we investigate haptic perception of verticality in sighted individuals and early and late blind adults when tilted counterclockwise in the roll plane. Participants were asked to perform a stimulus orientation discrimination task with their body tilted to their left ear side 90° relative to gravity. Stimuli were presented by using a motorized haptic bar. In order to test whether different reference frames relative to the head influenced perception of verticality, we varied the position of the stimulus on the body longitudinal axis. Depending on the stimulus position sighted participants tended to have biases away or toward their body tilt. Visually impaired individuals instead show a different pattern of verticality estimations. A bias toward head and body tilt (i.e., Aubert effect) was observed in late blind individuals. Interestingly, no strong biases were observed in early blind individuals. Overall, these results posit visual sensory information to be fundamental in influencing the haptic readout of proprioceptive and vestibular information about body orientation relative to gravity. The acquisition of an idiotropic vector signaling the upright might take place through vision during development. Regarding early blind individuals, independent spatial navigation experience likely enhanced by echolocation behavior might have a role in such acquisition. In the case of participants with late onset blindness, early experience of vision might lead them to anchor their visually acquired priors to the haptic modality with no disambiguation between head and body references as observed in sighted individuals (Fraser et al., 2015). 
    With our study, we aim to investigate haptic perception of the direction of gravity during unusual body tilts when vision is absent due to visual impairment. Our findings thus shed light on the influence of proprioceptive/vestibular sensory information on haptically perceived verticality in blind individuals, showing how this phenomenon is affected by visual experience. PMID:29270109

  18. Does the Integration of Haptic and Visual Cues Reduce the Effect of a Biased Visual Reference Frame on the Subjective Head Orientation?

    PubMed Central

    Gueguen, Marc; Vuillerme, Nicolas; Isableu, Brice

    2012-01-01

    Background The selection of appropriate frames of reference (FOR) is a key factor in the elaboration of spatial perception and the production of robust interaction with our environment. The extent to which we perceive the head axis orientation (subjective head orientation, SHO) with both accuracy and precision likely contributes to the efficiency of these spatial interactions. A first goal of this study was to investigate the relative contribution of both the visual and egocentric FOR (centre-of-mass) in the SHO processing. A second goal was to investigate humans' ability to process SHO in various sensory response modalities (visual, haptic and visuo-haptic), and the way they modify the reliance to either the visual or egocentric FORs. A third goal was to question whether subjects combined visual and haptic cues optimally to increase SHO certainty and to decrease the FORs disruption effect. Methodology/Principal Findings Thirteen subjects were asked to indicate their SHO while the visual and/or egocentric FORs were deviated. Four results emerged from our study. First, visual rod settings to SHO were altered by the tilted visual frame but not by the egocentric FOR alteration, whereas no haptic settings alteration was observed whether due to the egocentric FOR alteration or the tilted visual frame. These results are modulated by individual analysis. Second, visual and egocentric FOR dependency appear to be negatively correlated. Third, the response modality enrichment appears to improve SHO. Fourth, several combination rules of the visuo-haptic cues such as the Maximum Likelihood Estimation (MLE), Winner-Take-All (WTA) or Unweighted Mean (UWM) rule seem to account for SHO improvements. However, the UWM rule seems to best account for the improvement of visuo-haptic estimates, especially in situations with high FOR incongruence. Finally, the data also indicated that FOR reliance resulted from the application of UWM rule. 
    This was observed particularly in the visual-dependent subject. Conclusions: Taken together, these findings emphasize the importance of identifying individual spatial FOR preferences when assessing the efficiency of our interaction with the environment whilst performing spatial tasks. PMID:22509295
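    The three combination rules compared in this study can be stated compactly for two cues. A minimal sketch of MLE, WTA and UWM; the example values in the test are illustrative, not data from the experiment.

```python
def mle(est_v, var_v, est_h, var_h):
    """Maximum Likelihood Estimation: inverse-variance weighted mean
    of the visual and haptic estimates."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
    return w_v * est_v + (1.0 - w_v) * est_h

def wta(est_v, var_v, est_h, var_h):
    """Winner-Take-All: keep only the more reliable (lower-variance) cue."""
    return est_v if var_v <= var_h else est_h

def uwm(est_v, est_h):
    """Unweighted Mean: simple average, ignoring cue reliabilities."""
    return 0.5 * (est_v + est_h)
```

    The rules only differ when the cues disagree: MLE pulls the estimate toward the more reliable cue in proportion to its reliability, WTA discards the noisier cue entirely, and UWM splits the difference regardless of reliability, which is the rule the study found best accounted for the visuo-haptic improvements.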

  19. Haptic-assistive technologies for audition and vision sensory disabilities.

    PubMed

    Sorgini, Francesca; Caliò, Renato; Carrozza, Maria Chiara; Oddo, Calogero Maria

    2018-05-01

    The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf-blind individuals. The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The database search for scientific publications was accompanied by a web search for commercial devices. Results were classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses were also carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory-disabled persons. The reviewed literature provides evidence that sensory substitution aids can partly mitigate the deficits in language learning, communication and navigation for deaf, blind and deaf-blind individuals, and that the tactile sense can serve as a communication channel for conveying information to sensory-disabled individuals. A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move toward miniaturized, custom-designed and low-cost haptic interfaces, and toward integration with personal devices such as smartphones, for a wider diffusion of sensory aids among disabled people. Implications for rehabilitation Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. Sensory substitution systems for visual and hearing disabilities play a central role in the transmission of information to patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. Visual and auditory inputs are converted into haptic feedback via different actuation technologies, and the information is presented as static or dynamic stimulation of the skin. 
    Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability. They constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems. Future research is oriented toward optimizing the stimulation parameters and developing miniaturized, custom-designed and low-cost aids operating in synergy in networks, aiming to increase patients' acceptance of these technologies.

  20. Predictability, Force and (Anti-)Resonance in Complex Object Control.

    PubMed

    Maurice, Pauline; Hogan, Neville; Sternad, Dagmar

    2018-04-18

Manipulation of complex objects as in tool use is ubiquitous and has given humans an evolutionary advantage. This study examined the strategies humans choose when manipulating an object with underactuated internal dynamics, such as a cup of coffee. The object's dynamics renders the temporal evolution complex, possibly even chaotic, and difficult to predict. A cart-and-pendulum model, loosely mimicking coffee sloshing in a cup, was implemented in a virtual environment with a haptic interface. Participants rhythmically manipulated the virtual cup containing a rolling ball; they could choose the oscillation frequency, while the amplitude was prescribed. Three hypotheses were tested: 1) humans decrease interaction forces between hand and object; 2) humans increase the predictability of the object dynamics; 3) humans exploit the resonances of the coupled object-hand system. Analysis revealed that humans chose either a high-frequency strategy with anti-phase cup-and-ball movements or a low-frequency strategy with in-phase cup-and-ball movements. Counter to Hypothesis 1, they did not decrease interaction force; instead, they increased the predictability of the interaction dynamics, quantified by mutual information, supporting Hypothesis 2. To address Hypothesis 3, frequency analysis of the coupled hand-object system revealed two resonance frequencies separated by an anti-resonance frequency. The low-frequency strategy exploited one resonance, while the high-frequency strategy afforded more choice, consistent with the frequency response of the coupled system; both strategies avoided the anti-resonance. Hence, humans did not prioritize interaction force, but rather strategies that rendered interactions predictable. These findings highlight that physical interactions with complex objects pose control challenges not present in unconstrained movements.
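
    The cart-and-pendulum dynamics at the heart of this study are easy to sketch. The toy below drives a pendulum (the rolling ball) inside a cart whose position is prescribed sinusoidally and reports the peak ball angle; every parameter value, and the added viscous damping, is an illustrative assumption rather than the study's protocol, but it shows how the response depends on where the driving frequency sits relative to resonance.

```python
import math

def simulate_cart_pendulum(freq_hz, amp=0.01, length=0.05, g=9.81,
                           damping=0.5, dt=1e-3, t_end=5.0):
    """Peak angle (rad) of a pendulum ('ball') hanging in a cart whose
    position is prescribed as x_c(t) = amp * sin(2*pi*freq_hz*t).
    All parameter values are illustrative, not the study's."""
    w = 2.0 * math.pi * freq_hz
    theta, theta_dot, peak, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        cart_acc = -amp * w * w * math.sin(w * t)
        # pendulum in an accelerating frame, with viscous damping:
        theta_acc = (-(g / length) * math.sin(theta)
                     - (cart_acc / length) * math.cos(theta)
                     - damping * theta_dot)
        theta_dot += theta_acc * dt      # semi-implicit Euler
        theta += theta_dot * dt
        peak = max(peak, abs(theta))
        t += dt
    return peak
```

    Driving near the pendulum's natural frequency (here about 2.2 Hz) produces far larger ball excursions than driving well below it, which is the frequency structure the participants' strategies had to negotiate.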

  1. Learning from vision-to-touch is different than learning from touch-to-vision.

    PubMed

    Wismeijer, Dagmar A; Gegenfurtner, Karl R; Drewing, Knut

    2012-01-01

We studied whether vision can teach touch to the same extent as touch seems to teach vision. In a 2 × 2 between-participants learning study, we artificially correlated visual gloss cues with haptic compliance cues. In two "natural" tasks, we tested whether visual gloss estimations have an influence on haptic estimations of softness and vice versa. In two "novel" tasks, in which participants were either asked to haptically judge glossiness or to visually judge softness, we investigated how perceptual estimates transfer from one sense to the other. Our results showed that vision does not teach touch as efficiently as touch seems to teach vision.

  2. Fiber optical sensor system for shape and haptics for flexible instruments in minimally invasive surgery: overview and status quo

    NASA Astrophysics Data System (ADS)

    Ledermann, Christoph; Pauer, Hendrikje; Woern, Heinz

    2014-05-01

In minimally invasive surgery, flexible mechatronic instruments promise to improve the overall performance of surgical interventions. However, those instruments require highly developed sensors in order to provide haptic feedback to the surgeon or to enable (semi-)autonomous tasks. Precisely, haptic sensors and a shape sensor are required. In this paper, we present our fiber optical sensor system of Fiber Bragg Gratings, which consists of a shape sensor, a kinesthetic sensor and a tactile sensor. The status quo of each of the three sensors is described, as well as the concept to integrate them into one fiber optical sensor system.

  3. Sensorimotor enhancement with a mixed reality system for balance and mobility rehabilitation.

    PubMed

    Fung, Joyce; Perez, Claire F

    2011-01-01

We have developed a mixed reality system incorporating virtual reality (VR), surface perturbations and light touch for gait rehabilitation. Haptic touch has emerged as a novel and efficient technique to improve postural control and dynamic stability. Our system combines visual display with the manipulation of physical environments and the addition of haptic feedback to enhance balance and mobility post stroke. A research study involving 9 participants with stroke and 9 age-matched healthy individuals shows that the haptic cue provided while walking is an effective means of improving gait stability in people post stroke, especially during challenging environmental conditions such as downslope walking.

  4. 78 FR 23593 - Certain Mobile Electronic Devices Incorporating Haptics; Termination of Investigation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-19

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-834] Certain Mobile Electronic Devices... this investigation may be viewed on the Commission's electronic docket (EDIS) at http://edis.usitc.gov... mobile electronic devices incorporating haptics that infringe certain claims of six Immersion patents. 77...

  5. Real-time simulation of soft tissue deformation and electrocautery procedures in laparoscopic rectal cancer radical surgery.

    PubMed

    Sui, Yuan; Pan, Jun J; Qin, Hong; Liu, Hao; Lu, Yun

    2017-12-01

Laparoscopic surgery (LS), also referred to as minimally invasive surgery, is a modern surgical technique which is widely applied. The fulcrum effect makes LS a non-intuitive motor skill with a steep learning curve. A hybrid model of tetrahedrons and a multi-layer triangular mesh are constructed to simulate the deformable behavior of the rectum and surrounding tissues in the Position-Based Dynamics (PBD) framework. A heat-conduction based electric-burn technique is employed to simulate the electrocautery procedure. The simulator has been applied for laparoscopic rectal cancer surgery training. From the experimental results, trainees can operate in real time with high degrees of stability and fidelity. A preliminary study was performed to evaluate the simulator's realism and usefulness. This prototype simulator has been tested and verified by colorectal surgeons through a pilot study. They believed both the visual and the haptic performance of the simulation are realistic and helpful to enhance laparoscopic skills. Copyright © 2017 John Wiley & Sons, Ltd.
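
    The Position-Based Dynamics framework mentioned above resolves deformation by iteratively projecting geometric constraints directly onto particle positions. A minimal sketch of that core loop, restricted to distance constraints (a surgical simulator like this one adds volume, bending and collision constraints plus the velocity update):

```python
import math

def project_distance_constraints(pos, inv_mass, constraints, iterations=20):
    """Gauss-Seidel projection of distance constraints, the core solver
    loop of Position-Based Dynamics. `pos` is a list of [x, y, z]
    points, `constraints` a list of (i, j, rest_length) tuples.
    Illustrative sketch only."""
    for _ in range(iterations):
        for i, j, rest in constraints:
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            d = math.sqrt(sum(c * c for c in dx)) or 1e-12
            w = inv_mass[i] + inv_mass[j]
            if w == 0.0:
                continue  # both endpoints pinned
            corr = (d - rest) / (d * w)   # scalar correction factor
            for k in range(3):
                pos[i][k] += inv_mass[i] * corr * dx[k]
                pos[j][k] -= inv_mass[j] * corr * dx[k]
    return pos
```

    Because corrections act on positions rather than forces, the solver stays stable at the interactive rates the abstract reports, which is why PBD is a common choice for real-time tissue models.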

  6. On the design of a miniature haptic ring for cutaneous force feedback using shape memory alloy actuators

    NASA Astrophysics Data System (ADS)

    Hwang, Donghyun; Lee, Jaemin; Kim, Keehoon

    2017-10-01

This paper proposes a miniature haptic ring that can display touch/pressure and shearing force to the user's fingerpad. For practical use and wider application, the device is designed for high wearability and mobility/portability as well as cutaneous force feedback functionality. The main body of the device is a ring-shaped lightweight structure with a simple driving mechanism, and thin shape memory alloy (SMA) wires with high energy density serve as the actuating elements. Combined with a band-type wireless control unit that includes a wireless data communication module, the whole device can be realized as a wearable mobile haptic system. These features give the device diverse advantages in functional performance and provide users with significant usability. In this work, the proposed miniature haptic ring is systematically designed, and its performance is experimentally evaluated with a fabricated functional prototype. The experimental results demonstrate that the proposed device exhibits a higher force-to-weight ratio than conventional finger-wearable haptic devices for cutaneous force feedback. They also show that the operational performance of the device is strongly influenced by the electro-thermomechanical behavior of the SMA actuator. In addition to the performance-evaluation experiments, we conducted a preliminary user test to assess practical feasibility and usability based on users' qualitative feedback.

  7. Vibrotactile perception assessment for a haptic interface on an antigravity suit.

    PubMed

    Ko, Sang Min; Lee, Kwangil; Kim, Daeho; Ji, Yong Gu

    2017-01-01

    Haptic technology is used in various fields to transmit information to the user with or without visual and auditory cues. This study aimed to provide preliminary data for use in developing a haptic interface for an antigravity (anti-G) suit. With the structural characteristics of the anti-G suit in mind, we determined five areas on the body (lower back, outer thighs, inner thighs, outer calves, and inner calves) on which to install ten bar-type eccentric rotating mass (ERM) motors as vibration actuators. To determine the design factors of the haptic anti-G suit, we conducted three experiments to find the absolute threshold, moderate intensity, and subjective assessments of vibrotactile stimuli. Twenty-six fighter pilots participated in the experiments, which were conducted in a fixed-based flight simulator. From the results of our study, we recommend 1) absolute thresholds of ∼11.98-15.84 Hz and 102.01-104.06 dB, 2) moderate intensities of 74.36 Hz and 126.98 dB for the lower back and 58.65 Hz and 122.37 dB for either side of the thighs and calves, and 3) subjective assessments of vibrotactile stimuli (displeasure, easy to perceive, and level of comfort). The results of this study will be useful for the design of a haptic anti-G suit. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle in a standardized manner. A mathematical combinatorial approach is proposed to join these results together as a global assessment. With the use of haptic-based simulation learning, exercises for tooth preparation assessing enamel and dentine were compared to plastic teeth in manikins. Equivalence for student performance for haptic versus traditional preparation methods was established, thus establishing the validity of the haptic solution for performing these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.

  9. Teaching bovine abdominal anatomy: use of a haptic simulator.

    PubMed

    Kinnison, Tierney; Forrest, Neil David; Frean, Stephen Philip; Baillie, Sarah

    2009-01-01

    Traditional methods of teaching anatomy to undergraduate medical and veterinary students are being challenged and need to adapt to modern concerns and requirements. There is a move away from the use of cadavers to new technologies as a way of complementing the traditional approaches and addressing resource and ethical problems. Haptic (touch) technology, which allows the student to feel a 3D computer-generated virtual environment, provides a novel way to address some of these challenges. To evaluate the practicalities and usefulness of a haptic simulator, first year veterinary students at the Royal Veterinary College, University of London, were taught basic bovine abdominal anatomy using a rectal palpation simulator: "The Haptic Cow." Over two days, 186 students were taught in small groups and 184 provided feedback via a questionnaire. The results were positive; the majority of students considered that the simulator had been useful for appreciating both the feel and location of key internal anatomical structures, had helped with their understanding of bovine abdominal anatomy and 3D visualization, and the tutorial had been enjoyable. The students were mostly in favor of the small group tutorial format, but some requested more time on the simulator. The findings indicate that the haptic simulator is an engaging way of teaching bovine abdominal anatomy to a large number of students in an efficient manner without using cadavers, thereby addressing some of the current challenges in anatomy teaching.

  10. Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart

    PubMed Central

    Kesner, Samuel B.; Howe, Robert D.

    2011-01-01

Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body, owing to tissue motion and the transmission limitations of the catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system. PMID:25285321

  11. Assisting Movement Training and Execution With Visual and Haptic Feedback.

    PubMed

    Ewerton, Marco; Rother, David; Weimar, Jakob; Kollegger, Gerrit; Wiemeyer, Josef; Peters, Jan; Maeda, Guilherme

    2018-01-01

    In the practice of motor skills in general, errors in the execution of movements may go unnoticed when a human instructor is not available. In this case, a computer system or robotic device able to detect movement errors and propose corrections would be of great help. This paper addresses the problem of how to detect such execution errors and how to provide feedback to the human to correct his/her motor skill using a general, principled methodology based on imitation learning. The core idea is to compare the observed skill with a probabilistic model learned from expert demonstrations. The intensity of the feedback is regulated by the likelihood of the model given the observed skill. Based on demonstrations, our system can, for example, detect errors in the writing of characters with multiple strokes. Moreover, by using a haptic device, the Haption Virtuose 6D, we demonstrate a method to generate haptic feedback based on a distribution over trajectories, which could be used as an auxiliary means of communication between an instructor and an apprentice. Additionally, given a performance measurement, the haptic device can help the human discover and perform better movements to solve a given task. In this case, the human first tries a few times to solve the task without assistance. Our framework, in turn, uses a reinforcement learning algorithm to compute haptic feedback, which guides the human toward better solutions.
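
    The likelihood-regulated feedback idea can be illustrated with a one-dimensional Gaussian per-sample model: the further the observed movement lies from the expert demonstrations, the lower the likelihood and the stronger the corrective feedback. The scaling rule below is an assumed illustration, not the paper's exact law:

```python
import math

def feedback_gain(observed, mean, std, max_gain=1.0):
    """Feedback intensity regulated by the likelihood of the learned
    model given the observed skill: low likelihood (large deviation
    from the expert demonstrations) -> strong corrective feedback.
    Per-sample Gaussian model; the 1 - likelihood scaling is an
    illustrative assumption."""
    z = (observed - mean) / std
    likelihood = math.exp(-0.5 * z * z)   # unnormalized Gaussian
    return max_gain * (1.0 - likelihood)
```

    A movement matching the demonstration mean receives no feedback, while the gain grows monotonically with the deviation, saturating at `max_gain`.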

  12. The role of visuohaptic experience in visually perceived depth.

    PubMed

    Ho, Yun-Xian; Serwe, Sascha; Trommershäuser, Julia; Maloney, Laurence T; Landy, Michael S

    2009-06-01

    Berkeley suggested that "touch educates vision," that is, haptic input may be used to calibrate visual cues to improve visual estimation of properties of the world. Here, we test whether haptic input may be used to "miseducate" vision, causing observers to rely more heavily on misleading visual cues. Human subjects compared the depth of two cylindrical bumps illuminated by light sources located at different positions relative to the surface. As in previous work using judgments of surface roughness, we find that observers judge bumps to have greater depth when the light source is located eccentric to the surface normal (i.e., when shadows are more salient). Following several sessions of visual judgments of depth, subjects then underwent visuohaptic training in which haptic feedback was artificially correlated with the "pseudocue" of shadow size and artificially decorrelated with disparity and texture. Although there were large individual differences, almost all observers demonstrated integration of haptic cues during visuohaptic training. For some observers, subsequent visual judgments of bump depth were unaffected by the training. However, for 5 of 12 observers, training significantly increased the weight given to pseudocues, causing subsequent visual estimates of shape to be less veridical. We conclude that haptic information can be used to reweight visual cues, putting more weight on misleading pseudocues, even when more trustworthy visual cues are available in the scene.
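
    Cue reweighting of this kind is usually discussed against the minimum-variance (maximum-likelihood) baseline for combining independent Gaussian cues, where each cue is weighted by its inverse variance. A sketch under that standard assumption:

```python
def combine_cues(estimates, sigmas):
    """Minimum-variance linear fusion of independent Gaussian cues:
    weight each estimate by its inverse variance (reliability).
    Returns the fused estimate, its standard deviation, and the
    normalized weights."""
    reliabilities = [1.0 / (s * s) for s in sigmas]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = sum(w * e for w, e in zip(weights, estimates))
    return fused, (1.0 / total) ** 0.5, weights
```

    In this framing, visuohaptic training that correlates touch with the shadow-size pseudocue effectively raises that cue's apparent reliability, increasing its weight in subsequent visual judgments even though more trustworthy cues remain available.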

  13. Shared control of a medical robot with haptic guidance.

    PubMed

    Xiong, Linfei; Chng, Chin Boon; Chui, Chee Kong; Yu, Peiwu; Li, Yao

    2017-01-01

Tele-operation of robotic surgery reduces radiation exposure during interventional radiological operations. However, endoscope vision without force feedback on the surgical tool increases the difficulty of precise manipulation and the risk of tissue damage. The shared control of vision and force provides a novel approach to enhanced control with haptic guidance, which could lead to subtle dexterity and better maneuverability during minimally invasive surgery. The paper provides an innovative shared control method for a robotic minimally invasive surgery system, in which vision and haptic feedback are incorporated to provide guidance cues to the clinician during surgery. The incremental potential field (IPF) method is utilized to generate a guidance path based on the anatomy of the tissue and surgical tool interaction. Haptic guidance is provided at the master end to assist the clinician during tele-operated surgical robotic tasks. The approach has been validated with path following and virtual tumor targeting experiments. The experimental results demonstrate that, compared with vision-only guidance, shared control with vision and haptics improved the accuracy and efficiency of surgical robotic manipulation, reducing the tool-position error distance and execution time. The validation experiments demonstrate that the shared control approach can help the surgical robot system provide stable assistance and precise performance in executing the designated surgical task. The methodology could also be implemented on other surgical robots with different surgical tools and applications.
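
    The abstract does not detail the incremental potential field (IPF) method itself; as a generic stand-in, the classic Khatib-style attractive/repulsive potential field below shows how a guidance force can pull the tool toward a target while pushing it away from delicate tissue (the function name, gains and radii are all illustrative assumptions):

```python
import math

def guidance_force(tool, goal, obstacles, k_att=1.0, k_rep=0.05, rho0=0.2):
    """2D guidance force: linear attraction toward `goal`, plus a
    repulsive term for every obstacle closer than the influence
    radius rho0 (classic attractive/repulsive potential field)."""
    fx = k_att * (goal[0] - tool[0])
    fy = k_att * (goal[1] - tool[1])
    for ox, oy in obstacles:
        dx, dy = tool[0] - ox, tool[1] - oy
        rho = math.hypot(dx, dy)
        if 0.0 < rho < rho0:
            # gradient of (k_rep/2)*(1/rho - 1/rho0)^2, pointing away
            mag = k_rep * (1.0 / rho - 1.0 / rho0) / (rho * rho)
            fx += mag * dx / rho
            fy += mag * dy / rho
    return fx, fy
```

    Rendered at the master device, such a force field is felt as a groove toward the planned path and a wall near protected anatomy.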

  14. When Neuroscience 'Touches' Architecture: From Hapticity to a Supramodal Functioning of the Human Brain.

    PubMed

    Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C; Pietrini, Pietro; Ricciardi, Emiliano

    2016-01-01

In the last decades, the rapid growth of functional brain imaging methodologies allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting 'neuro-architecture' as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. Specifically, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people-environment relationships, and even provide empirical foundations for a renewed evidence-based design theory.

  15. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
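
    Among the Eulerian grid approaches surveyed, the scheme that made real-time fluid animation practical is semi-Lagrangian advection (popularized by Stam's "Stable Fluids"): each cell traces a virtual particle backward through the velocity field and interpolates, which keeps the solver stable at large time steps. A 1D periodic-grid sketch:

```python
import math

def advect_1d(field, velocity, dt, dx):
    """Semi-Lagrangian advection on a 1D periodic grid: for each cell,
    trace the particle that lands there backward through the flow and
    linearly interpolate the field at its departure point."""
    n = len(field)
    out = []
    for i in range(n):
        x = i - velocity[i] * dt / dx     # backtraced departure point
        i0 = math.floor(x)
        frac = x - i0
        a = field[i0 % n]
        b = field[(i0 + 1) % n]
        out.append((1.0 - frac) * a + frac * b)
    return out
```

    Because the departure point is interpolated rather than differenced, the scheme cannot blow up at large time steps, at the price of numerical dissipation, a trade that favors visual plausibility and frame rate exactly as the abstract describes.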

  16. 77 FR 49458 - Certain Mobile Electronic Devices Incorporating Haptics; Amendment of the Complaint and Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-16

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-834] Certain Mobile Electronic Devices.... 1337 in the importation, sale for importation, and sale within the United States after importation of certain mobile electronic devices incorporating haptics, by reason of the infringement of claims of six...

  17. Pattern Perception and Pictures for the Blind

    ERIC Educational Resources Information Center

    Heller, Morton A.; McCarthy, Melissa; Clark, Ashley

    2005-01-01

    This article reviews recent research on perception of tangible pictures in sighted and blind people. Haptic picture naming accuracy is dependent upon familiarity and access to semantic memory, just as in visual recognition. Performance is high when haptic picture recognition tasks do not depend upon semantic memory. Viewpoint matters for the ease…

  18. The Use of Haptic Display Technology in Education

    ERIC Educational Resources Information Center

    Barfield, Woodrow

    2009-01-01

    The experience of "virtual reality" can consist of head-tracked and stereoscopic virtual worlds, spatialized sound, haptic feedback, and to a lesser extent olfactory cues. Although virtual reality systems have been proposed for numerous applications, the field of education is one particular application that seems well-suited for virtual…

  19. Mediating Haptic Exploratory Strategies in Children Who Have Visual Impairment and Intellectual Disabilities

    ERIC Educational Resources Information Center

    McLinden, M.

    2012-01-01

    This article provides a synthesis of literature pertaining to the development of haptic exploratory strategies in children who have visual impairment and intellectual disabilities. The information received through such strategies assumes particular significance for these children, given the restricted information available through their visual…

  20. Do Haptic Representations Help Complex Molecular Learning?

    ERIC Educational Resources Information Center

    Bivall, Petter; Ainsworth, Shaaron; Tibell, Lena A. E.

    2011-01-01

    This study explored whether adding a haptic interface (that provides users with somatosensory information about virtual objects by force and tactile feedback) to a three-dimensional (3D) chemical model enhanced students' understanding of complex molecular interactions. Two modes of the model were compared in a between-groups pre- and posttest…

  1. A discrete mechanics framework for real time virtual surgical simulations with application to virtual laparoscopic nephrectomy.

    PubMed

    Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert

    2009-01-01

The inability to render realistic soft-tissue behavior in real time has remained a barrier to the face and content aspects of validity for many virtual reality surgical training systems. Biophysically based models are suitable not only for training purposes but also for patient-specific clinical applications, physiological modeling and surgical planning. Among existing approaches to modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) approach lacks biophysically realistic soft-tissue dynamic behavior; and finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equation with an embedded material constitutive relation, resulting in a discrete mechanics framework that strikes a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of the discrete mechanics framework with focused attention toward a virtual laparoscopic nephrectomy application.
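
    For contrast with the proposed discrete mechanics framework, the mass-spring model (MSM) baseline the abstract criticizes is only a few lines. This sketch integrates a single damped spring with semi-implicit (symplectic) Euler; the constants are illustrative, not taken from the paper:

```python
def step_mass_spring(x, v, k=100.0, c=1.0, m=1.0, dt=1e-3):
    """One semi-implicit (symplectic) Euler step of a damped spring,
    the kind of mass-spring model (MSM) cited as the cheap but less
    biophysically faithful baseline."""
    a = (-k * x - c * v) / m   # Hooke restoring force plus viscous damping
    v = v + a * dt
    x = x + v * dt
    return x, v

def settle(x0, steps=5000, dt=1e-3):
    """Integrate from rest at displacement x0; return final (x, v)."""
    x, v = x0, 0.0
    for _ in range(steps):
        x, v = step_mass_spring(x, v, dt=dt)
    return x, v
```

    The model is fast enough for haptic update rates, but its stiffness and damping constants are tuning knobs with no direct tie to a material constitutive relation, which is precisely the gap the discrete mechanics framework targets.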

  2. Application of cellular automata approach for cloud simulation and rendering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Immanuel, W.; Paul Mary Deborrah, S.; Samuel Selvaraj, R.

Current techniques for creating clouds in games and other real-time applications produce static, homogeneous clouds that lack the organic feel of clouds in nature. Clouds simulated with the cellular automata approach, when viewed over a period of time, are able to deform their initial shape and move in a more organic and dynamic way, and the approach should extend to creating even more cloud shapes in real time under additional forces. Clouds are an essential part of any computer model of a landscape or animation of an outdoor scene, and realistic cloud animation is also important for flight simulators, movies, games, and other applications. Our goal was to create a realistic animation of clouds.
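
    Cellular-automaton cloud simulation in graphics typically follows the boolean rules of Nagel and Raschke (extended for rendering by Dobashi et al.): each cell carries humidity (hum), phase-transition activation (act) and cloud (cld) bits, and cloud forms where activation spreads into humid cells. A minimal 2D sketch that omits the stochastic humidity-supply and extinction terms of the full model, and may or may not match this paper's exact rules:

```python
def step_cloud_ca(hum, act, cld):
    """One boolean update (Nagel-Raschke style):
      hum: water vapor present   act: phase transition ready
      cld: cloud visible
    Humidity is consumed where act fires, act spreads into humid cells
    adjacent to an active cell, and cld accumulates wherever act fired."""
    h, w = len(hum), len(hum[0])

    def has_active_neighbor(i, j):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and act[ni][nj]:
                return True
        return False

    new_hum = [[bool(hum[i][j]) and not act[i][j] for j in range(w)]
               for i in range(h)]
    new_cld = [[bool(cld[i][j]) or bool(act[i][j]) for j in range(w)]
               for i in range(h)]
    new_act = [[not act[i][j] and bool(hum[i][j]) and has_active_neighbor(i, j)
                for j in range(w)] for i in range(h)]
    return new_hum, new_act, new_cld
```

    Iterating these bit rules and smoothing the cld field for rendering yields clouds that grow and deform over time, the organic behavior the abstract contrasts with static billboards.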

  3. Audio Haptic Videogaming for Developing Wayfinding Skills in Learners Who are Blind

    PubMed Central

    Sánchez, Jaime; de Borba Campos, Marcia; Espinoza, Matías; Merabet, Lotfi B.

    2014-01-01

    Interactive digital technologies are currently being developed as a novel tool for education and skill development. Audiopolis is an audio and haptic based videogame designed for developing orientation and mobility (O&M) skills in people who are blind. We have evaluated the cognitive impact of videogame play on O&M skills by assessing performance on a series of behavioral tasks carried out in both indoor and outdoor virtual spaces. Our results demonstrate that the use of Audiopolis had a positive impact on the development and use of O&M skills in school-aged learners who are blind. The impact of audio and haptic information on learning is also discussed. PMID:25485312

  4. Social Touch Technology: A Survey of Haptic Technology for Social Touch.

    PubMed

    Huisman, Gijs

    2017-01-01

This survey provides an overview of work on haptic technology for social touch. Social touch has been studied extensively in psychology and neuroscience. With the development of new technologies, it is now possible to engage in social touch at a distance or engage in social touch with artificial social agents. Social touch research has inspired research into technology mediated social touch, and this line of research has found effects similar to actual social touch. The importance of haptic stimulus qualities, multimodal cues, and contextual factors in technology mediated social touch is discussed. The survey concludes by reflecting on the current state of research into social touch technology and by providing suggestions for future research and applications.

  5. Enhancing the Performance of Passive Teleoperation Systems via Cutaneous Feedback.

    PubMed

    Pacchierotti, Claudio; Tirmizi, Asad; Bianchini, Gianni; Prattichizzo, Domenico

    2015-01-01

    We introduce a novel method to improve the performance of passive teleoperation systems with force reflection. It consists of integrating kinesthetic haptic feedback provided by common grounded haptic interfaces with cutaneous haptic feedback. The proposed approach can be used on top of any time-domain control technique that ensures a stable interaction by scaling down kinesthetic feedback when this is required to satisfy stability conditions (e.g., passivity) at the expense of transparency. Performance is recovered by providing a suitable amount of cutaneous force through custom wearable cutaneous devices. The viability of the proposed approach is demonstrated through an experiment of perceived stiffness and an experiment of teleoperated needle insertion in soft tissue.
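
    Time-domain passivity control of the kind this method builds on can be sketched as an energy bookkeeping loop (in the style of Hannaford and Ryu's passivity observer/controller): track the energy flowing through the haptic port and attenuate the commanded force whenever the balance would go negative, i.e. when the device would generate energy. Zeroing the force, as below, is a simplifying assumption; real controllers insert an adjustable damper instead:

```python
def passivity_scaled_force(forces, velocities, dt):
    """Per-sample energy audit of a haptic port. f*v*dt is the energy
    injected each sample; if letting the next sample through would
    drive the running balance negative (net energy generation), the
    commanded force is zeroed for that sample. Illustrative sketch."""
    energy, out = 0.0, []
    for f, v in zip(forces, velocities):
        e = f * v * dt
        if energy + e < 0.0:       # would violate passivity
            f, e = 0.0, 0.0
        energy += e
        out.append(f)
    return out
```

    The samples attenuated by such a scheme are exactly where transparency is lost, and where the paper proposes restoring the missing sensation through ungrounded cutaneous feedback, which cannot destabilize the loop.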

  6. Keep an eye on your hands: on the role of visual mechanisms in processing of haptic space

    PubMed Central

    Zuidhoek, Sander; Noordzij, Matthijs L.; Kappers, Astrid M. L.

    2008-01-01

    The present paper reviews research on haptic orientation processing. Central is a task in which a test bar has to be set parallel to a reference bar at another location. Introducing a delay between inspecting the reference bar and setting the test bar leads to a surprising improvement. Moreover, offering visual background information also elevates performance. Interestingly, (congenitally) blind individuals show this improvement with time either not at all or only to a weaker extent, and, in parallel, they appear to benefit less from spatial imagery processing. Together, this strongly points to an important role for visual processing mechanisms in the perception of haptic inputs. PMID:18196305

  7. Learning from vision-to-touch is different than learning from touch-to-vision

    PubMed Central

    Wismeijer, Dagmar A.; Gegenfurtner, Karl R.; Drewing, Knut

    2012-01-01

    We studied whether vision can teach touch to the same extent as touch seems to teach vision. In a 2 × 2 between-participants learning study, we artificially correlated visual gloss cues with haptic compliance cues. In two “natural” tasks, we tested whether visual gloss estimations have an influence on haptic estimations of softness and vice versa. In two “novel” tasks, in which participants were either asked to haptically judge glossiness or to visually judge softness, we investigated how perceptual estimates transfer from one sense to the other. Our results showed that vision does not teach touch as efficiently as touch seems to teach vision. PMID:23181012

  8. Generating soft shadows with a depth buffer algorithm

    NASA Technical Reports Server (NTRS)

    Brotman, L. S.; Badler, N. I.

    1984-01-01

    Computer-synthesized shadows have traditionally appeared with a sharp edge when cast onto a surface. The present investigation considers the production of more realistic, soft shadows, although significant costs arise in connection with such a representation. A pragmatic approach is taken, which combines an existing shadowing method with a popular visible-surface rendering technique, the 'depth buffer', to generate soft shadows resulting from light sources of finite extent. The method represents an extension of Crow's (1977) shadow volume algorithm.
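
As a rough illustration of the idea (under simplifying assumptions, not the paper's actual algorithm), a light source of finite extent can be approximated by several point samples, each with its own depth-buffer visibility test; the shadow intensity at a surface point is then the fraction of samples that see it:

```python
# Illustrative sketch only: an area light approximated by point samples,
# each with its own depth-buffer test. The shadow factor at a surface point
# is the fraction of samples from which the point is visible, producing a
# soft penumbra instead of a hard edge. Names and values are hypothetical.

def soft_shadow_factor(depths_from_light, occluder_depths):
    """depths_from_light[i] -- depth of the shaded point seen from light sample i
    occluder_depths[i]   -- nearest depth stored in sample i's depth buffer
    Returns 1.0 for fully lit, 0.0 for fully shadowed."""
    visible = sum(1 for p, occ in zip(depths_from_light, occluder_depths)
                  if p <= occ)
    return visible / len(depths_from_light)

# Two of four light samples are blocked by a nearer occluder: half shadow.
factor = soft_shadow_factor([5.0, 5.0, 5.0, 5.0], [6.0, 5.5, 4.0, 3.0])
```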

  9. Development of a virtual speaking simulator using Image Based Rendering.

    PubMed

    Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I

    2002-01-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. There are two techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both methods have the weakness that they are unrealistic and cannot be controlled individually. To overcome these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and chroma-keying simultaneously. IBR enables the creation of realistic virtual environments in which images taken with a digital camera are stitched panoramically. The use of chroma-keying puts virtual audience members under individual control in the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.

  10. Modeling Images of Natural 3D Surfaces: Overview and Potential Applications

    NASA Technical Reports Server (NTRS)

    Jalobeanu, Andre; Kuehnel, Frank; Stutz, John

    2004-01-01

    Generative models of natural images have long been used in computer vision. However, since they only describe the appearance of 2D scenes, they fail to capture all the properties of the underlying 3D world. Even though such models are sufficient for many vision tasks, a 3D scene model is needed when it comes to inferring a 3D object or its characteristics. In this paper, we present such a generative model, incorporating both a multiscale surface prior model for surface geometry and reflectance, and an image formation process model based on realistic rendering. We also show how to efficiently invert the model within a Bayesian framework, focusing on the computation of the posterior model parameter densities and on the critical aspects of the rendering. We present a few potential applications, such as asteroid modeling and planetary topography recovery, illustrated by promising results on real images.

  11. A Physics-Based Vibrotactile Feedback Library for Collision Events.

    PubMed

    Park, Gunhyuk; Choi, Seungmoon

    2017-01-01

    We present PhysVib: a software solution on the mobile platform extending an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially-decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods in terms of perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while reducing application development time to a great extent.
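
A minimal sketch of an exponentially-decaying sinusoidal vibration model of the kind described above, sampled at the vibrotactile update rate; all parameter values are illustrative, not PhysVib's:

```python
import math

# Minimal sketch: a collision excites a(t) = A * exp(-d*t) * sin(2*pi*f*t),
# sampled at the high-rate vibrotactile loop. Parameter values are invented
# for illustration.

def collision_vibration(amplitude, decay_per_s, freq_hz, duration_s,
                        rate_hz=1000):
    """Return vibration samples for one collision event."""
    n = int(duration_s * rate_hz)
    return [amplitude
            * math.exp(-decay_per_s * i / rate_hz)
            * math.sin(2 * math.pi * freq_hz * i / rate_hz)
            for i in range(n)]

# 100 ms decaying 150 Hz transient, as might follow a rigid-body impact.
wave = collision_vibration(amplitude=1.0, decay_per_s=30.0, freq_hz=150.0,
                           duration_s=0.1)
```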

  12. Factors Influencing Undergraduate Students' Acceptance of a Haptic Interface for Learning Gross Anatomy

    ERIC Educational Resources Information Center

    Yeom, Soonja; Choi-Lundberg, Derek L.; Fluck, Andrew Edward; Sale, Arthur

    2017-01-01

    Purpose: This study aims to evaluate factors influencing undergraduate students' acceptance of a computer-aided learning resource using the Phantom Omni haptic stylus to enable rotation, touch and kinaesthetic feedback and display of names of three-dimensional (3D) human anatomical structures on a visual display. Design/methodology/approach: The…

  13. Comparing Tactile Maps and Haptic Digital Representations of a Maritime Environment

    ERIC Educational Resources Information Center

    Simonnet, Mathieu; Vieilledent, Steephane; Jacobson, R. Daniel; Tisseau, Jacques

    2011-01-01

    A map exploration and representation exercise was conducted with participants who were totally blind. Representations of maritime environments were presented either with a tactile map or with a digital haptic virtual map. We assessed the knowledge of spatial configurations using a triangulation technique. The results revealed that both types of…

  14. 77 FR 20847 - Certain Mobile Electronic Devices Incorporating Haptics; Institution of Investigation Pursuant to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-834] Certain Mobile Electronic Devices Incorporating Haptics; Institution of Investigation Pursuant to 19 U.S.C. 1337 AGENCY: U.S. International Trade.... International Trade Commission on February 7, 2012, and an amended complaint was filed with the U.S...

  15. Feel, Imagine and Learn!--Haptic Augmented Simulation and Embodied Instruction in Physics Learning

    ERIC Educational Resources Information Center

    Han, In Sook

    2010-01-01

    The purpose of this study was to investigate the potentials and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included haptic augmented educational simulation as an instructional tool to provide perceptual experiences as well as further instruction to activate those previous…

  16. Let the Force Be with Us: Dyads Exploit Haptic Coupling for Coordination

    ERIC Educational Resources Information Center

    van der Wel, Robrecht P. R. D.; Knoblich, Guenther; Sebanz, Natalie

    2011-01-01

    People often perform actions that involve a direct physical coupling with another person, such as when moving furniture together. Here, we examined how people successfully coordinate such actions with others. We tested the hypothesis that dyads amplify their forces to create haptic information to coordinate. Participants moved a pole (resembling a…

  17. Haptic Tracking Permits Bimanual Independence

    ERIC Educational Resources Information Center

    Rosenbaum, David A.; Dawson, Amanda A.; Challis, John H.

    2006-01-01

    This study shows that in a novel task--bimanual haptic tracking--neurologically normal human adults can move their 2 hands independently for extended periods of time with little or no training. Participants lightly touched buttons whose positions were moved either quasi-randomly in the horizontal plane by 1 or 2 human drivers (Experiment 1), in…

  18. Unpacking Students' Conceptualizations through Haptic Feedback

    ERIC Educational Resources Information Center

    Magana, A. J.; Balachandran, S.

    2017-01-01

    While it is clear that the use of computer simulations has a beneficial effect on learning when compared to instruction without computer simulations, there is still room for improvement to fully realize their benefits for learning. Haptic technologies can fulfill the educational potential of computer simulations by adding the sense of touch.…

  19. User Acceptance of a Haptic Interface for Learning Anatomy

    ERIC Educational Resources Information Center

    Yeom, Soonja; Choi-Lundberg, Derek; Fluck, Andrew; Sale, Arthur

    2013-01-01

    Visualizing the structure and relationships in three dimensions (3D) of organs is a challenge for students of anatomy. To provide an alternative way of learning anatomy engaging multiple senses, we are developing a force-feedback (haptic) interface for manipulation of 3D virtual organs, using design research methodology, with iterations of system…

  20. Supramodality Effects in Visual and Haptic Spatial Processes

    ERIC Educational Resources Information Center

    Cattaneo, Zaira; Vecchi, Tomaso

    2008-01-01

    In this article, the authors investigated unimodal and cross-modal processes in spatial working memory. A number of locations had to be memorized within visual or haptic matrices according to different experimental conditions known to be critical in accounting for the effects of perception on imagery. Results reveal that some characteristics of…

  1. Teaching Bovine Abdominal Anatomy: Use of a Haptic Simulator

    ERIC Educational Resources Information Center

    Kinnison, Tierney; Forrest, Neil David; Frean, Stephen Philip; Baillie, Sarah

    2009-01-01

    Traditional methods of teaching anatomy to undergraduate medical and veterinary students are being challenged and need to adapt to modern concerns and requirements. There is a move away from the use of cadavers to new technologies as a way of complementing the traditional approaches and addressing resource and ethical problems. Haptic (touch)…

  2. Influence of Stimulus Symmetry and Complexity upon Haptic Scanning Strategies During Detection, Learning and Recognition Tasks.

    ERIC Educational Resources Information Center

    Locher, Paul J.; Simmons, Roger W.

    Two experiments were conducted to investigate the perceptual processes involved in haptic exploration of randomly generated shapes. Experiment one required subjects to detect symmetrical or asymmetrical characteristics of individually presented plastic shapes, also varying in complexity. Scanning time for both symmetrical and asymmetrical shapes…

  3. What Aspects of Vision Facilitate Haptic Processing?

    ERIC Educational Resources Information Center

    Millar, Susanna; Al-Attar, Zainab

    2005-01-01

    We investigate how vision affects haptic performance when task-relevant visual cues are reduced or excluded. The task was to remember the spatial location of six landmarks that were explored by touch in a tactile map. Here, we use specially designed spectacles that simulate residual peripheral vision, tunnel vision, diffuse light perception, and…

  4. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    NASA Astrophysics Data System (ADS)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has increasingly been utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user’s interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, the sensory receptors within the skin for sensing tactile stimuli and electric currents are described, and several factors which influence the transmission of electric signals to the brain via the skin are explained.

  5. Design of high-fidelity haptic display for one-dimensional force reflection applications

    NASA Astrophysics Data System (ADS)

    Gillespie, Brent; Rosenberg, Louis B.

    1995-12-01

    This paper discusses the development of a virtual reality platform for the simulation of medical procedures which involve needle insertion into human tissue. The paper's focus is the hardware and software requirements for haptic display of a particular medical procedure known as epidural analgesia. To perform this delicate manual procedure, an anesthesiologist must carefully guide a needle through various layers of tissue using only haptic cues for guidance. As a simplifying aspect for the simulator design, all motions and forces involved in the task occur along a fixed line once insertion begins. To create a haptic representation of this procedure, we have explored both physical modeling and perceptual modeling techniques. A preliminary physical model was built based on CT-scan data of the operative site. A preliminary perceptual model was built based on current training techniques for the procedure provided by a skilled instructor. We compare and contrast these two modeling methods and discuss the implications of each. We select and defend the perceptual model as a superior approach for the epidural analgesia simulator.
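
A perceptual model of this kind could, for illustration, reduce to a piecewise force-versus-depth profile along the fixed insertion line; the layer boundaries and force levels below are invented placeholders, not the simulator's data:

```python
# Hypothetical sketch of a 1-D perceptual force model: each tissue layer
# offers a constant resistance, with the characteristic "loss of resistance"
# once the needle passes the ligamentum flavum into the epidural space.
# All depths and forces are illustrative assumptions.

LAYERS = [            # (depth where the layer ends in mm, resistance in N)
    (10.0, 0.5),      # skin and subcutaneous fat
    (25.0, 1.5),      # supraspinous / interspinous ligament
    (30.0, 3.0),      # ligamentum flavum: stiffest layer before the drop
]

def insertion_force(depth_mm):
    """Resistance felt at a given insertion depth along the fixed line."""
    for layer_end, force_n in LAYERS:
        if depth_mm <= layer_end:
            return force_n
    return 0.2        # epidural space: sudden loss of resistance
```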

  6. EMG-based visual-haptic biofeedback: a tool to improve motor control in children with primary dystonia.

    PubMed

    Casellato, Claudia; Pedrocchi, Alessandra; Zorzi, Giovanna; Vernisse, Lea; Ferrigno, Giancarlo; Nardocci, Nardo

    2013-05-01

    New insights suggest that dystonic motor impairments could also involve a deficit of sensory processing. In this framework, biofeedback, making covert physiological processes more overt, could be useful. The present work proposes an innovative integrated setup which provides the user with an electromyogram (EMG)-based visual-haptic biofeedback during upper limb movements (spiral tracking tasks), to test if augmented sensory feedbacks can induce motor control improvement in patients with primary dystonia. The ad hoc developed real-time control algorithm synchronizes the haptic loop with the EMG reading; the brachioradialis EMG values were used to modify visual and haptic features of the interface: the higher was the EMG level, the higher was the virtual table friction and the background color proportionally moved from green to red. From recordings on dystonic and healthy subjects, statistical results showed that biofeedback has a significant impact, correlated with the local impairment, on the dystonic muscular control. These tests pointed out the effectiveness of biofeedback paradigms in gaining a better specific-muscle voluntary motor control. The flexible tool developed here shows promising prospects of clinical applications and sensorimotor rehabilitation.
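
The EMG-to-feedback mapping described above can be sketched as follows; the friction range and the linear color blend are assumptions for illustration, not the study's exact gains:

```python
# Hedged sketch: a normalized EMG level in [0, 1] raises the virtual table
# friction and blends the background color from green (relaxed) to red
# (high activation). Ranges and the linear blend are assumptions.

def emg_to_feedback(emg_norm, friction_min=0.1, friction_max=2.0):
    """Return (friction, (r, g, b)) for one haptic-loop update."""
    emg = min(max(emg_norm, 0.0), 1.0)                    # clamp to [0, 1]
    friction = friction_min + emg * (friction_max - friction_min)
    color = (int(255 * emg), int(255 * (1.0 - emg)), 0)   # green -> red
    return friction, color

friction, color = emg_to_feedback(0.0)   # relaxed: low friction, pure green
```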

  7. Haptic display for the VR arthroscopy training simulator

    NASA Astrophysics Data System (ADS)

    Ziegler, Rolf; Brandt, Christoph; Kunstmann, Christian; Mueller, Wolfgang; Werkhaeuser, Holger

    1997-05-01

    A specific desire to find new training methods arose from the new field called 'minimally invasive surgery.' With this technical advance, modern video arthroscopy became the standard procedure in the ORs. Holding the optical system with the video camera in one hand and watching the operation field on the monitor, the surgeon's other hand is free to guide, e.g., a probe. As arthroscopy became a more common procedure it became obvious that some sort of special training was necessary to guarantee a certain level of qualification of the surgeons. Therefore, a hospital in Frankfurt, Germany approached the Fraunhofer Institute for Computer Graphics to develop a training system for arthroscopy based on VR techniques. The main drawback of the simulator developed so far is the lack of haptic perception, especially force feedback. In cooperation with the Department of Electro-Mechanical Construction at the Darmstadt Technical University we have designed and built a haptic display for the VR arthroscopy training simulator. In parallel we developed a concept for the integration of the haptic display in a configurable way.

  8. Open Touch/Sound Maps: A system to convey street data through haptic and auditory feedback

    NASA Astrophysics Data System (ADS)

    Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios

    2013-08-01

    The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for visually impaired and blind people, due to their inability to interpret graphical information. Thus, alternative ways of presenting a map have to be explored in order to make maps accessible. Multiple types of sensory perception, like touch and hearing, may work as a substitute for vision in the exploration of maps. The use of multimodal virtual environments seems to be a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback using OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch, using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during the haptic exploration of the map.

  9. Topographic modelling of haptic properties of tissue products

    NASA Astrophysics Data System (ADS)

    Rosen, B.-G.; Fall, A.; Rosen, S.; Farbrot, A.; Bergström, P.

    2014-03-01

    The way a product or material feels when touched, its haptics, has been shown to be a property that plays an important role when consumers determine the quality of products. For tissue products in constant touch with the skin, "softness" becomes a primary quality parameter. In the present work, the relationship between topography and the feeling of the surface has been investigated for commercial tissues with varying degrees of texture, from low-textured crepe tissue to highly textured embossed and air-dried tissue products. A trained sensory panel was used to grade perceived haptic "roughness". The topography was characterized with the digital light projection (DLP) technique, using areal ISO 25178-2 topography parameters in combination with non-contacting topography measurement. By the use of multivariate statistics, strong correlations between perceived roughness and topography were found, with a predictability above 90 percent even though highly textured products were included. The best prediction ability was obtained when combining the haptic properties with the topography parameters auto-correlation length (Sal), peak material volume (Vmp), core roughness depth (Sk) and maximum height of the surface (Sz).
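
A model of the kind fitted here with multivariate statistics can be sketched as a linear combination of the four named parameters; the coefficients below are invented placeholders, not the published model:

```python
# Illustrative sketch only: linear prediction of panel-rated roughness from
# the ISO 25178-2 parameters Sal, Vmp, Sk and Sz. Coefficients and input
# values are hypothetical, not fitted to the study's data.

def predict_roughness(sal, vmp, sk, sz,
                      coeffs=(0.8, 1.5, 0.6, 0.3), bias=0.2):
    """Linear combination of topography parameters."""
    return bias + sum(c * x for c, x in zip(coeffs, (sal, vmp, sk, sz)))

# Hypothetical parameter values for one tissue sample.
score = predict_roughness(sal=0.5, vmp=0.2, sk=1.1, sz=2.4)
```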

  10. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-06-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3-25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension.
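
The building block of such point-process models is the inverse-Gaussian density of the waiting time until the next heartbeat; this minimal sketch omits the Laguerre expansion and Wiener-Volterra kernels entirely, and the parameter values are illustrative:

```python
import math

# Inverse-Gaussian waiting-time density, the base distribution in this
# family of heartbeat point-process models (heavily simplified sketch).

def inverse_gaussian_pdf(t, mu, lam):
    """p(t) = sqrt(lam / (2*pi*t^3)) * exp(-lam*(t - mu)^2 / (2*mu^2*t))"""
    return (math.sqrt(lam / (2.0 * math.pi * t ** 3))
            * math.exp(-lam * (t - mu) ** 2 / (2.0 * mu ** 2 * t)))

# Density of observing the next beat 0.9 s after the last one, given a mean
# inter-beat interval of 0.85 s (values are illustrative).
density = inverse_gaussian_pdf(0.9, mu=0.85, lam=20.0)
```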

  11. Haptic guidance of overt visual attention.

    PubMed

    List, Alexandra; Iordanescu, Lucica; Grabowecky, Marcia; Suzuki, Satoru

    2014-11-01

    Research has shown that information accessed from one sensory modality can influence perceptual and attentional processes in another modality. Here, we demonstrated a novel crossmodal influence of haptic-shape information on visual attention. Participants visually searched for a target object (e.g., an orange) presented among distractor objects, fixating the target as quickly as possible. While searching for the target, participants held (never viewed and out of sight) an item of a specific shape in their hands. In two experiments, we demonstrated that the time for the eyes to reach a target (a measure of overt visual attention) was reduced when the shape of the held item (e.g., a sphere) was consistent with the shape of the visual target (e.g., an orange), relative to when the held shape was unrelated to the target (e.g., a hockey puck) or when no shape was held. This haptic-to-visual facilitation occurred despite the fact that the held shapes were not predictive of the visual targets' shapes, suggesting that the crossmodal influence occurred automatically, reflecting shape-specific haptic guidance of overt visual attention.

  12. Human-arm-and-hand-dynamic model with variability analyses for a stylus-based haptic interface.

    PubMed

    Fu, Michael J; Cavuşoğlu, M Cenk

    2012-12-01

    Haptic interface research benefits from accurate human arm models for control and system design. The literature contains many human arm dynamic models but lacks detailed variability analyses. Without accurate measurements, variability is modeled in a very conservative manner, leading to less than optimal controller and system designs. This paper not only presents models for human arm dynamics but also develops inter- and intrasubject variability models for a stylus-based haptic device. Data from 15 human subjects (nine male, six female, ages 20-32) were collected using a Phantom Premium 1.5a haptic device for system identification. In this paper, grip-force-dependent models were identified for 1-3-N grip forces in the three spatial axes. Also, variability due to human subjects and grip-force variation was modeled as both structured and unstructured uncertainties. For both forms of variability, the maximum variation and the 95% and 67% confidence interval limits were examined. All models were in the frequency domain with force as input and position as output. The identified models enable precise controllers targeted to a subset of possible human operator dynamics.
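
A frequency-domain force-to-position model of this general kind can be illustrated with a second-order mass-spring-damper; the parameter values and the assumption that damping and stiffness grow linearly with grip force are illustrative, not the identified models:

```python
import math

# Hedged sketch: X(s)/F(s) = 1 / (m*s^2 + b*s + k), a common form for
# stylus-grip arm dynamics with force as input and position as output.
# Grip-force scaling of b and k is an assumption for illustration.

def arm_response_magnitude(freq_hz, grip_n, m=0.05, b0=1.0, k0=50.0):
    """|X(jw)/F(jw)| at a given frequency and grip force."""
    b = b0 * grip_n                  # assumed grip-dependent damping
    k = k0 * grip_n                  # assumed grip-dependent stiffness
    w = 2.0 * math.pi * freq_hz
    return 1.0 / math.hypot(k - m * w * w, b * w)

# A firmer grip stiffens the arm, reducing low-frequency compliance.
soft = arm_response_magnitude(1.0, grip_n=1.0)
firm = arm_response_magnitude(1.0, grip_n=3.0)
```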

  13. Surgical virtual reality - highlights in developing a high performance surgical haptic device.

    PubMed

    Custură-Crăciun, D; Cochior, D; Constantinoiu, S; Neagu, C

    2013-01-01

    Just as simulators are a standard in aviation and aerospace sciences, we expect surgical simulators to soon become a standard in medical applications. These will correctly instruct future doctors in surgical techniques without there being a need for hands-on patient instruction. Using virtual reality to digitally transpose surgical procedures changes surgery in a revolutionary manner, by offering possibilities for implementing new, much more efficient learning methods, by allowing the practice of new surgical techniques and by improving surgeons' abilities and skills. Perfecting haptic devices has opened the door to a series of opportunities in the fields of research, industry, nuclear science and medicine. Concepts that were purely theoretical at first, such as telerobotics, telepresence or telerepresentation, have become a practical reality as calculus techniques, telecommunications and haptic devices evolved, with virtual reality taking a new leap. In the field of surgery, barriers and controversies still remain regarding the implementation and generalization of surgical virtual simulators. These obstacles remain connected to the high costs of this not yet sufficiently developed technology, especially in the domain of haptic devices.

  14. Obstacle Crossing Differences Between Blind and Blindfolded Subjects After Haptic Exploration.

    PubMed

    Forner-Cordero, Arturo; Garcia, Valéria D; Rodrigues, Sérgio T; Duysens, Jacques

    2016-01-01

    Little is known about the ability of blind people to cross obstacles after haptically exploring their size and position. Long-term absence of vision may affect spatial cognition in the blind, while their extensive experience with the use of haptic information for guidance may lead to compensation strategies. Seven blind and 7 sighted participants (with vision available and blindfolded) walked along a flat pathway and crossed an obstacle after a haptic exploration. Blind and blindfolded subjects used different strategies to cross the obstacle. After the first 20 trials the blindfolded subjects reduced the distance between the foot and the obstacle at the toe-off instant, while the blind behaved like the subjects with full vision. Blind and blindfolded participants showed larger foot clearance than participants with vision. At foot landing the hip was further behind the foot in the blindfolded condition, while there were no differences between the blind and the vision conditions. For several parameters of the obstacle crossing task, blind people were more similar to subjects with full vision, indicating that the blind subjects were able to compensate for the lack of vision.

  15. Real-time interactive virtual tour on the World Wide Web (WWW)

    NASA Astrophysics Data System (ADS)

    Yoon, Sanghyuk; Chen, Hai-jung; Hsu, Tom; Yoon, Ilmi

    2003-12-01

    Web-based virtual tours have become a desirable and demanded application, yet a challenging one due to the nature of a web application's running environment, such as limited bandwidth and no guarantee of high computation power on the client side. The image-based rendering approach has attractive advantages over the traditional 3D rendering approach in such web applications. The traditional approach, such as VRML, requires a labor-intensive 3D modeling process, high bandwidth and computation power, especially for photo-realistic virtual scenes. QuickTime VR and IPIX, as examples of the image-based approach, use panoramic photos, and the virtual scenes can be generated directly from photos, skipping the modeling process. But these image-based approaches may require special cameras or effort to take panoramic views, and provide only a fixed-point look-around and zooming in and out rather than 'walking around', which is a very important feature for providing an immersive experience to virtual tourists. The web-based virtual tour using Tour into the Picture employs pseudo-3D geometry with an image-based rendering approach to provide viewers with the immersive experience of walking around the virtual space with several snapshots of conventional photos.

  16. Physically Based Rendering in the Nightshade NG Visualization Platform

    NASA Astrophysics Data System (ADS)

    Berglund, Karrie; Larey-Williams, Trystan; Spearman, Rob; Bogard, Arthur

    2015-01-01

    This poster describes our work on creating a physically based rendering model in Nightshade NG planetarium simulation and visualization software (project website: NightshadeSoftware.org). We discuss techniques used for rendering realistic scenes in the universe and dealing with astronomical distances in real time on consumer hardware. We also discuss some of the challenges of rewriting the software from scratch, a project which began in 2011.Nightshade NG can be a powerful tool for sharing data and visualizations. The desktop version of the software is free for anyone to download, use, and modify; it runs on Windows and Linux (and eventually Mac). If you are looking to disseminate your data or models, please stop by to discuss how we can work together.Nightshade software is used in literally hundreds of digital planetarium systems worldwide. Countless teachers and astronomy education groups run the software on flat screens. This wide use makes Nightshade an effective tool for dissemination to educators and the public.Nightshade NG is an especially powerful visualization tool when projected on a dome. We invite everyone to enter our inflatable dome in the exhibit hall to see this software in a 3D environment.

  17. Reducing the motor response in haptic parallel matching eliminates the typically observed gender difference.

    PubMed

    van Mier, Hanneke I

    2016-01-01

    When making two bars haptically parallel to each other, large deviations have been observed, most likely caused by the bias of a hand-centered egocentric reference frame. A consistent finding is that women show significantly larger deviations than men when performing this task. It has been suggested that this difference might be due to the fact that women are more egocentrically oriented than men or are less efficient in overcoming the egocentric bias of the hand. If this is indeed the case, reducing the bias of the egocentric reference frame should eliminate the above-mentioned gender difference. This was investigated in the current study. Sixty participants (30 men, 30 women) were instructed to haptically match (task HP) the orientation of a test bar with the dominant hand to the orientation of a reference bar that was perceived with the non-dominant hand. In a haptic visual task (task HV), in which only the reference bar and exploring hand were out of view, no motor response was required, but participants had to "match" the perceived orientation by verbally naming the parallel orientation that was read out on a test protractor. Both females and males performed better in the HV task than in the HP task. Significant gender effects were only found in the haptic parallelity task (HP), corroborating the idea that women perform at the same level as men when the egocentric bias of the hand is reduced.

  18. Enhanced visuo-haptic integration for the non-dominant hand.

    PubMed

    Yalachkov, Yavor; Kaiser, Jochen; Doehrmann, Oliver; Naumer, Marcus J

    2015-07-21

    Visuo-haptic integration contributes essentially to object shape recognition. Although there has been a considerable advance in elucidating the neural underpinnings of multisensory perception, it is still unclear whether seeing an object and exploring it with the dominant hand elicits the same brain response as compared to the non-dominant hand. Using fMRI to measure brain activation in right-handed participants, we found that for both left- and right-hand stimulation the left lateral occipital complex (LOC) and anterior cerebellum (aCER) were involved in visuo-haptic integration of familiar objects. These two brain regions were then further investigated in another study, where unfamiliar, novel objects were presented to a different group of right-handers. Here the left LOC and aCER were more strongly activated by bimodal than unimodal stimuli only when the left but not the right hand was used. A direct comparison indicated that the multisensory gain of the fMRI activation was significantly higher for the left than the right hand. These findings are in line with the principle of "inverse effectiveness", implying that processing of bimodally presented stimuli is particularly enhanced when the unimodal stimuli are weak. This applies also when right-handed subjects see and simultaneously touch unfamiliar objects with their non-dominant left hand. Thus, the fMRI signal in the left LOC and aCER induced by visuo-haptic stimulation is dependent on which hand was employed for haptic exploration. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Towards a Teleoperated Needle Driver Robot with Haptic Feedback for RFA of Breast Tumors under Continuous MRI1

    PubMed Central

    Kokes, Rebecca; Lister, Kevin; Gullapalli, Rao; Zhang, Bao; MacMillan, Alan; Richard, Howard; Desai, Jaydev P.

    2009-01-01

    Objective The purpose of this paper is to explore the feasibility of developing an MRI-compatible needle driver system for radiofrequency ablation (RFA) of breast tumors under continuous MR imaging while being teleoperated by a haptic feedback device from outside the scanning room. The developed needle driver prototype was designed and tested for both tumor targeting capability and RFA. Methods The single degree-of-freedom (DOF) prototype was interfaced with a PHANToM haptic device controlled from outside the scanning room. Experiments were performed to demonstrate MRI-compatibility and position control accuracy with hydraulic actuation, along with an experiment to determine the PHANToM's ability to guide the RFA tool to a tumor nodule within a phantom breast tissue model while continuously imaging within the MRI scanner and receiving force feedback from the RFA tool. Results Hydraulic actuation is shown to be a feasible actuation technique for operation in an MRI environment. The design is MRI-compatible in all aspects except for force sensing in the directions perpendicular to the direction of motion. Experiments confirm that the user is able to distinguish healthy vs. cancerous tissue in a phantom model when provided with both visual (imaging) feedback and haptic feedback. Conclusion The teleoperated 1-DOF needle driver system presented in this paper demonstrates the feasibility of implementing an MRI-compatible robot for RFA of breast tumors with haptic feedback capability. PMID:19303805

  20. When Neuroscience ‘Touches’ Architecture: From Hapticity to a Supramodal Functioning of the Human Brain

    PubMed Central

    Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C.; Pietrini, Pietro; Ricciardi, Emiliano

    2016-01-01

    In the last decades, the rapid growth of functional brain imaging methodologies allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting ‘neuro-architecture’ as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. More precisely, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people–environment relationships, and even provide empirical foundations for a renewed evidence-based design theory. PMID:27375542

  1. Modeling and Design of an Electro-Rheological Fluid Based Haptic System for Tele-Operation of Space Robots

    NASA Technical Reports Server (NTRS)

    Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph

    2000-01-01

    For many years, the robotic community sought to develop robots that can eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that there are some tasks that humans can perform significantly better but that, due to associated hazards, distance, physical limitations and other causes, only robots can be employed to perform. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome, and usually they allow only a limited operator workspace. In this paper a novel haptic interface is presented that enables human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using constrained ElectroRheological Fluids (ERF), which change their viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment will be reflected to the user using this ERF device, where a change in the system viscosity occurs in proportion to the force to be transmitted. In this paper, we present the results of our modeling, simulation, and initial testing of such an electrorheological fluid (ERF) based haptic device.

  2. Modeling and test of a kinaesthetic actuator based on MR fluid for haptic applications.

    PubMed

    Yang, Tae-Heon; Koo, Jeong-Hoi; Kim, Sang-Youn; Kwon, Dong-Soo

    2017-03-01

    Haptic display units have been widely used for conveying button sensations to users, primarily employing vibrotactile actuators. However, the human feeling of pressing buttons mainly relies on kinaesthetic sensations (rather than vibrotactile sensations), and few studies exist on small-scale kinaesthetic haptic units. Thus, the primary goals of this paper are to design a miniature kinaesthetic actuator based on Magneto-Rheological (MR) fluid that can convey various button-clicking sensations and to experimentally evaluate its haptic performance. The design focuses of the proposed actuator were to produce sufficiently large actuation forces (resistive forces) for human users within a given size constraint and to offer a wide range of actuation forces for conveying vivid haptic sensations to users. To this end, this study first performed a series of parametric studies using mathematical force models for multiple operating modes of MR fluid in conjunction with finite element electromagnetism analysis. After selecting design parameters based on the parametric studies, a prototype actuator was constructed, and its performance was evaluated using a dynamic mechanical analyzer. It measured the actuator's resistive force with a varying stroke (pressed depth) up to 1 mm and a varying input current from 0 A to 200 mA. The results show that the proposed actuator creates a wide range of resistive forces, from around 2 N (off-state) to over 9.5 N at 200 mA. To assess the prototype's performance from a haptic application perspective, a maximum force rate was calculated to determine the just noticeable difference in force changes over the 1 mm stroke of the actuator. The results show that the force rate is sufficient to mimic various levels of button sensations, indicating that the proposed kinaesthetic actuator can offer a wide range of resistive force changes that can be conveyed to human operators.

  3. Human haptic perception is interrupted by explorative stops of milliseconds

    PubMed Central

    Grunwald, Martin; Muniyandi, Manivannan; Kim, Hyun; Kim, Jung; Krause, Frank; Mueller, Stephanie; Srinivasan, Mandayam A.

    2014-01-01

    Introduction: The explorative scanning movements of the hands have been compared to those of the eyes. The visual process is known to be composed of alternating phases of saccadic eye movements and fixation pauses. Descriptive results suggest that during the haptic exploration of objects short movement pauses occur as well. The goal of the present study was to detect these “explorative stops” (ES) during one-handed and two-handed haptic explorations of various objects and patterns, and to measure their duration. Additionally, the associations between the following variables were analyzed: (a) between mean exploration time and duration of ES, (b) between certain stimulus features and ES frequency, and (c) the duration of ES during the course of exploration. Methods: Five different Experiments were used. The first two Experiments were classical recognition tasks of unknown haptic stimuli (A) and of common objects (B). In Experiment C space-position information of angle legs had to be perceived and reproduced. For Experiments D and E the PHANToM haptic device was used for the exploration of virtual (D) and real (E) sunken reliefs. Results: In each Experiment we observed explorative stops of different average durations. For Experiment A: 329.50 ms, Experiment B: 67.47 ms, Experiment C: 189.92 ms, Experiment D: 186.17 ms and Experiment E: 140.02 ms. Significant correlations were observed between exploration time and the duration of the ES. Also, ES occurred more frequently, but not exclusively, at defined stimulus features like corners, curves and the endpoints of lines. However, explorative stops do not occur every time a stimulus feature is explored. Conclusions: We assume that ES are a general aspect of human haptic exploration processes. We have tried to interpret the occurrence and duration of ES with respect to the Hypotheses-Rebuild-Model and the Limited Capacity Control System theory. PMID:24782797
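
    The pause-detection step described above can be approximated with a simple speed threshold over the sampled hand trajectory. The sketch below is our illustration, not the authors' analysis code; the sampling interval, speed threshold, and minimum pause duration are hypothetical parameters.

```python
import numpy as np

def explorative_stops(positions, dt, speed_thresh=2.0, min_dur=0.04):
    """Detect movement pauses ('explorative stops') in a hand trajectory.

    positions: (N, 2) or (N, 3) array of sampled hand positions [mm]
    dt: sampling interval [s]
    speed_thresh: speed below which the hand counts as stopped [mm/s]
    min_dur: minimum pause duration to report [s]
    Returns a list of (start_time, duration) tuples in seconds.
    """
    pos = np.asarray(positions, dtype=float)
    speed = np.linalg.norm(np.diff(pos, axis=0), axis=1) / dt  # sample-to-sample speed
    stopped = speed < speed_thresh
    stops, start = [], None
    for i, s in enumerate(stopped):
        if s and start is None:
            start = i                              # pause begins
        elif not s and start is not None:
            dur = (i - start) * dt
            if dur >= min_dur:
                stops.append((start * dt, dur))    # keep pauses long enough to count
            start = None
    if start is not None and (len(stopped) - start) * dt >= min_dur:
        stops.append((start * dt, (len(stopped) - start) * dt))
    return stops
```

    On a synthetic trajectory that moves, halts for 200 ms, and moves again, the function reports a single stop of 0.2 s.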

  4. Multimodal Interaction with Speech, Gestures and Haptic Feedback in a Media Center Application

    NASA Astrophysics Data System (ADS)

    Turunen, Markku; Hakulinen, Jaakko; Hella, Juho; Rajaniemi, Juha-Pekka; Melto, Aleksi; Mäkinen, Erno; Rantala, Jussi; Heimonen, Tomi; Laivo, Tuuli; Soronen, Hannu; Hansen, Mervi; Valkama, Pellervo; Miettinen, Toni; Raisamo, Roope

    We demonstrate interaction with a multimodal media center application. A mobile phone-based interface provides speech and gesture input and haptic feedback. The setup resembles our long-term public pilot study, in which a living room environment containing the application was constructed inside a local media museum, allowing visitors to freely test the system.

  5. Grounded Learning Experience: Helping Students Learn Physics through Visuo-Haptic Priming and Instruction

    ERIC Educational Resources Information Center

    Huang, Shih-Chieh Douglas

    2013-01-01

    In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation…

  6. Study of Co-Located and Distant Collaboration with Symbolic Support via a Haptics-Enhanced Virtual Reality Task

    ERIC Educational Resources Information Center

    Yeh, Shih-Ching; Hwang, Wu-Yuin; Wang, Jin-Liang; Zhan, Shi-Yi

    2013-01-01

    This study intends to investigate how multi-symbolic representations (text, digits, and colors) could effectively enhance the completion of co-located/distant collaborative work in a virtual reality context. Participants' perceptions and behaviors were also studied. A haptics-enhanced virtual reality task was developed to conduct…

  7. Virtual reality robotic telesurgery simulations using MEMICA haptic system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Mavroidis, Constantinos; Bouzit, Mourad; Dolgin, Benjamin; Harm, Deborah L.; Kopchok, George E.; White, Rodney

    2001-01-01

    The authors conceived a haptic mechanism called MEMICA (Remote Mechanical Mirroring using Controlled stiffness and Actuators) that can enable the design of high-dexterity, rapid-response, and large-workspace haptic systems. The development of novel MEMICA gloves and virtual reality models is being explored to allow simulation of telesurgery and other applications. The MEMICA gloves are being designed to provide intuitive mirroring of the conditions at a virtual site where a robot simulates the presence of a human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERF). In this paper the design of the MEMICA system and initial experimental results are presented.

  8. Inertial constraints on limb proprioception are independent of visual calibration.

    PubMed

    Riley, M A; Turvey, M T

    2001-04-01

    When the coincidence of a limb's spatial axes and inertial eigenvectors is broken, haptic proprioception of the limb's position conforms to the eigenvectors. Additionally, when prisms break the coincidence between an arm's visual and actual positions, haptic proprioception is shifted toward the visual-spatial direction. In 3 experiments, variation of the arm's mass distribution was combined with prism adaptation to investigate the hypothesis that the proprioceptive effects of inertial and visual manipulations are additive. This hypothesis was supported across manipulations of plane of motion, body posture, proprioceptive target, and proprioceptive experience during prism adaptation. Haptic proprioception seems to depend on local, physical reference frames that are relative to the physical reference frames for the body's environmental position and orientation.

  9. Development of Quasi-3DOF upper limb rehabilitation system using ER brake: PLEMO-P1

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Fukushima, K.; Furusho, J.; Ozawa, T.

    2009-02-01

    In recent years, many researchers have studied the potential of using robotics technology to assist and quantify motor functions for neuro-rehabilitation. Several kinds of haptic devices have been developed, and their efficiency has been evaluated in clinical tests, for example, upper limb training for patients with spasticity after stroke. However, almost all of these devices are active-type (motor-driven) haptic devices, which basically require a high-cost safety system compared to passive-type (brake-based) devices. In this study, we developed a new practical haptic device, 'PLEMO-P1'; this system adopts ER brakes as its force generators. In this paper, the mechanism of PLEMO-P1 and its software for reaching rehabilitation are described.

  10. An augmented reality haptic training simulator for spinal needle procedures.

    PubMed

    Sutherland, Colin; Hashtrudi-Zaad, Keyvan; Sellens, Rick; Abolmaesumi, Purang; Mousavi, Parvin

    2013-11-01

    This paper presents the prototype for an augmented reality haptic simulation system with potential for spinal needle insertion training. The proposed system is composed of a torso mannequin, a MicronTracker2 optical tracking system, a PHANToM haptic device, and a graphical user interface to provide visual feedback. The system allows users to perform simulated needle insertions on a physical mannequin overlaid with an augmented reality cutaway of patient anatomy. A finite-element-based tissue model provides force feedback during the insertion. The system allows for training without the need for the presence of a trained clinician or access to live patients or cadavers. A pilot user study demonstrates the potential and functionality of the system.
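
    A simulator of this kind needs a force law to render along the insertion axis. As a minimal 1-D illustration (not the paper's finite-element model, and with made-up coefficients), a piecewise force law with an elastic pre-puncture phase followed by cutting plus shaft friction can be sketched as:

```python
def needle_force(depth, puncture_depth=0.01, k=800.0, f_cut=1.5, mu=120.0):
    """Piecewise 1-D needle insertion force along the insertion axis.

    depth, puncture_depth in metres; returns force in newtons.
    Elastic tissue deflection before puncture, then a constant cutting force
    plus depth-proportional shaft friction (all coefficients illustrative).
    """
    if depth <= 0.0:
        return 0.0
    if depth < puncture_depth:
        return k * depth                          # pre-puncture membrane deflection
    return f_cut + mu * (depth - puncture_depth)  # cutting + friction after puncture
```

    The discontinuity at the puncture depth is what gives the trainee the characteristic "pop" through the tissue boundary.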

  11. Three Dimensional Modeling via Photographs for Documentation of a Village Bath

    NASA Astrophysics Data System (ADS)

    Balta, H. B.; Hamamcioglu-Turan, M.; Ocali, O.

    2013-07-01

    The aim of this study is to support the conceptual discussions of architectural restoration with three-dimensional modeling of monuments based on photogrammetric survey. In this study, a 16th-century village bath in Ulamış, Seferihisar, Izmir is modeled for documentation. Ulamış is one of the historical villages in which the Turkish population first settled in the region of Seferihisar-Urla. The methodology was tested on an antique monument: a bath with a cubical form. Within the limits of this study, only the exterior of the bath was modeled. The presentation scale for the bath was determined as 1/50, considering the necessities of designing structural and architectural interventions within the scope of a restoration project. The three-dimensional model produced is a realistic document presenting the present situation of the ruin. Traditional plan, elevation and perspective drawings may be produced from the model, in addition to realistic textured renderings and wireframe representations. The model developed in this study provides an opportunity for presenting photorealistic details of historical morphologies in scale. Compared to conventional drawings, renders based on 3D models provide an opportunity for conveying architectural details such as color, material and texture. From these documents, relatively more detailed restitution hypotheses can be developed and intervention decisions can be taken. Finally, the principles derived from the case study can be used for 3D documentation of historical structures with irregular surfaces.

  12. Drawing disability in Japanese manga: visual politics, embodied masculinity, and wheelchair basketball in Inoue Takehiko's REAL.

    PubMed

    Wood, Andrea

    2013-12-01

    This work explores disability in the cultural context of contemporary Japanese comics. In contrast to Western comics, Japanese manga have permeated the social fabric of Japan to the extent that vast numbers of people read manga on a daily basis. It has, in fact, become such a popular medium for visual communication that the Japanese government and education systems utilize manga as a social acculturation and teaching tool. This multibillion-dollar industry is incredibly diverse, and one particularly popular genre is sports manga. However, Inoue Takehiko's award-winning manga series REAL departs from more conventional sports manga, which typically focus on able-bodied characters with sometimes exaggerated superhuman physical abilities, by adopting a more realistic approach to the world of wheelchair basketball and the people who play it. At the same time, REAL explores cultural attitudes toward disability in Japanese culture, where disability is at times rendered "invisible" either through accessibility problems or lingering associations of disability and shame. It is therefore extremely significant that manga, a visual medium, is rendering disability visible: the ultimate movement from margin to center. REAL devotes considerable attention to realistically illustrating the lived experiences of its characters both on and off the court. Consequently, the series not only educates readers about wheelchair basketball but also provides compelling insight into Japanese cultural notions about masculinity, family, responsibility, and identity. The basketball players, at first marginalized by their disability, join together in the unity of a sport typically characterized by its "abledness."

  13. Introducing GAMER: A fast and accurate method for ray-tracing galaxies using procedural noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groeneboom, N. E.; Dahle, H., E-mail: nicolaag@astro.uio.no

    2014-03-10

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.

  14. CG2Real: Improving the Realism of Computer Generated Images Using a Large Collection of Photographs.

    PubMed

    Johnson, Micah K; Dale, Kevin; Avidan, Shai; Pfister, Hanspeter; Freeman, William T; Matusik, Wojciech

    2011-09-01

    Computer-generated (CG) images have achieved high levels of realism. This realism, however, comes at the cost of long and expensive manual modeling, and often humans can still distinguish between CG and real images. We introduce a new data-driven approach for rendering realistic imagery that uses a large collection of photographs gathered from online repositories. Given a CG image, we retrieve a small number of real images with similar global structure. We identify corresponding regions between the CG and real images using a mean-shift cosegmentation algorithm. The user can then automatically transfer color, tone, and texture from matching regions to the CG image. Our system only uses image processing operations and does not require a 3D model of the scene, making it fast and easy to integrate into digital content creation workflows. Results of a user study show that our hybrid images appear more realistic than the originals.
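
    The region-wise color transfer idea can be illustrated with a classic mean/variance matching step per color channel. This is a simplified stand-in of our own: the published system operates on cosegmented matching regions and also transfers tone and texture, none of which is modeled here.

```python
import numpy as np

def transfer_color(cg_region, real_region, eps=1e-8):
    """Match per-channel mean and std of a CG region to a matched real region.

    cg_region, real_region: (H, W, 3) float images. The regions need not have
    the same shape; only their channel statistics are exchanged.
    """
    cg = np.asarray(cg_region, dtype=float)
    real = np.asarray(real_region, dtype=float)
    cg_mu, cg_sd = cg.mean(axis=(0, 1)), cg.std(axis=(0, 1))
    re_mu, re_sd = real.mean(axis=(0, 1)), real.std(axis=(0, 1))
    # standardize the CG channels, then rescale to the real region's statistics
    return (cg - cg_mu) / (cg_sd + eps) * re_sd + re_mu
```

    After the transfer, the CG region's per-channel mean and standard deviation match those of the real exemplar, which is often enough to remove the characteristic "flat" CG color cast.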

  15. Introducing GAMER: A Fast and Accurate Method for Ray-tracing Galaxies Using Procedural Noise

    NASA Astrophysics Data System (ADS)

    Groeneboom, N. E.; Dahle, H.

    2014-03-01

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.
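
    The core idea, an empirically motivated intensity profile modulated by a procedural noise field, can be sketched as follows. This is our own Python illustration rather than the authors' C++ implementation, and the exponential profile, noise construction, and parameter values are all arbitrary stand-ins.

```python
import numpy as np

def value_noise(n, cells=8, seed=0):
    """Bilinearly interpolated random lattice: a cheap procedural value noise."""
    rng = np.random.default_rng(seed)
    g = rng.random((cells + 1, cells + 1))        # coarse lattice of random values
    t = np.linspace(0, cells, n, endpoint=False)
    i = np.floor(t).astype(int)
    f = t - i
    iy, ix = i[:, None], i[None, :]
    fy, fx = f[:, None], f[None, :]
    # bilinear interpolation of the lattice onto an n x n grid
    return (g[iy, ix] * (1 - fy) * (1 - fx)
            + g[iy + 1, ix] * fy * (1 - fx)
            + g[iy, ix + 1] * (1 - fy) * fx
            + g[iy + 1, ix + 1] * fy * fx)

def disk_image(n=128, scale_len=0.15, noise_amp=0.6):
    """Exponential disk intensity profile modulated by multiplicative noise."""
    y, x = np.mgrid[0:n, 0:n] / (n - 1) - 0.5     # coordinates in [-0.5, 0.5]
    r = np.hypot(x, y)
    profile = np.exp(-r / scale_len)              # smooth empirical intensity law
    noise = 1.0 + noise_amp * (value_noise(n) - 0.5)
    return profile * noise
```

    Because the noise is multiplicative, the perturbations stay proportional to the local brightness, so the overall profile is preserved while the image gains dust-lane-like structure.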

  16. The Direct Lighting Computation in Global Illumination Methods

    NASA Astrophysics Data System (ADS)

    Wang, Changyaw Allen

    1994-01-01

    Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-the-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem as an eight-dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection; Monte Carlo sampling methods; and light source simplification. Results include a new sample generation method, a framework for the prediction of the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment, which for the first time makes ray tracing feasible for highly complex environments.
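
    The direct lighting term described above is commonly estimated by Monte Carlo sampling of points on the emitting source. The sketch below uses uniform area sampling of a rectangular, two-sided light and omits shadow-ray visibility; it illustrates the generic estimator, not the dissertation's specific sample generation method.

```python
import math, random

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def direct_light(x, n, corner, edge_u, edge_v, emitted, samples=2000, seed=1):
    """Monte Carlo direct lighting from a rectangular, two-sided area light.

    Uniformly samples points y on the light and averages
    emitted * cos(theta_x) * cos(theta_y) / |y - x|^2, scaled by the light area.
    Shadow-ray visibility is omitted to keep the sketch short.
    """
    rng = random.Random(seed)
    ln = cross(edge_u, edge_v)
    area = math.sqrt(dot(ln, ln))
    ln = [c / area for c in ln]                   # unit normal of the light
    total = 0.0
    for _ in range(samples):
        s, t = rng.random(), rng.random()
        y = [corner[i] + s * edge_u[i] + t * edge_v[i] for i in range(3)]
        d = sub(y, x)
        r2 = dot(d, d)
        w = [c / math.sqrt(r2) for c in d]        # unit direction toward the sample
        cos_x = max(0.0, dot(n, w))               # foreshortening at the surface
        cos_y = abs(dot(ln, w))                   # two-sided emitter
        total += emitted * cos_x * cos_y / r2
    return area * total / samples
```

    For a small light directly overhead, the estimate approaches the familiar point-light approximation A * L * cos^2 / r^2, which is a convenient sanity check.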

  17. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.

    PubMed

    Danion, Frederic; Mathew, James; Flanagan, J Randall

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.

  18. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics

    PubMed Central

    Mathew, James

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance. PMID:28680964
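
    The nonrigid hand-target mapping used in these studies, a target behaving as a mass attached to the hand by a spring, can be simulated with a few lines of semi-implicit Euler integration. The mass, stiffness, and damping values below are illustrative, not those of the experiment.

```python
def simulate_spring_target(hand_x, dt=0.001, m=0.5, k=40.0, c=2.0):
    """Target as a mass tied to the hand by a spring-damper (1-D).

    hand_x: hand positions sampled every dt seconds.
    Returns the target positions from semi-implicit Euler integration.
    """
    x, v = hand_x[0], 0.0                  # target starts on the hand, at rest
    out = []
    for h in hand_x:
        a = (k * (h - x) - c * v) / m      # spring pull toward the hand + damping
        v += a * dt
        x += v * dt
        out.append(x)
    return out
```

    With these parameters the target lags and overshoots the hand before settling, which is exactly the kind of hand-target discrepancy that makes the nonrigid mapping harder to predict.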

  19. Improving manual skills in persons with disabilities (PWD) through a multimodal assistance system.

    PubMed

    Covarrubias, Mario; Gatti, Elia; Bordegoni, Monica; Cugini, Umberto; Mansutti, Alessandro

    2014-07-01

    In this research work, we present a Multimodal Guidance System (MGS) whose aim is to provide dynamic assistance to persons with disabilities (PWD) while performing manual activities such as drawing, coloring-in and foam-cutting tasks. The MGS provides robotic assistance in the execution of 2D tasks through haptic and sound interactions. Haptic technology provides the virtual path of 2D shapes through a point-based approach, while sound technology provides audio feedback related to the hand's velocity during sketching, filling, or cutting operations. By combining this multimodal system with haptic assistance, we have created a new approach with possible applications in fields as diverse as physical rehabilitation, scientific investigation of sensorimotor learning, and assessment of hand movements in PWD. The MGS has been tested by people with specific disorders affecting coordination, such as Down syndrome and developmental disabilities, under the supervision of their teachers and care assistants inside their learning environment. A graphical user interface has been designed for teachers and care assistants in order to provide training during the test sessions. Our results provide conclusive evidence that using the MGS increases accuracy in task operations. The MGS is an interface that offers haptic and sound feedback while performing manual tasks. Several studies have demonstrated that haptic guidance systems can help people recover cognitive function at different levels of complexity and impairment. The applications supported by our device could also play an important role in supporting physical therapists and cognitive psychologists in helping patients recover motor and visuo-spatial abilities.

  20. Should drivers be operating within an automation-free bandwidth? Evaluating haptic steering support systems with different levels of authority.

    PubMed

    Petermeijer, Sebastiaan M; Abbink, David A; de Winter, Joost C F

    2015-02-01

    The aim of this study was to compare continuous versus bandwidth haptic steering guidance in terms of lane-keeping behavior, aftereffects, and satisfaction. An important human factors question is whether operators should be supported continuously or only when tolerance limits are exceeded. We aimed to clarify this issue for haptic steering guidance by investigating costs and benefits of both approaches in a driving simulator. Thirty-two participants drove five trials, each with a different level of haptic support: no guidance (Manual); guidance outside a 0.5-m bandwidth (Band1); a hysteresis version of Band1, which guided back to the lane center once triggered (Band2); continuous guidance (Cont); and Cont with double feedback gain (ContS). Participants performed a reaction time task while driving. Toward the end of each trial, the guidance was unexpectedly disabled to investigate aftereffects. All four guidance systems prevented large lateral errors (>0.7 m). Cont and especially ContS yielded smaller lateral errors and higher time to line crossing than Manual, Band1, and Band2. Cont and ContS yielded short-lasting aftereffects, whereas Band1 and Band2 did not. Cont yielded higher self-reported satisfaction and faster reaction times than Band1. Continuous and bandwidth guidance both prevent large driver errors. Continuous guidance yields improved performance and satisfaction over bandwidth guidance at the cost of aftereffects and variability in driver torque (indicating human-automation conflicts). The presented results are useful for designers of haptic guidance systems and support critical thinking about the costs and benefits of automation support systems.
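
    The three support philosophies compared in this study (continuous guidance, bandwidth guidance, and its hysteresis variant) can be contrasted with a toy torque law. The gain, bandwidth, and release threshold below are illustrative, and the mode names are ours, mapped loosely onto the Cont, Band1, and Band2 conditions.

```python
def guidance_torque(err, active, gain=1.0, band=0.5, mode="continuous"):
    """Toy steering-guidance torque law; returns (torque, active).

    err: lateral error from the lane centre [m]; `active` carries the
    hysteresis state between calls. 'continuous' acts on any error,
    'bandwidth' only outside +/- band, and 'hysteresis' keeps guiding back
    toward the centre once triggered, until the error is nearly zero.
    """
    if mode == "continuous":
        return -gain * err, False
    if mode == "bandwidth":                # act only outside the tolerance band
        if abs(err) <= band:
            return 0.0, False
        edge = band if err > 0 else -band
        return -gain * (err - edge), False
    if mode == "hysteresis":               # once triggered, release near the centre
        active = abs(err) > (0.05 if active else band)
        return (-gain * err if active else 0.0), active
    raise ValueError(mode)
```

    Stepping such a law through a simulated drive makes the trade-off visible: the continuous mode applies torque everywhere (hence aftereffects when it is disabled), while the bandwidth modes are silent inside the tolerance band.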
