Design of a haptic device with grasp and push-pull force feedback for a master-slave surgical robot.
Hu, Zhenkai; Yoon, Chae-Hyun; Park, Samuel Byeongjun; Jo, Yung-Ho
2016-07-01
We propose a portable haptic device providing grasp (kinesthetic) and push-pull (cutaneous) sensations for optical-motion-capture master interfaces. Although optical-motion-capture master interfaces for surgical robot systems can overcome the stiffness, friction, and coupling problems of mechanical master interfaces, it is difficult to add haptic feedback to an optical-motion-capture master interface without constraining the free motion of the operator's hands. Therefore, we utilized a Bowden-cable-driven mechanism to provide the grasp and push-pull sensations while retaining the free hand motion of the optical-motion-capture master interface. To evaluate the haptic device, we constructed a 2-DOF force-sensing/force-feedback system and compared the forces sensed at the slave with the forces reproduced by the haptic device. Finally, a needle insertion test was performed to evaluate the performance of the haptic interface in the master-slave system. The results demonstrate that both the grasp force feedback and the push-pull force feedback provided by the haptic interface closely matched the forces sensed by the slave robot. We successfully applied our haptic interface in the optical-motion-capture master-slave system, and the needle insertion test showed that our haptic feedback can provide greater safety than visual observation alone. We thus developed a haptic device that produces both kinesthetic grasp force feedback and cutaneous push-pull force feedback. Our future research will include further objective performance evaluations of the optical-motion-capture master-slave robot system with our haptic interface in surgical scenarios.
Role of combined tactile and kinesthetic feedback in minimally invasive surgery.
Lim, Soo-Chul; Lee, Hyung-Kew; Park, Joonah
2014-10-18
Haptic feedback is of critical importance in surgical tasks. However, conventional surgical robots do not provide haptic feedback to surgeons during surgery. Thus, in this study, a combined tactile and kinesthetic feedback system was developed to provide haptic feedback to surgeons during robotic surgery. To assess haptic feasibility, the effects of two types of haptic feedback - kinesthetic and tactile - were examined empirically in an object-pulling task with a telesurgery robotic system at two desired pulling forces (1 N and 2 N). Participants answered a set of questionnaires after the experiments. The experimental results reveal reductions in force error (39.1% and 40.9%) when haptic feedback was used during the 1 N and 2 N pulling tasks, respectively. Moreover, survey analyses show the effectiveness of the haptic feedback during teleoperation. The combined tactile and kinesthetic feedback of the master device in robotic surgery improves the surgeon's ability to control the interaction force applied to the tissue. Copyright © 2014 John Wiley & Sons, Ltd.
Jin, Seung-A Annie
2010-06-01
This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.
Haptic Feedback in Robot-Assisted Minimally Invasive Surgery
Okamura, Allison M.
2009-01-01
Purpose of Review Robot-assisted minimally invasive surgery (RMIS) holds great promise for improving the accuracy and dexterity of a surgeon while minimizing trauma to the patient. However, widespread clinical success with RMIS has been marginal. It is hypothesized that the lack of haptic (force and tactile) feedback presented to the surgeon is a limiting factor. This review explains the technical challenges of creating haptic feedback for robot-assisted surgery and provides recent results that evaluate the effectiveness of haptic feedback in mock surgical tasks. Recent Findings Haptic feedback systems for RMIS are still under development and evaluation. Most provide only force feedback, with limited fidelity. The major challenge at this time is sensing forces applied to the patient. A few tactile feedback systems for RMIS have been created, but their practicality for clinical implementation needs to be shown. It is particularly difficult to sense and display spatially distributed tactile information. The cost-benefit ratio for haptic feedback in RMIS has not been established. Summary The designs of existing commercial RMIS systems are not conducive for force feedback, and creative solutions are needed to create compelling tactile feedback systems. Surgeons, engineers, and neuroscientists should work together to develop effective solutions for haptic feedback in RMIS. PMID:19057225
A kinesthetic washout filter for force-feedback rendering.
Danieau, Fabien; Lecuyer, Anatole; Guillotel, Philippe; Fleureau, Julien; Mollet, Nicolas; Christie, Marc
2015-01-01
Today haptic feedback can be designed and associated to audiovisual content (haptic-audiovisuals or HAV). Although there are multiple means to create individual haptic effects, the issue of how to properly adapt such effects on force-feedback devices has not been addressed and is mostly a manual endeavor. We propose a new approach for the haptic rendering of HAV, based on a washout filter for force-feedback devices. A body model and an inverse kinematics algorithm simulate the user's kinesthetic perception. Then, the haptic rendering is adapted in order to handle transitions between haptic effects and to optimize the amplitude of effects regarding the device capabilities. Results of a user study show that this new haptic rendering can successfully improve the HAV experience.
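As a rough illustration of the washout idea behind this approach (let sustained force offsets decay so rendered effects stay within the device's limited force range), the sketch below applies a first-order high-pass washout filter to a commanded force signal. It is a minimal Python sketch of a generic washout filter, not the authors' body-model and inverse-kinematics formulation; the time constant and signals are placeholders.

```python
import numpy as np

def washout_filter(force_cmd, dt, tau=0.5):
    """First-order high-pass (washout) filter: transient force effects pass
    through, while sustained offsets decay toward zero so the force-feedback
    device stays within its limited range. tau is the washout time constant (s)."""
    alpha = tau / (tau + dt)
    out = np.zeros_like(force_cmd, dtype=float)
    prev_in, prev_out = 0.0, 0.0
    for i, f in enumerate(force_cmd):
        prev_out = alpha * (prev_out + f - prev_in)  # y[k] = a*(y[k-1] + x[k] - x[k-1])
        prev_in = f
        out[i] = prev_out
    return out

# A sustained 2 N effect produces a transient peak that then washes out.
t = np.arange(0.0, 3.0, 0.01)
cmd = np.where(t > 0.5, 2.0, 0.0)
rendered = washout_filter(cmd, dt=0.01)
```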
Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle.
Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L; Cutkosky, Mark R
2014-09-01
This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024).
A design of hardware haptic interface for gastrointestinal endoscopy simulation.
Gu, Yunjin; Lee, Doo Yong
2011-01-01
Gastrointestinal endoscopy simulations have been developed to train endoscopic procedures, which require hundreds of practice sessions to achieve competence. Although realistic haptic feedback is important for providing a realistic sensation to the user, most previous simulations, including commercial ones, have mainly focused on providing realistic visual feedback. In this paper, we propose a novel design of a portable haptic interface that provides 2-DOF force feedback for gastrointestinal endoscopy simulation. The haptic interface consists of translational and rotational force feedback mechanisms, which are completely decoupled, and a gripping mechanism that controls the connection between the endoscope and the force feedback mechanisms.
Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok
2015-06-01
In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface to provide the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integral-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish between the tumor and organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The effects of the image processing and haptic feedback on surgical performance are then evaluated. In this work, tumor-cutting experiments are conducted under four different operating conditions: haptic feedback on, haptic feedback off, image processing on, and image processing off. The experiments show that the performance index, which is a function of pixels, differs across the four operating conditions.
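The HSV-based tumor/organ discrimination mentioned above comes down to thresholding the image in hue-saturation-value space. The following is a minimal OpenCV sketch of that idea; the image path, threshold range, and morphological clean-up are illustrative placeholders, not values from the paper.

```python
import cv2
import numpy as np

def segment_by_hsv(bgr_image, lower_hsv, upper_hsv):
    """Return a binary mask of pixels whose HSV values fall inside the given
    range -- the basic operation behind separating tumor from organ by color.
    The threshold values passed in below are illustrative placeholders."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv, np.uint8), np.array(upper_hsv, np.uint8))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

frame = cv2.imread("endoscope_frame.png")       # hypothetical input frame
tumor_mask = segment_by_hsv(frame, (140, 60, 60), (170, 255, 255))
highlighted = cv2.bitwise_and(frame, frame, mask=tumor_mask)
```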
Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display
Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok
2008-01-01
This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module, together with empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance, indicating that it can satisfactorily represent Braille numbers for both sighted and blind users. To demonstrate the haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button was conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to a combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of the pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides a combination of force, tactile, and impact feedback, three haptic representation methods for texture display were compared on surfaces with three texture groups differing in direction, groove width, and shape. In addition, we evaluate the stylus's capacity to support touch-screen operations by providing tactile sensations when a user rubs an image displayed on a monitor. PMID:18317520
Mechatronic design of haptic forceps for robotic surgery.
Rizun, P; Gunn, D; Cox, B; Sutherland, G
2006-12-01
Haptic feedback increases operator performance and comfort during telerobotic manipulation. Feedback of grasping pressure is critical in many microsurgical tasks, yet no haptic interface for surgical tools is commercially available. Literature on the psychophysics of touch was reviewed to define the spectrum of human touch perception and the fidelity requirements of an ideal haptic interface. Mechanical design and control literature was reviewed to translate the psychophysical requirements into engineering specifications. High-fidelity haptic forceps were then developed through an iterative process between engineering and surgery. The forceps are a modular device that integrates with a haptic hand controller to add force feedback for tool actuation in telerobotic or virtual surgery. Their overall length is 153 mm and their mass is 125 g. A contact-free voice coil actuator generates force feedback at frequencies up to 800 Hz. Maximum force output is 6 N (2 N continuous) and the force resolution is 4 mN. The forceps employ a contact-free magnetic position sensor as well as micro-machined accelerometers to measure opening/closing acceleration. Position resolution is 0.6 μm with 1.3 μm RMS noise. The forceps can simulate stiffness greater than 20 N/mm or impedances smaller than 15 g with no noticeable haptic artifacts or friction. As telerobotic surgery evolves, haptics will play an increasingly important role. Copyright 2006 John Wiley & Sons, Ltd.
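Taking the reported specifications at face value (6 N peak, 2 N continuous, 4 mN force resolution, stiffness above 20 N/mm), a grasp-rendering loop for such forceps can be sketched as a saturated, quantized virtual spring. The structure below is an assumption for illustration only; the numbers come from the abstract, the control law does not.

```python
def render_grasp_force(opening_mm, contact_opening_mm, k_n_per_mm=20.0,
                       f_peak_n=6.0, resolution_n=0.004):
    """Virtual-stiffness rendering for haptic forceps: once the jaws close past
    the virtual contact opening, command a spring force k * penetration, clipped
    at the actuator's peak force and quantized to its force resolution."""
    penetration_mm = max(0.0, contact_opening_mm - opening_mm)
    force = min(k_n_per_mm * penetration_mm, f_peak_n)
    return round(force / resolution_n) * resolution_n

# Closing 0.1 mm past the virtual contact point of a 20 N/mm object -> about 2 N.
print(render_grasp_force(opening_mm=9.9, contact_opening_mm=10.0))
```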
Investigating Students' Ideas About Buoyancy and the Influence of Haptic Feedback
Minogue, James; Borland, David
2016-04-01
While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of everyday experiences, a scientifically sound explanation of buoyancy remains difficult for many to construct. It requires the integration of domain-specific knowledge regarding density, fluid, force, gravity, mass, weight, and buoyancy. Prior studies suggest that novices often focus on only one dimension of the sinking and floating phenomenon. Our HES was designed to promote the integration of the subconcepts of density and buoyant forces and stresses the relationship between the object itself and the surrounding fluid. The study employed a randomized pretest-posttest control group research design and a suite of measures, including an open-ended prompt and objective content questions, to provide insights into the influence of haptic feedback on undergraduate students' thinking about buoyancy. A convenience sample (n = 40) was drawn from a university's population of undergraduate elementary education majors. Two groups were formed: haptic feedback (n = 22) and no haptic feedback (n = 18). Through content analysis, discernible differences were seen in the posttest explanations of sinking and floating across treatment groups. Learners who experienced the haptic feedback made more frequent use of "haptically grounded" terms (e.g., mass, gravity, buoyant force, pushing), leading us to begin to build a local theory of language-mediated haptic cognition.
A study on haptic collaborative game in shared virtual environment
Lu, Keke; Liu, Guanyang; Liu, Lingzhi
2013-03-01
A study on collaborative gaming in a shared virtual environment with haptic feedback over computer networks is introduced in this paper. A collaborative task was used in which players located at remote sites played the game together. Unlike in traditional networked multiplayer games, players receive both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only and visual-haptic feedback. The goal of the experiment was to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback is beneficial for performance enhancement in collaborative games in shared virtual environments. The outcomes of this research can have a powerful impact on networked computer games.
van der Meijden, O A J; Schijven, M P
2009-06-01
Virtual reality (VR) as a surgical training tool has become a state-of-the-art technique in training and teaching skills for minimally invasive surgery (MIS). Although intuitively appealing, the true benefits of haptic (VR training) platforms are unknown. Many questions about haptic feedback in the different areas of surgical skills training need to be answered before adding costly haptic feedback to VR simulation for MIS training. This study was designed to review the current status and value of haptic feedback in conventional and robot-assisted MIS and training by using virtual reality simulation. A systematic review of the literature was undertaken using PubMed and MEDLINE. The following search terms were used: Haptic feedback OR Haptics OR Force feedback AND/OR Minimal Invasive Surgery AND/OR Minimal Access Surgery AND/OR Robotics AND/OR Robotic Surgery AND/OR Endoscopic Surgery AND/OR Virtual Reality AND/OR Simulation OR Surgical Training/Education. The results were assessed according to level of evidence as reflected by the Oxford Centre of Evidence-based Medicine Levels of Evidence. In the current literature, no firm consensus exists on the importance of haptic feedback in performing minimally invasive surgery. Although the majority of the results show positive assessment of the benefits of force feedback, results are ambivalent and not unanimous on the subject. Benefits are least disputed when related to surgery using robotics, because there is no haptic feedback in currently used robotic systems. The addition of haptics is believed to reduce surgical errors resulting from the lack of haptic feedback, especially in knot tying. Little research has been performed in the area of robot-assisted endoscopic surgical training, but results seem promising. Concerning VR training, results indicate that haptic feedback is important during the early phase of psychomotor skill acquisition.
Haptic feedback in OP:Sense - augmented reality in telemanipulated robotic surgery.
Beyl, T; Nicolai, P; Mönnich, H; Raczkowksy, J; Wörn, H
2012-01-01
In current research, haptic feedback in robot-assisted interventions plays an important role. However, most approaches to haptic feedback only regard the mapping of the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper, we describe how this feature is integrated into our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from pre-operatively generated medical images, or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest as well as help to prevent the risk of damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS systems.
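A forbidden-region virtual fixture of the kind described (virtual walls derived from segmented imaging data) can be reduced, in its simplest form, to a stiff planar wall that pushes the tool tip back when it crosses the boundary. The sketch below is a generic illustration under that assumption, not the OP:Sense implementation; the stiffness value is a placeholder.

```python
import numpy as np

def virtual_wall_force(tip_pos_m, plane_point_m, plane_normal, stiffness_n_per_m=500.0):
    """Forbidden-region virtual fixture as a stiff virtual wall: if the tool tip
    penetrates the plane (e.g. a boundary extracted from segmented images),
    return a repulsive force along the wall normal; otherwise return zero."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    penetration = float(np.dot(np.asarray(plane_point_m) - np.asarray(tip_pos_m), n))
    if penetration <= 0.0:          # tip is still on the safe side of the wall
        return np.zeros(3)
    return stiffness_n_per_m * penetration * n

# Tip 2 mm past a wall whose normal points back toward the safe region -> ~1 N push-back.
print(virtual_wall_force([0.0, 0.0, -0.002], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
```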
Yoon, Han U.; Anil Kumar, Namita; Hur, Pilwon
2017-01-01
Cutaneous sensory feedback can be used to provide additional sensory cues to a person performing a motor task in which vision is the dominant feedback signal. A haptic joystick has been widely used to guide a user by providing force feedback. However, the benefit of providing force feedback is still debatable due to its dependency on factors such as the user's skill level and task difficulty. Meanwhile, recent studies have shown the feasibility of improving motor task performance by providing skin-stretch feedback. Therefore, a combination of the two aforementioned feedback types is deemed promising for producing synergistic effects that consistently improve motor performance. In this study, we aimed to identify the effect of combined haptic and skin-stretch feedback on elderly drivers' motor performance. For the experiment, 15 healthy elderly subjects (age 72.8 ± 6.6 years) were recruited and were instructed to drive a virtual power-wheelchair through four different courses with obstacles. Four augmented sensory feedback conditions were tested: no feedback, force feedback, skin-stretch feedback, and combined force and skin-stretch feedback. While the haptic force was provided to the hand by the joystick, the skin stretch was provided to the steering forearm by a custom-designed wearable skin-stretch device. We tested two hypotheses: (i) an elderly individual's motor control would benefit from receiving information about a desired trajectory from multiple sensory feedback sources, and (ii) the benefit does not depend on task difficulty. Various metrics related to skills and safety were used to evaluate the control performance. Repeated-measures ANOVA was performed for those metrics with two factors: task scenario and type of augmented sensory feedback. The results revealed that elderly subjects' control performance significantly improved when the combination of haptic force and skin-stretch feedback was applied. The proposed approach suggests that task performance can be improved through the synergistic effects of multiple augmented sensory feedback modalities. PMID:28690514
Invited Article: A review of haptic optical tweezers for an interactive microworld exploration
Pacoret, Cécile; Régnier, Stéphane
2013-08-01
This paper is the first review of haptic optical tweezers, a new technique which associates force feedback teleoperation with optical tweezers. This technique allows users to explore the microworld by sensing and exerting picoNewton-scale forces with trapped microspheres. Haptic optical tweezers also allow improved dexterity of micromanipulation and micro-assembly. One of the challenges of this technique is to sense and magnify picoNewton-scale forces by a factor of 10¹² to enable human operators to perceive interactions that they have never experienced before, such as adhesion phenomena, extremely low inertia, and high frequency dynamics of extremely small objects. The design of optical tweezers for high quality haptic feedback is challenging, given the requirements for very high sensitivity and dynamic stability. The concept, design process, and specification of optical tweezers reviewed here are focused on those intended for haptic teleoperation. In this paper, two new specific designs as well as the current state-of-the-art are presented. Moreover, the remaining important issues are identified for further developments. The initial results obtained are promising and demonstrate that optical tweezers have a significant potential for haptic exploration of the microworld. Haptic optical tweezers will become an invaluable tool for force feedback micromanipulation of biological samples and nano- and micro-assembly parts.
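The central coupling in haptic optical tweezers is bilateral scaling: trap forces in the picoNewton range are amplified to human-scale forces (the 10¹² factor cited above), while hand motions are scaled down to micrometre-scale bead displacements. A minimal sketch of that coupling follows; the motion gain and the single-axis treatment are illustrative assumptions.

```python
FORCE_GAIN = 1.0e12    # pN-scale trap forces amplified to human-perceivable newtons
MOTION_GAIN = 1.0e-6   # hand displacements scaled down to micrometre bead motion

def couple(master_displacement_m, trap_force_n):
    """One step of scaled bilateral coupling for haptic optical tweezers:
    returns (commanded bead displacement, force sent to the haptic handle)."""
    slave_cmd_m = MOTION_GAIN * master_displacement_m
    haptic_force_n = FORCE_GAIN * trap_force_n
    return slave_cmd_m, haptic_force_n

# A 5 pN adhesion force on the trapped microsphere is felt as 5 N at the handle.
print(couple(0.01, 5.0e-12))
```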
Portable haptic interface with omni-directional movement and force capability.
Avizzano, Carlo Alberto; Satler, Massimo; Ruffaldi, Emanuele
2014-01-01
We describe the design of a new mobile haptic interface that employs wheels for force rendering. The interface, consisting of an omni-directional Killough-type platform, provides 2-DOF force feedback with different control modalities. The system autonomously performs sensor fusion for localization and force rendering. This paper explains the relevant choices concerning the functional aspects, the control design, and the mechanical and electronic solutions. Experimental results for force feedback characterization are reported.
Enhancing the Performance of Passive Teleoperation Systems via Cutaneous Feedback.
Pacchierotti, Claudio; Tirmizi, Asad; Bianchini, Gianni; Prattichizzo, Domenico
2015-01-01
We introduce a novel method to improve the performance of passive teleoperation systems with force reflection. It consists of integrating kinesthetic haptic feedback provided by common grounded haptic interfaces with cutaneous haptic feedback. The proposed approach can be used on top of any time-domain control technique that ensures a stable interaction by scaling down kinesthetic feedback when this is required to satisfy stability conditions (e.g., passivity) at the expense of transparency. Performance is recovered by providing a suitable amount of cutaneous force through custom wearable cutaneous devices. The viability of the proposed approach is demonstrated through an experiment of perceived stiffness and an experiment of teleoperated needle insertion in soft tissue.
Ehrampoosh, Shervin; Dave, Mohit; Kia, Michael A; Rablau, Corneliu; Zadeh, Mehrdad H
2013-01-01
This paper presents an enhanced haptic-enabled master-slave teleoperation system which can be used to provide force feedback to surgeons in minimally invasive surgery (MIS). One of the research goals was to develop a combined-control architecture framework that included both direct force reflection (DFR) and position-error-based (PEB) control strategies. To achieve this goal, it was essential to measure accurately the direct contact forces between deformable bodies and a robotic tool tip. To measure the forces at a surgical tool tip and enhance the performance of the teleoperation system, an optical force sensor was designed, prototyped, and added to a robot manipulator. The enhanced teleoperation architecture was formulated by developing mathematical models for the optical force sensor, the extended slave robot manipulator, and the combined-control strategy. Human factor studies were also conducted to (a) examine experimentally the performance of the enhanced teleoperation system with the optical force sensor, and (b) study human haptic perception during the identification of remote object deformability. The first experiment was carried out to discriminate deformability of objects when human subjects were in direct contact with deformable objects by means of a laparoscopic tool. The control parameters were then tuned based on the results of this experiment using a gain-scheduling method. The second experiment was conducted to study the effectiveness of the force feedback provided through the enhanced teleoperation system. The results show that the force feedback increased the ability of subjects to correctly identify materials of different deformable types. In addition, the virtual force feedback provided by the teleoperation system comes close to the real force feedback experienced in direct MIS. The experimental results provide design guidelines for choosing and validating the control architecture and the optical force sensor.
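The combined-control idea described above, direct force reflection (DFR) blended with position-error-based (PEB) feedback, can be written as a weighted sum of the measured tool-tip force and a virtual spring on the master-slave position error. The sketch below is illustrative only; the weighting and spring gain are placeholders that, in the paper, come from gain scheduling based on the human-factor experiments.

```python
def master_force_command(f_tip_n, x_master_m, x_slave_m, alpha=0.7, k_p=200.0):
    """Blend of the two strategies named in the abstract: direct force reflection
    of the optically sensed tool-tip force plus a position-error-based virtual
    spring between master and slave. alpha weights DFR against PEB."""
    f_dfr = f_tip_n
    f_peb = k_p * (x_slave_m - x_master_m)
    return alpha * f_dfr + (1.0 - alpha) * f_peb

# 1.5 N measured contact force with a 2 mm master-slave tracking error.
print(master_force_command(f_tip_n=1.5, x_master_m=0.010, x_slave_m=0.008))
```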
Performance evaluation of a robot-assisted catheter operating system with haptic feedback.
Song, Yu; Guo, Shuxiang; Yin, Xuanchun; Zhang, Linshuai; Hirata, Hideyuki; Ishihara, Hidenori; Tamiya, Takashi
2018-06-20
In this paper, a novel robot-assisted catheter operating system (RCOS) has been proposed as a method to reduce physical stress and X-ray exposure time for physicians during endovascular procedures. The unique design of this system allows the physician to apply conventional bedside catheterization skills (advance, retreat, and rotate) to an input catheter, which is placed at the master side to control another, patient catheter placed at the slave side. For this purpose, a magnetorheological (MR) fluid-based master haptic interface has been developed to measure the axial and radial motions of the input catheter, as well as to provide haptic feedback to the physician during the operation. In order to achieve a quick response of the haptic force in the master haptic interface, a Hall-sensor-based closed-loop control strategy is employed. On the slave side, a catheter manipulator is presented to deliver the patient catheter according to position commands received from the master haptic interface. The contact forces between the patient catheter and the blood vessel system are measured by the force sensor unit designed into the catheter manipulator. Four levels of haptic force are provided to make the operator aware of the resistance encountered by the patient catheter during the insertion procedure. The catheter manipulator was evaluated for positioning precision, and the time lag from sensed motion to replicated motion was tested. To verify the efficacy of the proposed haptic feedback method, in vitro evaluation experiments were carried out. The results demonstrate that the proposed system can decrease the contact forces between the catheter and the vasculature.
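The four-level haptic feedback described above is, in essence, a quantization of the measured catheter-vessel contact force into discrete force commands at the MR-fluid master. The sketch below illustrates that mapping; the thresholds and output forces are invented placeholders, not the levels used in the RCOS.

```python
import bisect

# Contact-force thresholds (N) separating the four haptic levels, and the force
# commanded at the master for each level -- illustrative values only.
THRESHOLDS_N = [0.05, 0.15, 0.30]
LEVEL_FORCE_N = [0.0, 0.5, 1.0, 2.0]

def master_haptic_force(contact_force_n):
    """Map the contact force measured at the slave catheter into one of four
    discrete haptic force levels rendered by the MR-fluid master interface."""
    level = bisect.bisect_right(THRESHOLDS_N, contact_force_n)
    return LEVEL_FORCE_N[level]

print(master_haptic_force(0.02), master_haptic_force(0.22), master_haptic_force(0.50))
```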
Haptic Foot Pedal: Influence of Shoe Type, Age, and Gender on Subjective Pulse Perception.
Geitner, Claudia; Birrell, Stewart; Krehl, Claudia; Jennings, Paul
2018-06-01
This study investigates the influence of shoe type (sneakers and safety boots), age, and gender on the perception of haptic pulse feedback provided by a prototype accelerator pedal in a running stationary vehicle. Haptic feedback can be a less distracting alternative to traditional visual and auditory in-vehicle feedback. However, to be effective, the device delivering the haptic feedback needs to be in contact with the person. Factors such as shoe type vary naturally over the seasons and could render feedback that is perceived well in one situation unnoticeable in another. In this study, we evaluate factors that can influence the subjective perception of haptic feedback in a stationary but running car: shoe type, age, and gender. Thirty-six drivers in three age groups (≤39, 40-59, and ≥60) took part. For each haptic feedback pulse, participants rated intensity, urgency, and comfort via a questionnaire. The perception of the haptic feedback is significantly influenced by the interaction between the pulse's duration and force amplitude and the participant's age and gender, but not by shoe type. The results indicate that it is important to consider different age groups and genders in the evaluation of haptic feedback. Future research might also look into approaches to adapt haptic feedback to the individual driver's preferences. Findings from this study can be applied to the design of an accelerator pedal in a car, for example for a nonvisual in-vehicle warning, but also to the planning of user studies with a haptic pedal in general.
Hwang, Donghyun; Lee, Jaemin; Kim, Keehoon
2017-10-01
This paper proposes a miniature haptic ring that can display touch/pressure and shearing force to the user's fingerpad. For practical use and wider application of the device, it is developed with the aim of achieving high wearability and mobility/portability as well as cutaneous force feedback functionality. The main body of the device is designed as a ring-shaped lightweight structure with a simple driving mechanism, and thin shape memory alloy (SMA) wires with high energy density are applied as actuating elements. Also, based on a band-type wireless control unit including a wireless data communication module, the whole device could be realized as a wearable mobile haptic device system. These features give the device several advantages in functional performance and provide users with significant usability. In this work, the proposed miniature haptic ring is systematically designed, and its performance is experimentally evaluated with a fabricated functional prototype. The experimental results demonstrate that the proposed device exhibits a higher force-to-weight ratio than conventional finger-wearable haptic devices for cutaneous force feedback. The results also show that the operational performance of the device is strongly influenced by the electro-thermomechanical behavior of the SMA actuators. In addition to the experiments for performance evaluation, we conducted a preliminary user test to assess practical feasibility and usability based on users' qualitative feedback.
Palpation simulator with stable haptic feedback.
Kim, Sang-Youn; Ryu, Jee-Hwan; Lee, WooJeong
2015-01-01
The main difficulty in constructing palpation simulators is computing and generating stable, realistic haptic feedback without vibration. When a user haptically interacts with highly non-homogeneous soft tissues through a palpation simulator, a sudden change of stiffness in the target tissues causes unstable interaction with the object. We propose a model consisting of a virtual adjustable damper and an energy measuring element. The energy measuring element gauges the energy stored in the palpation simulator, and the virtual adjustable damper dissipates that energy to achieve stable haptic interaction. To investigate the haptic behavior of the proposed method, impulse and continuous inputs are applied to the target tissues. If the haptic interface point meets the hardest portion of target tissues modeled with a conventional method, we observe unstable motion and feedback force. However, when the target tissues are modeled with the proposed method, the palpation simulator provides stable interaction without vibration. The proposed method overcomes a problem of conventional haptic palpation simulators, in which unstable force or vibration can be generated if there is a large discrepancy in material properties between an element and its neighboring elements in the target tissues.
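One way to read the "energy measuring element plus virtual adjustable damper" is as a passivity-observer/passivity-controller pair: track the net energy exchanged at the haptic interface and add just enough damping to dissipate any surplus the simulator generates. The sketch below follows that interpretation, which is an assumption on our part, not the paper's exact formulation.

```python
def stable_palpation_force(f_model_n, hand_velocity_mps, dt_s, state):
    """One haptic tick of an energy-measuring element plus virtual adjustable
    damper. state['energy'] tracks the net energy absorbed by the device; when
    it goes negative (the simulator has generated energy), a damper sized to
    remove the surplus is added to the commanded force."""
    damping = 0.0
    if state["energy"] < 0.0 and abs(hand_velocity_mps) > 1e-6:
        damping = -state["energy"] / (hand_velocity_mps ** 2 * dt_s)
    f_out = f_model_n - damping * hand_velocity_mps        # damper opposes motion
    state["energy"] += -f_out * hand_velocity_mps * dt_s   # + when the hand does work on the device
    return f_out

# Called once per haptic tick (e.g. 1 kHz) while probing the virtual tissue.
state = {"energy": 0.0}
f_cmd = stable_palpation_force(f_model_n=-3.0, hand_velocity_mps=0.02, dt_s=0.001, state=state)
```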
Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv
2008-01-01
This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback to the user through a haptic controller (Omni Phantom). A sensor-based assistive function and velocity-scaling program provides force feedback that helps the user complete trajectory-following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation; a camera and a laser range finder, controlled in real time by a PC, were integrated into the system to help the user define the intended path to the selected target. The real-time force feedback from the remote robot to the haptic controller is made possible by effective multithreading programming strategies in the control system design and by novel sensor integration. The sensor-based assistive function concept, applied to teleoperation as well as shared control, enhances the motion range and manipulation capabilities of users executing rehabilitation exercises such as trajectory following along a sensor-defined path. The system is modularly designed to allow for integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, one can use the haptic device for rehabilitation purposes in cases where assistance is needed to perform tasks (e.g., stroke rehabilitation) and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.
Overview Electrotactile Feedback for Enhancing Human Computer Interface
Pamungkas, Daniel S.; Caesarendra, Wahyu
2018-04-01
To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has been increasingly utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user's interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear and are only capable of delivering a limited amount of information to the user, which limits both their effectiveness and the applications to which they can be applied. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, it describes the sensory receptors within the skin that sense tactile stimuli and electric currents, as well as several factors that influence how the electrical signals are transmitted to the brain via the skin.
Effects of Grip-Force, Contact, and Acceleration Feedback on a Teleoperated Pick-and-Place Task.
Khurshid, Rebecca P; Fitter, Naomi T; Fedalei, Elizabeth A; Kuchenbecker, Katherine J
2017-01-01
The multifaceted human sense of touch is fundamental to direct manipulation, but technical challenges prevent most teleoperation systems from providing even a single modality of haptic feedback, such as force feedback. This paper postulates that ungrounded grip-force, fingertip-contact-and-pressure, and high-frequency acceleration haptic feedback will improve human performance of a teleoperated pick-and-place task. Thirty subjects used a teleoperation system consisting of a haptic device worn on the subject's right hand, a remote PR2 humanoid robot, and a Vicon motion capture system to move an object to a target location. Each subject completed the pick-and-place task 10 times under each of the eight haptic conditions obtained by turning on and off grip-force feedback, contact feedback, and acceleration feedback. To understand how object stiffness affects the utility of the feedback, half of the subjects completed the task with a flexible plastic cup, and the others used a rigid plastic block. The results indicate that the addition of grip-force feedback with gain switching enables subjects to hold both the flexible and rigid objects more stably, and it also allowed subjects who manipulated the rigid block to hold the object more delicately and to better control the motion of the remote robot's hand. Contact feedback improved the ability of subjects who manipulated the flexible cup to move the robot's arm in space, but it deteriorated this ability for subjects who manipulated the rigid block. Contact feedback also caused subjects to hold the flexible cup less stably, but the rigid block more securely. Finally, adding acceleration feedback slightly improved the subject's performance when setting the object down, as originally hypothesized; interestingly, it also allowed subjects to feel vibrations produced by the robot's motion, causing them to be more careful when completing the task. This study supports the utility of grip-force and high-frequency acceleration feedback in teleoperation systems and motivates further improvements to fingertip-contact-and-pressure feedback.
Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm.
Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae; Kim, Tae Il; Yi, Byung Ju
2017-01-01
Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We propose a master-slave robotic colonoscopy system that can be controlled from a remote site using a conventional colonoscope. The master and slave robots were developed to use a conventional flexible colonoscope. The robotic colonoscopic procedure was performed on a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. The slave robot was developed to hold the colonoscope and its knob and to perform insertion, rotation, and two tilting motions of the colonoscope. The master robot was designed to teach the motions of the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with reflection force and torque at the haptic device. The haptic sensation and feedback system was successful and helped the operator feel the constraint force and torque in the colon. The insertion time using the robotic system decreased with repeated procedures. This work proposes a robotic approach for colonoscopy using a haptic feedback algorithm, and this robotic device could effectively perform colonoscopy with reduced burden and comparable safety for patients at a remote site.
Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.
Pinzon, David; Byrns, Simon; Zheng, Bin
2016-08-01
Background The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from being attenuated in laparoscopy to being completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the current research questions that remain unanswered. Search Strategy An in-depth review of the literature was performed using PubMed. Results A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified, including advancements in one of the four components of haptic systems, evaluation of the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back, and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can increase performance, safety, and training efficiency. © The Author(s) 2016.
Evaluation of stiffness feedback for hard nodule identification on a phantom silicone model.
Li, Min; Konstantinova, Jelizaveta; Xu, Guanghua; He, Bo; Aminzadeh, Vahid; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar
2017-01-01
Haptic information in robotic surgery can significantly improve clinical outcomes and help detect hard soft-tissue inclusions that indicate potential abnormalities. Visual representation of tissue stiffness information is a cost-effective technique. Meanwhile, direct force feedback, although considerably more expensive than visual representation, is an intuitive method of conveying information regarding tissue stiffness to surgeons. In this study, real-time visual stiffness feedback by sliding indentation palpation is proposed, validated, and compared with force feedback involving human subjects. In an experimental tele-manipulation environment, a dynamically updated color map depicting the stiffness of probed soft tissue is presented via a graphical interface. The force feedback is provided, aided by a master haptic device. The haptic device uses data acquired from an F/T sensor attached to the end-effector of a tele-manipulated robot. Hard nodule detection performance is evaluated for 2 modes (force feedback and visual stiffness feedback) of stiffness feedback on an artificial organ containing buried stiff nodules. From this artificial organ, a virtual-environment tissue model is generated based on sliding indentation measurements. Employing this virtual-environment tissue model, we compare the performance of human participants in distinguishing differently sized hard nodules by force feedback and visual stiffness feedback. Results indicate that the proposed distributed visual representation of tissue stiffness can be used effectively for hard nodule identification. The representation can also be used as a sufficient substitute for force feedback in tissue palpation.
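The visual stiffness feedback mode amounts to accumulating per-location stiffness estimates from sliding indentation and rendering them as a dynamically updated color map. The sketch below shows one simple way to bin such estimates onto a grid for display; grid size, stiffness values, and colormap are illustrative, not the study's.

```python
import numpy as np
import matplotlib.pyplot as plt

def stiffness_colormap(positions_xy, stiffness_n_per_mm, grid_shape=(50, 50)):
    """Bin sparse stiffness estimates from sliding-indentation palpation onto a
    regular grid so they can be displayed as a color map in the GUI."""
    acc = np.zeros(grid_shape)
    counts = np.zeros(grid_shape)
    for (x, y), k in zip(positions_xy, stiffness_n_per_mm):
        i = int(np.clip(y * grid_shape[0], 0, grid_shape[0] - 1))
        j = int(np.clip(x * grid_shape[1], 0, grid_shape[1] - 1))
        acc[i, j] += k
        counts[i, j] += 1
    grid = np.full(grid_shape, np.nan)
    probed = counts > 0
    grid[probed] = acc[probed] / counts[probed]
    return grid

# Synthetic probing data over a unit square with a stiffer region in the centre.
pts = np.random.rand(500, 2)
stiff = 0.3 + 1.5 * np.exp(-((pts[:, 0] - 0.5) ** 2 + (pts[:, 1] - 0.5) ** 2) / 0.02)
plt.imshow(stiffness_colormap(pts, stiff), origin="lower", cmap="jet")
plt.colorbar(label="stiffness (N/mm)")
plt.show()
```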
Meli, Leonardo; Pacchierotti, Claudio; Prattichizzo, Domenico
2014-04-01
This study presents a novel approach to force feedback in robot-assisted surgery. It consists of substituting haptic stimuli, composed of a kinesthetic component and a skin deformation, with cutaneous stimuli only. The force generated can then be thought of as a subtraction between the complete haptic interaction (cutaneous plus kinesthetic) and its kinesthetic component. For this reason, we refer to this approach as sensory subtraction. Sensory subtraction aims at outperforming other non-kinesthetic feedback techniques in teleoperation (e.g., sensory substitution) while guaranteeing the stability and safety of the system. We tested the proposed approach in a challenging 7-DoF bimanual teleoperation task, similar to the Pegboard experiment of the da Vinci Skills Simulator. Sensory subtraction showed improved performance in terms of completion time, force exerted, and total displacement of the rings with respect to two popular sensory substitution techniques. Moreover, it guaranteed a stable interaction in the presence of a communication delay in the haptic loop.
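The subtraction at the core of this approach can be stated in one line: the cutaneous device renders the complete interaction force while the kinesthetic channel is removed. The toy sketch below only restates that split; the function and its arguments are illustrative scaffolding, not the authors' controller.

```python
def sensory_subtraction(f_interaction_n, kinesthetic_enabled=False):
    """Split a measured slave-side interaction force between the grounded
    (kinesthetic) device and the wearable cutaneous device. In sensory
    subtraction the kinesthetic channel is dropped, so the cutaneous device
    renders the complete haptic stimulus minus its kinesthetic component."""
    f_kinesthetic = f_interaction_n if kinesthetic_enabled else 0.0
    f_cutaneous = f_interaction_n - f_kinesthetic
    return f_kinesthetic, f_cutaneous

print(sensory_subtraction(1.2))        # (0.0, 1.2): all force rendered cutaneously
print(sensory_subtraction(1.2, True))  # (1.2, 0.0): conventional kinesthetic feedback
```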
Lin, Yanping; Chen, Huajiang; Yu, Dedong; Zhang, Ying; Yuan, Wen
2017-01-01
Bone drilling simulators with virtual and haptic feedback provide a safe, cost-effective, and repeatable alternative to traditional surgical training methods. To develop such a simulator, accurate haptic rendering based on a force model is required to feed back bone drilling forces in response to user input. Current predictive bone drilling force models, based on bovine bones under various drilling conditions and parameters, are not representative of the bone drilling process in bone surgery. The objective of this study was to provide a bone drilling force model for haptic rendering based on calibration and validation experiments in fresh cadaveric bones with different bone densities. Using a commonly used drill bit geometry (2 mm diameter), feed rates (20-60 mm/min), and spindle speeds (4000-6000 rpm) typical of orthognathic surgeries, the bone drilling forces of specimens from two groups were measured and the calibration coefficients of the specific normal and frictional pressures were determined. The comparison of the predicted forces with the measured forces from validation experiments over a large range of feed rates and spindle speeds demonstrates that the proposed model predicts the trends and average forces well. The presented bone drilling force model can be used for haptic rendering in surgical simulators.
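Mechanistic drilling-force models of this kind typically predict thrust force from the chip load per revolution scaled by calibrated specific pressures. The sketch below shows one common functional form under that assumption; both the form and the coefficient values are illustrative placeholders, not the calibrated model reported in the paper.

```python
def drilling_thrust_force(feed_mm_per_min, spindle_rpm, diameter_mm=2.0,
                          k_normal_mpa=600.0, k_friction_mpa=150.0):
    """Mechanistic thrust-force estimate for bone drilling: the uncut chip load
    per revolution (feed / speed) sets the chip area, which is scaled by
    calibrated specific normal and frictional pressures (MPa = N/mm^2)."""
    feed_per_rev_mm = feed_mm_per_min / spindle_rpm
    chip_area_mm2 = 0.5 * diameter_mm * feed_per_rev_mm     # shared by two cutting lips
    return (k_normal_mpa + k_friction_mpa) * chip_area_mm2  # thrust force in N

# Mid-range condition from the abstract: 40 mm/min feed at 5000 rpm with a 2 mm bit.
print(drilling_thrust_force(40.0, 5000.0))
```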
The Efficacy of Surface Haptics and Force Feedback in Education
Gorlewicz, Jenna Lynn
2013-01-01
This dissertation bridges the fields of haptics, engineering, and education to realize some of the potential benefits haptic devices may have in Science, Technology, Engineering, and Math (STEM) education. Specifically, this dissertation demonstrates the development, implementation, and assessment of two haptic devices in engineering and math…
Incorporating Haptic Feedback in Simulation for Learning Physics
Han, Insook; Black, John B.
2011-01-01
The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…
Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.
Gibson, Alison; Artemiadis, Panagiotis
2014-01-01
As the field of brain-machine interfaces and neuro-prosthetics continues to grow, there is a high need for sensor and actuation mechanisms that can provide haptic feedback to the user. Current technologies employ expensive, invasive, and often inefficient force feedback methods, resulting in an unrealistic solution for individuals who rely on these devices. This paper responds through the development, integration, and analysis of a novel feedback architecture in which haptic information during the neural control of a prosthetic hand is perceived through multi-frequency auditory signals. By representing force magnitude with volume and force location with frequency, the feedback architecture can translate the haptic experiences of a robotic end effector into the alternative sensory modality of sound. Previous research with the proposed cross-modal feedback method confirmed its learnability, so the current work aimed to investigate which frequency map (i.e., frequency-specific locations on the hand) is optimal in helping users distinguish between hand-held objects and tasks associated with them. After short use of the cross-modal feedback during electromyographic (EMG) control of a prosthetic hand, testing results show that users are able to use auditory feedback alone to discriminate between everyday objects. While users showed adaptation to three different frequency maps, the simplest map, containing only two frequencies, was found to be the most useful in discriminating between objects. This outcome supports the feasibility and practicality of the cross-modal feedback method during the neural control of prosthetics.
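The cross-modal mapping itself (force magnitude to volume, force location to frequency) is straightforward to sketch: each sensed contact region gets its own tone, whose amplitude tracks the measured force, and the tones are summed into one audio buffer. The two-tone map below echoes the abstract's finding that two frequencies worked best, but the specific frequencies, force range, and buffer length are placeholders.

```python
import numpy as np

# Hypothetical two-tone frequency map: one tone per contact region of the hand.
FREQ_MAP_HZ = {"thumb": 440.0, "fingers": 880.0}

def force_to_audio(forces_n, f_max_n=10.0, fs_hz=44100, duration_s=0.1):
    """Cross-modal feedback sketch: force magnitude sets the volume of the tone
    assigned to each contact location; the mixed buffer is ready for playback."""
    t = np.arange(int(fs_hz * duration_s)) / fs_hz
    audio = np.zeros_like(t)
    for location, force in forces_n.items():
        volume = np.clip(force / f_max_n, 0.0, 1.0)
        audio += volume * np.sin(2.0 * np.pi * FREQ_MAP_HZ[location] * t)
    return audio / max(len(forces_n), 1)

buffer = force_to_audio({"thumb": 2.0, "fingers": 6.5})
```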
Yang, Tae-Heon; Koo, Jeong-Hoi
2017-12-01
Humans can experience realistic and vivid haptic sensations through the sense of touch. In order to have a fully immersive haptic experience, both kinaesthetic and vibrotactile information must be presented to human users. Currently, little haptic research has been performed on small haptic actuators that can convey both vibrotactile feedback, with vibration frequencies up to the human-perceivable limit, and multiple levels of kinaesthetic feedback rapidly. Therefore, this study intends to design a miniature haptic device based on MR fluid and experimentally evaluate its ability to convey vibrotactile feedback up to 300 Hz along with kinaesthetic feedback. After constructing a prototype device, a series of tests was performed to evaluate the performance of the prototype using an experimental setup consisting of a precision dynamic mechanical analyzer and an accelerometer. The kinaesthetic testing results show that the prototype device can provide a force rate of up to 89% at 5 V (360 mA), which can be discretized into multiple levels of 'just noticeable difference' force rate, indicating that the device can convey a wide range of kinaesthetic sensations. To evaluate the high-frequency vibrotactile feedback performance of the device, its acceleration responses were measured and processed using FFT analysis. The results indicate that the device can convey high-frequency vibrotactile sensations up to 300 Hz with acceleration intensities large enough for humans to perceive.
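The vibrotactile evaluation (FFT of the accelerometer trace to confirm content up to 300 Hz) can be reproduced with a standard single-sided spectrum. The sketch below uses synthetic data in place of the measured trace; the sampling rate and signal are placeholders.

```python
import numpy as np

def acceleration_spectrum(accel, fs_hz):
    """Single-sided amplitude spectrum of an accelerometer trace, the kind of
    analysis used to check how much vibrotactile energy the actuator delivers
    at each frequency."""
    n = len(accel)
    amplitude = np.abs(np.fft.rfft(accel)) * 2.0 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs_hz)
    return freqs, amplitude

# Synthetic stand-in for a measured trace: a 300 Hz vibration sampled at 5 kHz.
fs = 5000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
accel = 0.8 * np.sin(2 * np.pi * 300.0 * t) + 0.05 * np.random.randn(len(t))
freqs, amp = acceleration_spectrum(accel, fs)
print(freqs[np.argmax(amp)])   # peak near 300 Hz
```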
The Role of Direct and Visual Force Feedback in Suturing Using a 7-DOF Dual-Arm Teleoperated System.
Talasaz, Ali; Trejos, Ana Luisa; Patel, Rajni V
2017-01-01
The lack of haptic feedback in robotics-assisted surgery can result in tissue damage or accidental tool-tissue hits. This paper focuses on exploring the effect of haptic feedback via direct force reflection and visual presentation of force magnitudes on performance during suturing in robotics-assisted minimally invasive surgery (RAMIS). For this purpose, a haptics-enabled dual-arm master-slave teleoperation system capable of measuring tool-tissue interaction forces in all seven Degrees-of-Freedom (DOFs) was used. Two suturing tasks, tissue puncturing and knot-tightening, were chosen to assess user skills when suturing on phantom tissue. Sixteen subjects participated in the trials and their performance was evaluated from various points of view: force consistency, number of accidental hits with tissue, amount of tissue damage, quality of the suture knot, and the time required to accomplish the task. According to the results, visual force feedback was not very useful during the tissue puncturing task as different users needed different amounts of force depending on the penetration of the needle into the tissue. Direct force feedback, however, was more useful for this task to apply less force and to minimize the amount of damage to the tissue. Statistical results also reveal that both visual and direct force feedback were required for effective knot tightening: direct force feedback could reduce the number of accidental hits with the tissue and also the amount of tissue damage, while visual force feedback could help to securely tighten the suture knots and maintain force consistency among different trials/users. These results provide evidence of the importance of 7-DOF force reflection when performing complex tasks in a RAMIS setting.
Human-computer interface including haptically controlled interactions
Anderson, Thomas G.
2005-10-11
The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
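A minimal sketch of the claimed force-to-scroll behaviour, assuming an illustrative dead-band, gain, and rate limit (none of which are specified in the patent abstract):

```python
# Illustrative force-to-scroll mapping for a haptic boundary (assumed constants).
DEADBAND_N = 0.3         # forces below this do not scroll
GAIN_LINES_PER_N = 8.0   # scroll speed per newton beyond the dead-band
MAX_LINES_PER_S = 40.0   # cap on scroll rate

def scroll_rate(applied_force_n):
    """Lines per second to scroll, given force pressed against the boundary."""
    excess = max(abs(applied_force_n) - DEADBAND_N, 0.0)
    rate = min(GAIN_LINES_PER_N * excess, MAX_LINES_PER_S)
    return rate if applied_force_n >= 0 else -rate

if __name__ == "__main__":
    for f in (0.1, 0.5, 2.0, -3.0):
        print(f, "->", scroll_rate(f), "lines/s")
```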
A real-time haptic interface for interventional radiology procedures.
Moix, Thomas; Ilic, Dejan; Fracheboud, Blaise; Zoethout, Jurjen; Bleuler, Hannes
2005-01-01
Interventional Radiology (IR) is a minimally invasive surgery (MIS) technique in which guidewires and catheters are steered through the vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be properly trained to master hand-eye coordination, instrument manipulation and procedure protocols. This paper proposes a computer-assisted training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the patient's anatomy linked to a robotic interface providing haptic force feedback. The paper focuses on the requirements, design and prototyping of a specific part of the haptic interface dedicated to catheters. Translational tracking and force feedback on the catheter are provided by two cylinders forming a friction drive arrangement. The whole friction drive can be set in rotation by an additional motor providing torque feedback. A force sensor and a torque sensor are integrated in the cylinders for direct measurement on the catheter, enabling disturbance cancellation with a closed-loop force control strategy.
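The closed-loop force control mentioned for disturbance cancellation could, in outline, look like the PI loop below; the gains, update rate, and the crude drive model are assumptions for illustration only.

```python
# Minimal PI force controller sketch for the catheter friction drive.
# Gains, time step, and actuator model are illustrative assumptions.

class PIForceController:
    def __init__(self, kp=2.0, ki=20.0, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, desired_force_n, measured_force_n):
        """Return a motor command that drives the measured force to the target."""
        error = desired_force_n - measured_force_n
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

if __name__ == "__main__":
    ctrl = PIForceController()
    measured = 0.0
    for _ in range(5):
        command = ctrl.update(desired_force_n=1.0, measured_force_n=measured)
        measured += 0.2 * command  # crude stand-in for the drive dynamics
        print(round(measured, 3))
```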
Haptic Paddle Enhancements and a Formal Assessment of Student Learning in System Dynamics
ERIC Educational Resources Information Center
Gorlewicz, Jenna L.; Kratchman, Louis B.; Webster, Robert J., III
2014-01-01
The haptic paddle is a force-feedback joystick used at several universities in teaching System Dynamics, a core mechanical engineering undergraduate course where students learn to model dynamic systems in several domains. A second goal of the haptic paddle is to increase the accessibility of robotics and haptics by providing a low-cost device for…
Kokes, Rebecca; Lister, Kevin; Gullapalli, Rao; Zhang, Bao; MacMillan, Alan; Richard, Howard; Desai, Jaydev P.
2009-01-01
Objective The purpose of this paper is to explore the feasibility of developing a MRI-compatible needle driver system for radiofrequency ablation (RFA) of breast tumors under continuous MRI imaging while being teleoperated by a haptic feedback device from outside the scanning room. The developed needle driver prototype was designed and tested for both tumor targeting capability as well as RFA. Methods The single degree-of-freedom (DOF) prototype was interfaced with a PHANToM haptic device controlled from outside the scanning room. Experiments were performed to demonstrate MRI-compatibility and position control accuracy with hydraulic actuation, along with an experiment to determine the PHANToM’s ability to guide the RFA tool to a tumor nodule within a phantom breast tissue model while continuously imaging within the MRI and receiving force feedback from the RFA tool. Results Hydraulic actuation is shown to be a feasible actuation technique for operation in an MRI environment. The design is MRI-compatible in all aspects except for force sensing in the directions perpendicular to the direction of motion. Experiments confirm that the user is able to detect healthy vs. cancerous tissue in a phantom model when provided with both visual (imaging) feedback and haptic feedback. Conclusion The teleoperated 1-DOF needle driver system presented in this paper demonstrates the feasibility of implementing a MRI-compatible robot for RFA of breast tumors with haptic feedback capability. PMID:19303805
Study on real-time force feedback for a master-slave interventional surgical robotic system.
Guo, Shuxiang; Wang, Yuan; Xiao, Nan; Li, Youxiang; Jiang, Yuhua
2018-04-13
In robot-assisted catheterization, haptic feedback is important, but is currently lacking. In addition, conventional interventional surgical robotic systems typically employ a master-slave architecture with an open-loop force feedback, which results in inaccurate control. We develop herein a novel real-time master-slave (RTMS) interventional surgical robotic system with a closed-loop force feedback that allows a surgeon to sense the true force during remote operation, provide adequate haptic feedback, and improve control accuracy in robot-assisted catheterization. As part of this system, we also design a unique master control handle that measures the true force felt by a surgeon, providing the basis for the closed-loop control of the entire system. We use theoretical and empirical methods to demonstrate that the proposed RTMS system provides a surgeon (using the master control handle) with a more accurate and realistic force sensation, which subsequently improves the precision of the master-slave manipulation. The experimental results show a substantial increase in the control accuracy of the force feedback and an increase in operational efficiency during surgery.
Design of a 7-DOF haptic master using magneto-rheological devices for robot surgery
NASA Astrophysics Data System (ADS)
Kang, Seok-Rae; Choi, Seung-Bok; Hwang, Yong-Hoon; Cha, Seung-Woo
2017-04-01
This paper presents a 7 degrees-of-freedom (7-DOF) haptic master applicable to robot-assisted minimally invasive surgery (RMIS). By utilizing a controllable magneto-rheological (MR) fluid, the haptic master can provide force information to the surgeon during surgery. The proposed haptic master comprises three translational motions (X, Y, Z) and four further motions (pitch, yaw, roll and grasping), all with force feedback capability. The haptic master generates repulsive forces or torques by activating an MR clutch and an MR brake. Both the MR clutch and the MR brake are designed and manufactured with consideration of the size and output torque required for robotic surgery. A proportional-integral-derivative (PID) controller is then designed and implemented to achieve torque/force tracking of the desired trajectories. It is verified that the proposed haptic master can closely track the desired torque and force arising at the surgical site by controlling the input current applied to the MR clutch and brake.
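A rough sketch of the described PID torque-tracking loop commanding the MR clutch/brake current is given below; the gains, the 360 mA current clamp, and the stand-in torque response are illustrative assumptions, not the authors' values.

```python
# Rough PID torque-tracking sketch for an MR clutch/brake (illustrative values).

class PIDTorqueController:
    def __init__(self, kp=1.5, ki=10.0, kd=0.001, dt=0.001, i_max=0.36):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.i_max = i_max          # A, assumed current limit (360 mA class device)
        self.integral = 0.0
        self.prev_error = 0.0

    def current_command(self, desired_torque, measured_torque):
        """PID law on torque error, clamped to the coil's current range."""
        error = desired_torque - measured_torque
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        current = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(0.0, min(current, self.i_max))  # MR devices only resist; clamp

if __name__ == "__main__":
    pid = PIDTorqueController()
    torque = 0.0
    for _ in range(5):
        i_cmd = pid.current_command(desired_torque=0.2, measured_torque=torque)
        torque += 0.5 * i_cmd  # crude stand-in for the MR torque response
        print(round(i_cmd, 4), round(torque, 4))
```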
Robot-assisted microsurgical forceps with haptic feedback for transoral laser microsurgery.
Deshpande, Nikhil; Chauhan, Manish; Pacchierotti, Claudio; Prattichizzo, Domenico; Caldwell, Darwin G; Mattos, Leonardo S
2016-08-01
In this paper, a novel, motorized, multi-degrees-of-freedom (DoF), microsurgical forceps tool is presented, which is based on a master-slave teleoperation architecture. The slave device is a 7-DoF manipulator with: (i) 6-DoF positioning and orientation, (ii) 1 open/close gripper DoF; and (iii) an integrated force/torque sensor for tissue grip-force measurement. The master device is a 7-DoF haptic interface which teleoperates the slave device, and provides haptic feedback in its gripper interface. The combination of the device and the surgeon interface replaces the manual, hand-held device providing easy-to-use and ergonomic tissue control, simplifying the surgical tasks. This makes the system suitable to real surgical scenarios in the operating room (OR). The performance of the system was analysed through the evaluation of teleoperation control and characterization of gripping force. The new system offers an overall positioning error of less than 400 μm demonstrating its safety and accuracy. Improved system precision, usability, and ergonomics point to the potential suitability of the device for the OR and its ability to advance haptic-feedback-enhanced transoral laser microsurgeries.
Human-Centered Design and Evaluation of Haptic Cueing for Teleoperation of Multiple Mobile Robots.
Son, Hyoung Il; Franchi, Antonio; Chuang, Lewis L; Kim, Junsuk; Bulthoff, Heinrich H; Giordano, Paolo Robuffo
2013-04-01
In this paper, we investigate the effect of haptic cueing on a human operator's performance in the field of bilateral teleoperation of multiple mobile robots, particularly multiple unmanned aerial vehicles (UAVs). Two aspects of human performance are deemed important in this area, namely, the maneuverability of mobile robots and the perceptual sensitivity of the remote environment. We introduce metrics that allow us to address these aspects in two psychophysical studies, which are reported here. Three fundamental haptic cue types were evaluated. The Force cue conveys information on the proximity of the commanded trajectory to obstacles in the remote environment. The Velocity cue represents the mismatch between the commanded and actual velocities of the UAVs and can implicitly provide a rich amount of information regarding the actual behavior of the UAVs. Finally, the Velocity+Force cue is a linear combination of the two. Our experimental results show that, while maneuverability is best supported by the Force cue feedback, perceptual sensitivity is best served by the Velocity cue feedback. In addition, we show that large gains in the haptic feedbacks do not always guarantee an enhancement in the teleoperator's performance.
Perception of force and stiffness in the presence of low-frequency haptic noise
Gurari, Netta; Okamura, Allison M.; Kuchenbecker, Katherine J.
2017-01-01
Objective This work lays the foundation for future research on quantitative modeling of human stiffness perception. Our goal was to develop a method by which a human’s ability to perceive suprathreshold haptic force stimuli and haptic stiffness stimuli can be affected by adding haptic noise. Methods Five human participants performed a same-different task with a one-degree-of-freedom force-feedback device. Participants used the right index finger to actively interact with variations of force (∼5 and ∼8 N) and stiffness (∼290 N/m) stimuli that included one of four scaled amounts of haptically rendered noise (None, Low, Medium, High). The haptic noise was zero-mean Gaussian white noise that was low-pass filtered with a 2 Hz cut-off frequency; the resulting low-frequency signal was added to the force rendered while the participant interacted with the force and stiffness stimuli. Results We found that the precision with which participants could identify the magnitude of both the force and stiffness stimuli was affected by the magnitude of the low-frequency haptically rendered noise added to the haptic stimulus, as well as the magnitude of the haptic stimulus itself. The Weber fraction strongly correlated with the standard deviation of the low-frequency haptic noise with a Pearson product-moment correlation coefficient of ρ > 0.83. The mean standard deviation of the low-frequency haptic noise in the haptic stimuli ranged from 0.184 N to 1.111 N across the four haptically rendered noise levels, and the corresponding mean Weber fractions spanned between 0.042 and 0.101. Conclusions The human ability to perceive both suprathreshold haptic force and stiffness stimuli degrades in the presence of added low-frequency haptic noise. Future work can use the reported methods to investigate how force perception and stiffness perception may relate, with possible applications in haptic watermarking and in the assessment of the functionality of peripheral pathways in individuals with haptic impairments. PMID:28575068
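The noise construction described above (zero-mean Gaussian white noise, low-pass filtered at 2 Hz, added to the rendered force) can be approximated with SciPy as follows; the update rate, filter order, and noise scaling are assumptions.

```python
import numpy as np
from scipy import signal

# Low-frequency haptic noise sketch: white Gaussian noise low-pass filtered
# at 2 Hz, then added to the rendered force. Rates and amplitudes are assumed.
fs = 1000.0                         # haptic update rate, Hz (assumed)
duration_s = 5.0
n = int(fs * duration_s)

white = np.random.normal(0.0, 1.0, n)          # zero-mean Gaussian white noise
b, a = signal.butter(2, 2.0 / (fs / 2.0))      # 2 Hz low-pass (2nd order, assumed)
low_freq_noise = signal.lfilter(b, a, white)

noise_std_target = 0.5                         # N, one of several assumed levels
low_freq_noise *= noise_std_target / low_freq_noise.std()

rendered_force = 5.0 + low_freq_noise          # ~5 N force stimulus plus noise
print(round(rendered_force.mean(), 2), round(low_freq_noise.std(), 2))
```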
Adaptive displays and controllers using alternative feedback.
Repperger, D W
2004-12-01
Investigations on the design of haptic (force reflecting joystick or force display) controllers were conducted by viewing the display of force information within the context of several different paradigms. First, using analogies from electrical and mechanical systems, certain schemes of the haptic interface were hypothesized which may improve the human-machine interaction with respect to various criteria. A discussion is given on how this interaction benefits the electrical and mechanical system. To generalize this concept to the design of human-machine interfaces, three studies with haptic mechanisms were then synthesized and analyzed.
Ergonomic evaluation of 3D plane positioning using a mouse and a haptic device.
Paul, Laurent; Cartiaux, Olivier; Docquier, Pierre-Louis; Banse, Xavier
2009-12-01
Preoperative planning and intraoperative assistance are needed to improve accuracy in tumour surgery. To be accepted, these processes must be efficient. An experiment was conducted to compare a mouse and a haptic device, with and without force feedback, to perform plan positioning in a 3D space. Ergonomics and performance factors were investigated during the experiment. Positioning strategies were observed. The task completion time, number of 3D orientations and failure rate were analysed. A questionnaire on ergonomics was filled out by each participant. The haptic device showed a significantly lower failure rate and was quicker and more ergonomic than the mouse. The force feedback was not beneficial to the accomplishment of the task. The haptic device is intuitive, ergonomic and more efficient than the mouse for positioning a 3D plane into a 3D space. Useful observations regarding positioning strategies will improve the integration of haptic devices into medical applications. Copyright (c) 2009 John Wiley & Sons, Ltd.
Effects of Visual Force Feedback on Robot-Assisted Surgical Task Performance
Reiley, Carol E.; Akinbiyi, Takintope; Burschka, Darius; Chang, David C.; Okamura, Allison M.; Yuh, David D.
2009-01-01
Background Direct haptic (force or tactile) feedback is negligible in current surgical robotic systems. The relevance of haptic feedback in robot-assisted performances of surgical tasks is controversial. We studied the effects of visual force feedback (VFF), a haptic feedback surrogate, on tying surgical knots with fine sutures similar to those used in cardiovascular surgery. Methods Using a modified da Vinci robotic system (Intuitive Surgical, Inc.) equipped with force-sensing instrument tips and real-time VFF overlays in the console image, ten surgeons each tied 10 knots with and 10 knots without VFF. Four surgeons had significant prior da Vinci experience while the remaining six surgeons did not. Performance parameters, including suture breakage and secure knots, peak and standard deviation of applied forces, and completion times using 5-0 silk sutures were recorded. Chi-square and Student’s t-test analyses determined differences between groups. Results Among surgeon subjects with robotic experience, no differences in measured performance parameters were found between robot-assisted knot ties executed with and without VFF. Among surgeons without robotic experience, however, VFF was associated with lower suture breakage rates, peak applied forces, and standard deviations of applied forces. VFF did not impart differences in knot completion times or loose knots for either surgeon group. Conclusions VFF resulted in reduced suture breakage, lower forces, and decreased force inconsistencies among novice robotic surgeons, although elapsed time and knot quality were unaffected. In contrast, VFF did not affect these metrics among experienced da Vinci surgeons. These results suggest that VFF primarily benefits novice robot-assisted surgeons, with diminishing benefits among experienced surgeons. PMID:18179942
Ottensmeyer, M P; Ben-Ur, E; Salisbury, J K
2000-01-01
Current efforts in surgical simulation very often focus on creating realistic graphical feedback, but neglect some or all tactile and force (haptic) feedback that a surgeon would normally receive. Simulations that do include haptic feedback do not typically use real tissue compliance properties, favoring estimates and user feedback to determine realism. When tissue compliance data are used, there are virtually no in vivo property measurements to draw upon. Together with the Center for Innovative Minimally Invasive Therapy at the Massachusetts General Hospital, the Haptics Group is developing tools to introduce more comprehensive haptic feedback in laparoscopy simulators and to provide biological tissue material property data for our software simulation. The platform for providing haptic feedback is a PHANToM Haptic Interface, produced by SensAble Technologies, Inc. Our devices supplement the PHANToM to provide for grasping and optionally, for the roll axis of the tool. Together with feedback from the PHANToM, which provides the pitch, yaw and thrust axes of a typical laparoscopy tool, we can recreate all of the haptic sensations experienced during laparoscopy. The devices integrate real laparoscopy toolhandles and a compliant torso model to complete the set of visual and tactile sensations. Biological tissues are known to exhibit non-linear mechanical properties, and change their properties dramatically when removed from a living organism. To measure the properties in vivo, two devices are being developed. The first is a small displacement, 1-D indenter. It will measure the linear tissue compliance (stiffness and damping) over a wide range of frequencies. These data will be used as inputs to a finite element or other model. The second device will be able to deflect tissues in 3-D over a larger range, so that the non-linearities due to changes in the tissue geometry will be measured. This will allow us to validate the performance of the model on large tissue deformations. Both devices are designed to pass through standard 12 mm laparoscopy trocars, and will be suitable for use during open or minimally invasive procedures. We plan to acquire data from pigs used by surgeons for training purposes, but conceivably, the tools could be refined for use on humans undergoing surgery. Our work will provide the necessary data input for surgical simulations to accurately model the force interactions that a surgeon would have with tissue, and will provide the force output to create a truly realistic simulation of minimally invasive surgery.
Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.
Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico
2017-01-01
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even though AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable device significantly improved performance in all the considered tasks. Moreover, subjects significantly preferred conditions providing wearable haptic feedback.
Physical Student-Robot Interaction with the ETHZ Haptic Paddle
ERIC Educational Resources Information Center
Gassert, R.; Metzger, J.; Leuenberger, K.; Popp, W. L.; Tucker, M. R.; Vigaru, B.; Zimmermann, R.; Lambercy, O.
2013-01-01
Haptic paddles--low-cost one-degree-of-freedom force feedback devices--have been used with great success at several universities throughout the US to teach the basic concepts of dynamic systems and physical human-robot interaction (pHRI) to students. The ETHZ haptic paddle was developed for a new pHRI course offered in the undergraduate…
Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces
NASA Astrophysics Data System (ADS)
Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana
Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual system capable of immersing the trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.
A perspective on the role and utility of haptic feedback in laparoscopic skills training.
Singapogu, Ravikiran; Burg, Timothy; Burg, Karen J L; Smith, Dane E; Eckenrode, Amanda H
2014-01-01
Laparoscopic surgery is a minimally invasive surgical technique with significant potential benefits to the patient, including shorter recovery time, less scarring, and decreased costs. There is a growing need to teach surgical trainees this emerging surgical technique. Simulators, ranging from simple "box" trainers to complex virtual reality (VR) trainers, have emerged as the most promising method for teaching basic laparoscopic surgical skills. Current box trainers require oversight from an expert surgeon for both training and assessing skills. VR trainers decrease the dependence on expert teachers during training by providing objective, real-time feedback and automatic skills evaluation. However, current VR trainers generally have limited credibility as a means to prepare new surgeons and have often fallen short of educators' expectations. Several researchers have speculated that the missing component in modern VR trainers is haptic feedback, which refers to the range of touch sensations encountered during surgery. These force types and ranges need to be adequately rendered by simulators for a more complete training experience. This article presents a perspective of the role and utility of haptic feedback during laparoscopic surgery and laparoscopic skills training by detailing the ranges and types of haptic sensations felt by the operating surgeon, along with quantitative studies of how this feedback is used. Further, a number of research studies that have documented human performance effects as a result of the presence of haptic feedback are critically reviewed. Finally, key research directions in using haptic feedback for laparoscopy training simulators are identified.
High-fidelity bilateral teleoperation systems and the effect of multimodal haptics.
Tavakoli, Mahdi; Aziminejad, Arash; Patel, Rajni V; Moallem, Mehrdad
2007-12-01
In master-slave teleoperation applications that deal with a delicate and sensitive environment, it is important to provide haptic feedback of slave/environment interactions to the user's hand as it improves task performance and teleoperation transparency (fidelity), which is the extent of telepresence of the remote environment available to the user through the master-slave system. For haptic teleoperation, in addition to a haptics-capable master interface, often one or more force sensors are also used, which warrant new bilateral control architectures while increasing the cost and the complexity of the teleoperation system. In this paper, we investigate the added benefits of using force sensors that measure hand/master and slave/environment interactions and of utilizing local feedback loops on the teleoperation transparency. We compare the two-channel and the four-channel bilateral control systems in terms of stability and transparency, and study the stability and performance robustness of the four-channel method against nonidealities that arise during bilateral control implementation, which include master-slave communication latency and changes in the environment dynamics. The next issue addressed in the paper deals with the case where the master interface is not haptics capable, but the slave is equipped with a force sensor. In the context of robotics-assisted soft-tissue surgical applications, we explore through human factors experiments whether slave/environment force measurements can be of any help with regard to improving task performance. The last problem we study is whether slave/environment force information, with and without haptic capability in the master interface, can help improve outcomes under degraded visual conditions.
OzBot and haptics: remote surveillance to physical presence
NASA Astrophysics Data System (ADS)
Mullins, James; Fielding, Mick; Nahavandi, Saeid
2009-05-01
This paper reports on robotic and haptic technologies and capabilities developed for the law enforcement and defence community within Australia by the Centre for Intelligent Systems Research (CISR). The OzBot series of small and medium surveillance robots have been designed in Australia and evaluated by law enforcement and defence personnel to determine suitability and ruggedness in a variety of environments. Using custom developed digital electronics and featuring expandable data busses including RS485, I2C, RS232, video and Ethernet, the robots can be directly connected to many off the shelf payloads such as gas sensors, x-ray sources and camera systems including thermal and night vision. Differentiating the OzBot platform from its peers is its ability to be integrated directly with haptic technology or the 'haptic bubble' developed by CISR. Haptic interfaces allow an operator to physically 'feel' remote environments through position-force control and experience realistic force feedback. By adding the capability to remotely grasp an object, feel its weight, texture and other physical properties in real-time from the remote ground control unit, an operator's situational awareness is greatly improved through Haptic augmentation in an environment where remote-system feedback is often limited.
Development of optical FBG force measurement system for the medical application
NASA Astrophysics Data System (ADS)
Song, Hoseok; Kim, Kiyoung; Suh, Jungwook; Lee, Jungju
2010-03-01
Haptic feedback plays a very important role in medical surgery. In minimally invasive surgery (MIS), however, the long, stiff shafts of the instruments take haptic sensation away from the surgeon, and in minimally invasive robotic surgery (MIRS) haptic sensation is eliminated entirely. Previous researchers have reported that the absence of force feedback increased the average force magnitude applied to the tissue by at least 50%, and increased the peak force magnitude by at least a factor of two. Therefore, it is very important to provide haptic information in MIRS. Many sensors are being developed for MIS or MIRS, but they face obstacles in their application to real surgical situations; the most critical problems are size limits and sterilizability. Optical fiber sensors are among the most suitable sensors for this environment. In particular, the optical fiber Bragg grating (FBG) sensor has one additional advantage over other optical fiber sensors: it is not influenced by the intensity of the light source. In this paper, we present the initial results of a study on the application of the FBG sensor to measure reflected forces in MIRS environments and suggest the possibility of its successful application to MIRS systems.
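For context, the standard FBG relation ties the Bragg wavelength shift to strain, which a calibration then maps to force. The sketch below uses that textbook relation with an assumed photo-elastic coefficient and a hypothetical force-calibration constant; it is not the authors' measurement code.

```python
# Textbook FBG strain relation applied to force sensing (illustrative constants).
# delta_lambda / lambda_B ~ (1 - p_e) * strain, with p_e ~ 0.22 for silica fibre.

P_E = 0.22                  # effective photo-elastic coefficient (typical silica value)
LAMBDA_B_NM = 1550.0        # nominal Bragg wavelength (assumed)
FORCE_PER_STRAIN_N = 2.0e4  # N per unit strain, hypothetical tool calibration

def wavelength_shift_to_force(delta_lambda_nm):
    """Convert a measured Bragg wavelength shift to an estimated tool force."""
    strain = (delta_lambda_nm / LAMBDA_B_NM) / (1.0 - P_E)
    return FORCE_PER_STRAIN_N * strain

if __name__ == "__main__":
    # A 0.12 nm shift under this assumed calibration corresponds to ~2 N.
    print(round(wavelength_shift_to_force(0.12), 2), "N")
```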
A “virtually minimal” visuo-haptic training of attention in severe traumatic brain injury
2013-01-01
Background Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. Methods We designed a “virtually minimal” approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. 21 inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. Results The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Conclusions Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing. PMID:23938101
A "virtually minimal" visuo-haptic training of attention in severe traumatic brain injury.
Dvorkin, Assaf Y; Ramaiya, Milan; Larson, Eric B; Zollman, Felise S; Hsu, Nancy; Pacini, Sonia; Shah, Amit; Patton, James L
2013-08-09
Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. We designed a "virtually minimal" approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. 21 inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing.
Dibble, Edward; Zivanovic, Aleksandar; Davies, Brian
2004-01-01
This paper presents the results of several early studies relating to human haptic perception sensitivity when probing a virtual object. A 1 degree of freedom (DoF) rotary haptic system, that was designed and built for this purpose, is also presented. The experiments were to assess the maximum forces applied in a minimally invasive surgery (MIS) procedure, quantify the compliance sensitivity threshold when probing virtual tissue and identify the haptic system loop rate necessary for haptic feedback to feel realistic.
Dennerlein, J T; Yang, M C
2001-01-01
Pointing devices, essential input tools for the graphical user interface (GUI) of desktop computers, require precise motor control and dexterity to use. Haptic force-feedback devices provide the human operator with tactile cues, adding the sense of touch to existing visual and auditory interfaces. However, the performance enhancements, comfort, and possible musculoskeletal loading of using a force-feedback device in an office environment are unknown. Hypothesizing that the time to perform a task and the self-reported pain and discomfort of the task improve with the addition of force feedback, 26 people ranging in age from 22 to 44 years performed a point-and-click task 540 times with and without an attractive force field surrounding the desired target. The point-and-click movements were approximately 25% faster with the addition of force feedback (paired t-tests, p < 0.001). Perceived user discomfort and pain, as measured through a questionnaire, were also smaller with the addition of force feedback (p < 0.001). However, this difference decreased as additional distracting force fields were added to the task environment, simulating a more realistic work situation. These results suggest that for a given task, use of a force-feedback device improves performance, and potentially reduces musculoskeletal loading during mouse use. Actual or potential applications of this research include human-computer interface design, specifically that of the pointing device extensively used for the graphical user interface.
A novel shape-changing haptic table-top display
NASA Astrophysics Data System (ADS)
Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi
2018-01-01
A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices have been developed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit is proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit is capable of extracting texture data from 2D pictures for rendering the shape-changing surface and realizing 3D modeling. The preliminary evaluation results proved the feasibility of the proposed system.
Using the PhysX engine for physics-based virtual surgery with force feedback.
Maciel, Anderson; Halic, Tansel; Lu, Zhonghua; Nedel, Luciana P; De, Suvranu
2009-09-01
The development of modern surgical simulators is highly challenging, as they must support complex simulation environments. The demand for higher realism in such simulators has driven researchers to adopt physics-based models, which are computationally very demanding. This poses a major problem, since real-time interactions must permit graphical updates of 30 Hz and a much higher rate of 1 kHz for force feedback (haptics). Recently several physics engines have been developed which offer multi-physics simulation capabilities, including rigid and deformable bodies, cloth and fluids. While such physics engines provide unique opportunities for the development of surgical simulators, their higher latencies, compared to what is necessary for real-time graphics and haptics, offer significant barriers to their use in interactive simulation environments. In this work, we propose solutions to this problem and demonstrate how a multimodal surgical simulation environment may be developed based on NVIDIA's PhysX physics library. Hence, models that are undergoing relatively low-frequency updates in PhysX can exist in an environment that demands much higher frequency updates for haptics. We use a collision handling layer to interface between the physical response provided by PhysX and the haptic rendering device to provide both real-time tissue response and force feedback. Our simulator integrates a bimanual haptic interface for force feedback and per-pixel shaders for graphics realism in real time. To demonstrate the effectiveness of our approach, we present the simulation of the laparoscopic adjustable gastric banding (LAGB) procedure as a case study. To develop complex and realistic surgical trainers with realistic organ geometries and tissue properties demands stable physics-based deformation methods, which are not always compatible with the interaction level required for such trainers. We have shown that combining different modelling strategies for behaviour, collision and graphics is possible and desirable. Such multimodal environments enable suitable rates to simulate the major steps of the LAGB procedure.
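One common way to bridge a low-rate physics engine and a 1 kHz haptic loop is a local contact model that is refreshed at the physics rate and queried at the haptic rate. The sketch below illustrates that general idea only; it is not the authors' PhysX collision-handling layer, and all names and numbers are assumptions.

```python
# Sketch of bridging a ~60 Hz physics update with a 1 kHz haptic loop using a
# simple local contact model (surface position and stiffness cached from the
# last physics step). All names and numbers here are illustrative assumptions.

class LocalContactModel:
    def __init__(self):
        self.surface_pos = 0.0   # contact surface position from last physics step
        self.stiffness = 400.0   # N/m, assumed local stiffness

    def update_from_physics(self, surface_pos, stiffness):
        """Called at the (slow) physics rate with the latest contact state."""
        self.surface_pos = surface_pos
        self.stiffness = stiffness

    def haptic_force(self, tool_pos):
        """Called at the (fast) haptic rate; simple penalty force on penetration."""
        penetration = self.surface_pos - tool_pos
        return self.stiffness * penetration if penetration > 0.0 else 0.0

if __name__ == "__main__":
    model = LocalContactModel()
    model.update_from_physics(surface_pos=0.01, stiffness=500.0)  # physics side
    # The haptic side queries the cached model many times between physics updates.
    for tool_pos in (0.012, 0.008, 0.005):
        print(round(model.haptic_force(tool_pos), 3))
```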
Integration of soft tissue model and open haptic device for medical training simulator
NASA Astrophysics Data System (ADS)
Akasum, G. F.; Ramdhania, L. N.; Suprijanto; Widyotriatmo, A.
2016-03-01
Minimally Invasive Surgery (MIS) is now widely used to perform many surgical procedures, and it has been applied in a number of cases in Indonesia. Needle insertion is a simple MIS procedure that can be used for several purposes. Before the needle insertion technique is used in a real situation, it is essential to train this skill in medical students. This research developed an open-platform needle insertion simulator with haptic feedback that provides the medical student with a realistic feel of the forces encountered during the actual procedure. There are three main steps in building the training simulator: configuring the hardware system, developing a program to create the soft tissue model, and integrating the hardware and software. To evaluate its performance, the haptic simulator was tested by 24 volunteers on a scenario with the soft tissue model. Each volunteer had to insert the needle into the simulator until reaching the target point, with visual feedback shown on the monitor. From the results it can be concluded that the soft tissue model conveys the sensation of touch through the force feedback perceived on the haptic actuator, producing different forces in accordance with the different stiffness of each layer.
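A minimal sketch of a layered stiffness model of the kind described, assuming a simple spring-per-layer force law with invented thicknesses and stiffness values:

```python
# Piecewise stiffness model for needle insertion through layered soft tissue.
# Layer thicknesses (m) and stiffnesses (N/m) are invented for illustration.
LAYERS = [
    ("skin",   0.002, 800.0),
    ("fat",    0.010, 200.0),
    ("muscle", 0.020, 500.0),
]

def insertion_force(depth_m):
    """Reaction force at a given needle depth (simple spring-per-layer model)."""
    force = 0.0
    remaining = depth_m
    for _name, thickness, stiffness in LAYERS:
        d = min(remaining, thickness)
        if d <= 0.0:
            break
        force += stiffness * d
        remaining -= d
    return force

if __name__ == "__main__":
    for depth in (0.001, 0.005, 0.025):
        print(depth, "m ->", round(insertion_force(depth), 2), "N")
```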
Haptic interface of the KAIST-Ewha colonoscopy simulator II.
Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young
2008-11-01
This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force and torque profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the results show that the developed haptic interface provides realistic haptic feedback for colonoscopy simulations.
Enhanced operator perception through 3D vision and haptic feedback
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Light, Kenneth; Bodenhamer, Andrew; Bosscher, Paul; Wilkinson, Loren
2012-06-01
Polaris Sensor Technologies (PST) has developed a stereo vision upgrade kit for TALON® robot systems comprised of a replacement gripper camera and a replacement mast zoom camera on the robot, and a replacement display in the Operator Control Unit (OCU). Harris Corporation has developed a haptic manipulation upgrade for TALON® robot systems comprised of a replacement arm and gripper and an OCU that provides haptic (force) feedback. PST and Harris have recently collaborated to integrate the 3D vision system with the haptic manipulation system. In multiple studies done at Fort Leonard Wood, Missouri it has been shown that 3D vision and haptics provide more intuitive perception of complicated scenery and improved robot arm control, allowing for improved mission performance and the potential for reduced time on target. This paper discusses the potential benefits of these enhancements to robotic systems used for the domestic homeland security mission.
[Haptic tracking control for minimally invasive robotic surgery].
Xu, Zhaohong; Song, Chengli; Wu, Wenwu
2012-06-01
Haptic feedback plays a significant role in minimally invasive robotic surgery (MIRS). A major deficiency of current MIRS, including the commercially available da Vinci surgical system, is the lack of haptic perception for the surgeon. In this paper, a dynamics model of a haptic robot is established based on the Newton-Euler method. Because solving the exact dynamics takes a significant amount of time, we used a digital PID algorithm informed by the robot dynamics to ensure real-time bilateral control, which improves tracking precision and real-time control efficiency. To validate the proposed method, an experimental system was developed in which two Novint Falcon haptic devices act as a master-slave system. Simulations and experiments showed that the proposed methods can provide instrument force feedback to the operator, and that the bilateral control strategy is an effective approach to master-slave MIRS. The proposed methods could be used in tele-robotic systems.
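A position-error-based bilateral loop of the general kind described can be sketched as follows; the gains, the unit-mass slave model, and the 1 kHz rate are illustrative assumptions rather than the authors' PID design.

```python
# Position-error-based bilateral control sketch (one axis). The slave tracks the
# master with a PD law, and the tracking error is reflected to the operator as
# force. Gains and the simple slave model are illustrative assumptions.

KP, KD = 60.0, 1.0         # slave tracking gains (assumed)
K_FEEDBACK = 40.0          # N/m, force reflected per metre of tracking error
DT = 0.001                 # 1 kHz loop (assumed)

def bilateral_step(master_pos, slave_pos, slave_vel):
    """One control step: returns (slave motor command, force felt on master)."""
    error = master_pos - slave_pos
    slave_cmd = KP * error - KD * slave_vel       # command to the slave motor
    operator_force = -K_FEEDBACK * error          # felt on the master handle
    return slave_cmd, operator_force

if __name__ == "__main__":
    slave_pos, slave_vel = 0.0, 0.0
    for step in range(3):
        cmd, f_op = bilateral_step(master_pos=0.02, slave_pos=slave_pos,
                                   slave_vel=slave_vel)
        slave_vel += cmd * DT                      # crude unit-mass slave model
        slave_pos += slave_vel * DT
        print(step, round(slave_pos, 6), round(f_op, 3))
```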
Do Haptic Representations Help Complex Molecular Learning?
ERIC Educational Resources Information Center
Bivall, Petter; Ainsworth, Shaaron; Tibell, Lena A. E.
2011-01-01
This study explored whether adding a haptic interface (that provides users with somatosensory information about virtual objects by force and tactile feedback) to a three-dimensional (3D) chemical model enhanced students' understanding of complex molecular interactions. Two modes of the model were compared in a between-groups pre- and posttest…
Feel, imagine and learn! - Haptic augmented simulation and embodied instruction in physics learning
NASA Astrophysics Data System (ADS)
Han, In Sook
The purpose of this study was to investigate the potential and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included a haptic augmented educational simulation as an instructional tool to provide perceptual experiences, as well as further instruction to activate those previous experiences through perceptual simulation. In order to verify the effectiveness of this instructional model, haptic augmented simulations with three different haptic levels (force and kinesthetic, kinesthetic, and non-haptic) and instructional materials (narrative and expository) were developed and their effectiveness tested. 220 fifth grade students were recruited to participate in the study from three elementary schools located in lower SES neighborhoods in the Bronx, New York. The study was conducted over three consecutive weeks during regular class periods. The data were analyzed using ANCOVA, ANOVA, and MANOVA. The results indicate that the haptic augmented simulations, both the force-and-kinesthetic and the kinesthetic versions, were more effective than the non-haptic simulation in providing perceptual experiences and helping elementary students create multimodal representations of the machines' movements. However, in most cases, force feedback was needed to construct a fully loaded multimodal representation that could be activated when instruction with fewer sensory modalities was given. In addition, the force-and-kinesthetic simulation was effective in providing cognitive grounding for comprehending new learning content based on the multimodal representation created with enhanced force feedback. Regarding instruction type, the narrative and expository instructions did not differ in activating previous perceptual experiences. These findings suggest that it is important to help students build a solid cognitive ground with a perceptual anchor. A sequential abstraction process would also deepen students' understanding by removing the sensory modalities one by one, giving students the opportunity to practice mental simulation and gradually reach an abstract level of understanding where they can imagine the machine's movements and working mechanisms through language alone, without any perceptual support.
Experimental Study on the Perception Characteristics of Haptic Texture by Multidimensional Scaling.
Wu, Juan; Li, Na; Liu, Wei; Song, Guangming; Zhang, Jun
2015-01-01
Recent works regarding real texture perception demonstrate that physical factors such as stiffness and spatial period play a fundamental role in texture perception. This research used a multidimensional scaling (MDS) analysis to further characterize and quantify the effects of the simulation parameters on haptic texture rendering and perception. In a pilot experiment, 12 haptic texture samples were generated by using a 3-degrees-of-freedom (3-DOF) force-feedback device with varying spatial period, height, and stiffness coefficient parameter values. The subjects' perceptions of the virtual textures indicate that roughness, denseness, flatness and hardness are distinguishing characteristics of texture. In the main experiment, 19 participants rated the dissimilarities of the textures and estimated the magnitudes of their characteristics. The MDS method was used to recover the underlying perceptual space and reveal its significance from the recorded data. The physical parameters and their combinations have significant effects on the perceptual characteristics. A regression model was used to quantitatively analyze the parameters and their effects on the perceptual characteristics. This paper illustrates that haptic texture perception based on force feedback can be modeled in a two- or three-dimensional space and provides suggestions for improving perception-based haptic texture rendering.
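The dissimilarity-rating analysis can be reproduced in outline with an off-the-shelf MDS implementation; the toy dissimilarity matrix below is invented for illustration (the study used 12 rendered textures and 19 raters).

```python
import numpy as np
from sklearn.manifold import MDS

# Toy dissimilarity matrix for four virtual textures (symmetric, zero diagonal).
# Values are invented for illustration only.
dissimilarity = np.array([
    [0.0, 2.0, 5.0, 6.0],
    [2.0, 0.0, 4.0, 5.5],
    [5.0, 4.0, 0.0, 1.5],
    [6.0, 5.5, 1.5, 0.0],
])

# Recover a 2-D perceptual space from the dissimilarities.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
print(np.round(coords, 2))
```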
Evaluation of haptic interfaces for simulation of drill vibration in virtual temporal bone surgery.
Ghasemloonia, Ahmad; Baxandall, Shalese; Zareinia, Kourosh; Lui, Justin T; Dort, Joseph C; Sutherland, Garnette R; Chan, Sonny
2016-11-01
Surgical training is evolving from an observership model towards a new paradigm that includes virtual-reality (VR) simulation. In otolaryngology, temporal bone dissection has become intimately linked with VR simulation, as the complexity of the anatomy demands a high level of surgeon aptitude and confidence. While an adequate 3D visualization of the surgical site is available in current simulators, the force feedback rendered during haptic interaction does not convey vibrations. This lack of vibration rendering limits the simulation fidelity of a surgical drill such as that used in temporal bone dissection. In order to develop an immersive simulation platform capable of haptic force and vibration feedback, the efficacy of hand controllers for rendering vibration in different drilling circumstances needs to be investigated. In this study, the vibration rendering ability of four different haptic hand controllers was analyzed and compared to find the best commercial haptic hand controller. A test rig was developed to record vibrations encountered during temporal bone dissection, and software was written to render the recorded signals without adding hardware to the system. An accelerometer mounted on the end-effector of each device recorded the rendered vibration signals. The newly recorded vibration signal was compared with the input signal in both the time and frequency domains by coherence and cross-correlation analyses to quantitatively measure the fidelity of these devices in rendering vibrotactile drilling feedback under different drilling conditions. This method can be used to assess vibration rendering ability in VR simulation systems and to select suitable haptic devices. Copyright © 2016 Elsevier Ltd. All rights reserved.
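The coherence and cross-correlation comparison described can be sketched with SciPy and NumPy as below; the synthetic signals, sampling rate, and lag are stand-ins, not the recorded drill vibrations.

```python
import numpy as np
from scipy import signal

# Compare a recorded drill-vibration signal with the signal re-rendered by a
# haptic device, via coherence and cross-correlation. Signals are synthetic
# stand-ins; sampling rate, frequency content, and lag are assumptions.
fs = 5000.0
t = np.arange(0, 1.0, 1.0 / fs)
recorded = np.sin(2 * np.pi * 180.0 * t) + 0.1 * np.random.randn(t.size)
rendered = 0.8 * np.roll(recorded, 10) + 0.05 * np.random.randn(t.size)

# Magnitude-squared coherence across frequency.
freqs, coh = signal.coherence(recorded, rendered, fs=fs, nperseg=1024)
print("mean coherence below 500 Hz:", round(coh[freqs < 500].mean(), 2))

# Normalized cross-correlation peak and its lag in samples.
xcorr = np.correlate(recorded - recorded.mean(), rendered - rendered.mean(), "full")
xcorr /= (np.std(recorded) * np.std(rendered) * recorded.size)
lag = xcorr.argmax() - (recorded.size - 1)
print("peak correlation:", round(xcorr.max(), 2), "at lag", lag, "samples")
```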
Prasad, M S Raghu; Manivannan, Muniyandi; Manoharan, Govindan; Chandramohan, S M
2016-01-01
Most of the commercially available virtual reality-based laparoscopic simulators do not effectively evaluate combined psychomotor and force-based laparoscopic skills. Consequently, the lack of training on these critical skills leads to intraoperative errors. To assess the effectiveness of the novel virtual reality-based simulator, this study analyzed the combined psychomotor (i.e., motion or movement) and force skills of residents and expert surgeons. The study also examined the effectiveness of real-time visual force feedback and tool motion during training. Bimanual fundamental (i.e., probing, pulling, sweeping, grasping, and twisting) and complex tasks (i.e., tissue dissection) were evaluated. In both tasks, visual feedback on applied force and tool motion were provided. The skills of the participants while performing the early tasks were assessed with and without visual feedback. Participants performed 5 repetitions of fundamental and complex tasks. Reaction force and instrument acceleration were used as metrics. Surgical Gastroenterology, Government Stanley Medical College and Hospital; Institute of Surgical Gastroenterology, Madras Medical College and Rajiv Gandhi Government General Hospital. Residents (N = 25; postgraduates and surgeons with <2 years of laparoscopic surgery) and expert surgeons (N = 25; surgeons with >4 and ≤10 years of laparoscopic surgery). Residents applied large forces compared with expert surgeons and performed abrupt tool movements (p < 0.001). However, visual + haptic feedback improved the performance of residents (p < 0.001). In complex tasks, visual + haptic feedback did not influence the applied force of expert surgeons, but influenced their tool motion (p < 0.001). Furthermore, in complex tissue sweeping task, expert surgeons applied more force, but were within the tissue damage limits. In both groups, exertion of large forces and abrupt tool motion were observed during grasping, probing or pulling, and tissue sweeping maneuvers (p < 0.001). Modern day curriculum-based training should evaluate the skills of residents with robust force and psychomotor-based exercises for proficient laparoscopy. Visual feedback on force and motion during training has the potential to enhance the learning curve of residents. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart
Kesner, Samuel B.; Howe, Robert D.
2011-01-01
Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body, owing to tissue motion and the transmission limitations of catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system. PMID:25285321
Design and Calibration of a New 6 DOF Haptic Device
Qin, Huanhuan; Song, Aiguo; Liu, Yuqing; Jiang, Guohua; Zhou, Bohe
2015-01-01
For many applications, such as tele-operated robots and interaction with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces that can also track human hand positions simultaneously. A new 6-DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly comprises a double parallel linkage, a rhombus linkage, a rotating mechanical structure, and a grasping interface. Benefiting from this unique design, it is a hybrid-structure device with a large workspace and high output capability, and it is therefore capable of multi-finger interaction. Moreover, with an adjustable base, operators can change posture without interrupting haptic tasks. To investigate the performance in terms of position-tracking accuracy and static output force, we conducted experiments using a three-dimensional electric sliding platform and a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To identify the capability and potential of the device, four application examples were programmed. PMID:26690449
Optimal haptic feedback control of artificial muscles
NASA Astrophysics Data System (ADS)
Chen, Daniel; Besier, Thor; Anderson, Iain; McKay, Thomas
2014-03-01
As our population ages and trends in obesity continue to grow, joint degenerative diseases like osteoarthritis (OA) are becoming increasingly prevalent. With no cure currently in sight, the only effective treatments for OA are orthopaedic surgery and prolonged rehabilitation, neither of which is guaranteed to succeed. Gait retraining has tremendous potential to alter the joint contact forces due to walking, reducing the risk of developing hip and knee OA. Dielectric Elastomer Actuators (DEAs) are being explored as a potential way of applying intuitive haptic feedback to alter a patient's walking gait. The main challenge with the use of DEAs in this application is producing forces and strains large enough to induce sensation when coupled to a patient's skin. A novel controller has been proposed to solve this issue. The controller uses simultaneous capacitive self-sensing and actuation to optimally apply a haptic sensation to the patient's skin, independent of variability in DEAs and patient geometries.
User Acceptance of a Haptic Interface for Learning Anatomy
ERIC Educational Resources Information Center
Yeom, Soonja; Choi-Lundberg, Derek; Fluck, Andrew; Sale, Arthur
2013-01-01
Visualizing the structure and relationships in three dimensions (3D) of organs is a challenge for students of anatomy. To provide an alternative way of learning anatomy engaging multiple senses, we are developing a force-feedback (haptic) interface for manipulation of 3D virtual organs, using design research methodology, with iterations of system…
Real-time mandibular angle reduction surgical simulation with haptic rendering.
Wang, Qiong; Chen, Hui; Wu, Wen; Jin, Hai-Yang; Heng, Pheng-Ann
2012-11-01
Mandibular angle reduction is a popular and efficient procedure widely used to alter the facial contour. The primary surgical instruments employed in the surgery, the reciprocating saw and the round burr, share a common feature: they operate at high speed. Inexperienced surgeons generally need long practice to learn how to minimize the risks caused by uncontrolled contacts and cutting motions when manipulating instruments with high-speed reciprocation or rotation. A virtual reality-based surgical simulator for mandibular angle reduction was designed and implemented on a CUDA-based platform in this paper. High-fidelity visual and haptic feedback is provided to enhance perception in a realistic virtual surgical environment. Impulse-based haptic models were employed to simulate the contact forces and torques on the instruments, providing convincing haptic sensation for surgeons controlling the instruments at different reciprocation or rotation velocities. Real-time methods for bone removal and reconstruction during surgical procedures are proposed to support realistic visual feedback. The simulated contact forces were verified by comparison against actual force data measured through the constructed mechanical platform. An empirical study based on patient-specific data was conducted to evaluate the ability of the proposed system to train surgeons with various levels of experience. The results confirm the validity of our simulator.
A pervasive visual-haptic framework for virtual delivery training.
Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V
2010-03-01
Thanks to the advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on the fingers and arms, thus enabling much more realistic manipulation with respect to stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints but, instead, is approached by formal modeling of the whole labor and of the assistance/intervention procedures by means of a timed automata network, applied to a parametric 3-D model of the anatomy able to mimic a wide range of configurations. This novel methodology is able not only to represent the sequence of the main events associated with either a spontaneous or an operative childbirth, but also to help validate the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results as well as the challenges still unaddressed is included.
Chae, Sanghoon; Jung, Sung-Weon
2018-01-01
A survey of 67 experienced orthopedic surgeons indicated that precise portal placement is the most important skill in arthroscopic surgery. However, none of the currently available virtual reality simulators includes simulation/training in portal placement, including haptic feedback of the necessary puncture force. This study aimed to: (1) measure the in vivo force and stiffness during a portal placement procedure in an actual operating room and (2) implement active haptic simulation of a portal placement procedure using the measured in vivo data. We measured the force required for portal placement and the stiffness of the joint capsule during portal placement procedures performed by an experienced arthroscopic surgeon. Based on the acquired mechanical property values, we developed a cable-driven active haptic simulator designed to train the portal placement skill and evaluated the validity of the simulated haptics. Ten patients diagnosed with rotator cuff tears were enrolled in this experiment. The maximum peak force and joint capsule stiffness during posterior portal placement were 66.46 (±10.76) N and 2560.82 (±252.92) N/m, respectively. We then designed an active haptic simulator using the acquired data. Our cable-driven mechanism had a friction force of 3.763 ± 0.341 N, less than 6% of the mean puncture force. Simulator performance was evaluated by comparing the target stiffness and force with the stiffness and force reproduced by the device. R-squared values were 0.998 for puncture force replication and 0.902 for stiffness replication, indicating that the in vivo data can be used to implement a realistic haptic simulator. PMID:29494691
A 3-RSR Haptic Wearable Device for Rendering Fingertip Contact Forces.
Leonardis, Daniele; Solazzi, Massimiliano; Bortone, Ilaria; Frisoli, Antonio
2017-01-01
A novel wearable haptic device for modulating contact forces at the fingertip is presented. Rendering of forces by skin deformation in three degrees of freedom (DoF), with contact/no-contact capability, was implemented through rigid parallel kinematics. The novel asymmetrical three revolute-spherical-revolute (3-RSR) configuration allows compact dimensions with minimum encumbrance of the hand workspace. The device was designed to render constant-to-low-frequency deformation of the fingerpad in three DoF, combining light weight with relatively high output forces. A differential method for solving the non-trivial inverse kinematics is proposed and implemented in real time for controlling the device. The first experiment evaluated discrimination of different fingerpad stretch directions in a group of five subjects. The second experiment, enrolling 19 subjects, evaluated cutaneous feedback in a virtual pick-and-place manipulation task. The stiffness of the fingerpad plus device was measured and used to calibrate the physics of the virtual environment. The third experiment, with 10 subjects, evaluated interaction forces in a virtual lift-and-hold task. Although performance differed between the two manipulation experiments, overall results show that participants controlled interaction forces better when the cutaneous feedback was active, with significant differences between the visual and visuo-haptic conditions.
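The differential inverse-kinematics step mentioned above can be illustrated with a generic damped-least-squares update. The sketch below uses a placeholder forward-kinematics map rather than the actual 3-RSR geometry; the damping factor, tolerance, and iteration count are assumptions.

```python
# Generic differential (Jacobian-based) inverse kinematics sketch; the 3-RSR
# device itself is not modelled, and forward_kinematics() is a placeholder.
import numpy as np

def forward_kinematics(q):
    # Placeholder FK: maps joint angles q (3,) to a platform position (3,).
    return np.array([np.cos(q[0]) + np.cos(q[1]),
                     np.sin(q[0]) + np.sin(q[2]),
                     q[1] + q[2]])

def numerical_jacobian(q, eps=1e-6):
    J = np.zeros((3, 3))
    for i in range(3):
        dq = np.zeros(3); dq[i] = eps
        J[:, i] = (forward_kinematics(q + dq) - forward_kinematics(q - dq)) / (2 * eps)
    return J

def differential_ik(x_target, q, iters=50, damping=1e-3):
    # Damped least-squares update: dq = J^T (J J^T + lambda I)^-1 * error
    for _ in range(iters):
        err = x_target - forward_kinematics(q)
        J = numerical_jacobian(q)
        dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(3), err)
        q = q + dq
        if np.linalg.norm(err) < 1e-6:
            break
    return q

q0 = np.array([0.1, 0.2, 0.3])
print(differential_ik(forward_kinematics(q0) + 0.01, q0))
```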
Co-located haptic and 3D graphic interface for medical simulations.
Berkelman, Peter; Miyasaka, Muneaki; Bozlee, Sebastian
2013-01-01
We describe a system which provides high-fidelity haptic feedback in the same physical location as a 3D graphical display, in order to enable realistic physical interaction with virtual anatomical tissue during modelled procedures such as needle driving, palpation, and other interventions performed using handheld instruments. The haptic feedback is produced by the interaction between an array of coils located behind a thin flat LCD screen, and permanent magnets embedded in the instrument held by the user. The coil and magnet configuration permits arbitrary forces and torques to be generated on the instrument in real time according to the dynamics of the simulated tissue by activating the coils in combination. A rigid-body motion tracker provides position and orientation feedback of the handheld instrument to the computer simulation, and the 3D display is produced using LCD shutter glasses and a head-tracking system for the user.
Haptic Guidance Improves the Visuo-Manual Tracking of Trajectories
Bluteau, Jérémy; Coquillart, Sabine; Payan, Yohan; Gentaz, Edouard
2008-01-01
Background: Learning to perform new movements is usually achieved by following visual demonstrations. Haptic guidance by a force feedback device is a recent and original technology which provides additional proprioceptive cues during visuo-motor learning tasks. The effects of two types of haptic guidance, control in position (HGP) or in force (HGF), on visuo-manual tracking ("following") of trajectories are still under debate. Methodology/Principal Findings: Three training techniques of haptic guidance (HGP, HGF, or a control condition without haptic guidance, NHG) were evaluated in two experiments. Movements produced by adults were assessed in terms of shape (dynamic time warping) and kinematic criteria (number of velocity peaks and mean velocity) before and after the training sessions. Trajectories consisted of two Arabic and two Japanese-inspired letters in Experiment 1 and ellipses in Experiment 2. We observed that the use of HGF globally improves the fluency of visuo-manual tracking of trajectories, while no significant improvement was found for HGP or NHG. Conclusion/Significance: These results show that the addition of haptic information, probably encoded in force coordinates, plays a crucial role in the visuo-manual tracking of new trajectories. PMID:18335049
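A minimal version of the dynamic time warping (DTW) shape criterion named above can be sketched as follows; the template and produced trajectories are synthetic stand-ins, not the study's letter or ellipse data.

```python
# Minimal DTW distance between a produced 2-D trajectory and a template shape.
import numpy as np

def dtw_distance(a, b):
    # a: (N, 2) produced trajectory, b: (M, 2) template trajectory.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 2 * np.pi, 100)
template = np.column_stack([np.cos(t), 0.5 * np.sin(t)])       # ideal ellipse
produced = template + 0.03 * np.random.randn(*template.shape)  # noisy user trace
print(f"DTW distance: {dtw_distance(produced, template):.3f}")
```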
Graphic and haptic simulation system for virtual laparoscopic rectum surgery.
Pan, Jun J; Chang, Jian; Yang, Xiaosong; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas
2011-09-01
Medical simulators with vision and haptic feedback techniques offer a cost-effective and efficient alternative to traditional medical training. They have been used to train doctors in many specialties of medicine, allowing tasks to be practised in a safe and repetitive manner. This paper describes a virtual-reality (VR) system intended to improve surgeons' learning curves in the technically challenging field of laparoscopic surgery of the rectum. Data from MRI of the rectum and real operation videos are used to construct the virtual models. A haptic force filter based on radial basis functions is designed to offer realistic and smooth force feedback. To handle collision detection efficiently, a hybrid model is presented to compute the deformation of the intestines. Finally, a real-time mesh-based cutting technique is employed to represent the incision operation. Despite numerous research efforts, fast and realistic simulation of soft tissues with large deformation, such as the intestines, remains extremely challenging. This paper introduces our latest contribution to this endeavour. With this system, the user can haptically operate on the virtual rectum while watching the soft-tissue deformation. Our system has been tested by colorectal surgeons, who believe that the simulated tactile and visual feedback is realistic. It could replace the traditional training process and effectively transfer surgical skills to novices. Copyright © 2011 John Wiley & Sons, Ltd.
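The radial-basis-function force filter is described above only at a high level; the sketch below shows one plausible reading, smoothing a noisy force signal with an RBF fit. The kernel, shape parameter, smoothing value, and sample data are assumptions, not the authors' design.

```python
# Sketch of RBF-based smoothing of a noisy force signal before it is rendered.
import numpy as np
from scipy.interpolate import RBFInterpolator

t = np.linspace(0.0, 1.0, 200)[:, None]            # time samples [s], shape (N, 1)
raw_force = np.sin(4 * np.pi * t[:, 0]) + 0.15 * np.random.randn(t.shape[0])

# Gaussian RBF fit with a nonzero smoothing term acts as a low-pass force filter.
rbf = RBFInterpolator(t, raw_force, kernel="gaussian", epsilon=20.0, smoothing=1.0)
smooth_force = rbf(t)                               # filtered force fed to the haptic device

print(f"raw std: {raw_force.std():.3f}, smoothed std: {smooth_force.std():.3f}")
```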
The effect of force feedback on student reasoning about gravity, mass, force and motion
NASA Astrophysics Data System (ADS)
Bussell, Linda
The purpose of this study was to examine whether force feedback within a computer simulation had an effect on reasoning by fifth grade students about gravity, mass, force, and motion, concepts which can be difficult for learners to grasp. Few studies have been done on cognitive learning and haptic feedback, particularly with young learners, but there is an extensive base of literature on children's conceptions of science and a number of studies focus specifically on children's conceptions of force and motion. This case study used a computer-based paddleball simulation with guided inquiry as the primary stimulus. Within the simulation, the learner could adjust the mass of the ball and the gravitational force. The experimental group used the simulation with visual and force feedback; the control group used the simulation with visual feedback but without force feedback. The proposition was that there would be differences in reasoning between the experimental and control groups, with force feedback being helpful with concepts that are more obvious when felt. Participants were 34 fifth-grade students from three schools. Students completed a modal (visual, auditory, and haptic) learning preference assessment and a pretest. The sessions, including participant experimentation and interviews, were audio recorded and observed. The interviews were followed by a written posttest. These data were analyzed to determine whether there were differences based on treatment, learning style, demographics, prior gaming experience, force feedback experience, or prior knowledge. Work with the simulation, regardless of group, was found to increase students' understanding of key concepts. The experimental group appeared to benefit from the supplementary help that force feedback provided. Those in the experimental group scored higher on the posttest than those in the control group. The greatest difference between mean group scores was on a question concerning the effects of increased gravitational force.
Mechanical model of orthopaedic drilling for augmented-haptics-based training.
Pourkand, Ashkan; Zamani, Naghmeh; Grow, David
2017-10-01
In this study, augmented-haptic feedback is used to combine a physical object with virtual elements in order to simulate anatomic variability in bone. This requires generating levels of force/torque consistent with clinical bone drilling, which exceed the capabilities of commercially available haptic devices. Accurate total force generation is facilitated by a predictive model of axial force during simulated orthopaedic drilling. This model is informed by kinematic data collected while drilling into synthetic bone samples using an instrumented linkage attached to the orthopaedic drill. Axial force is measured using a force sensor incorporated into the bone fixture. A nonlinear function, relating force to axial position and velocity, was used to fit the data. The normalized root-mean-square error (RMSE) of forces predicted by the model compared to those measured experimentally was 0.11 N across various bones with significant differences in geometry and density. This suggests that a predictive model can be used to capture relevant variations in the thickness and hardness of cortical and cancellous bone. The practical performance of this approach is measured using the Phantom Premium haptic device, with some required customizations. Copyright © 2017 Elsevier Ltd. All rights reserved.
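The model-fitting step described above, a nonlinear function relating axial force to position and velocity evaluated by a normalized RMSE, can be sketched as follows. The chosen model form and the synthetic data are assumptions, not the authors' identified model.

```python
# Sketch: fit a nonlinear axial drilling-force model F(x, v) and report a
# normalized RMSE between predicted and "measured" forces.
import numpy as np
from scipy.optimize import curve_fit

def force_model(xv, a, b, c):
    x, v = xv
    return a * x**2 + b * v + c * x * v       # assumed nonlinear form

rng = np.random.default_rng(0)
x = rng.uniform(0, 0.02, 500)                  # axial position [m]
v = rng.uniform(0, 0.01, 500)                  # axial velocity [m/s]
f_meas = force_model((x, v), 4e4, 800.0, 2e4) + rng.normal(0, 0.2, 500)

popt, _ = curve_fit(force_model, (x, v), f_meas)
f_pred = force_model((x, v), *popt)
rmse = np.sqrt(np.mean((f_pred - f_meas) ** 2))
nrmse = rmse / (f_meas.max() - f_meas.min())
print(f"RMSE = {rmse:.3f} N, normalized RMSE = {nrmse:.4f}")
```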
Shared control of a medical robot with haptic guidance.
Xiong, Linfei; Chng, Chin Boon; Chui, Chee Kong; Yu, Peiwu; Li, Yao
2017-01-01
Tele-operation of robotic surgery reduces radiation exposure during interventional radiological operations. However, endoscope vision without force feedback on the surgical tool increases the difficulty of precise manipulation and the risk of tissue damage. The shared control of vision and force provides a novel approach of enhanced control with haptic guidance, which could lead to subtle dexterity and better maneuverability during MIS surgery. The paper presents an innovative shared control method for a robotic minimally invasive surgery system, in which vision and haptic feedback are incorporated to provide guidance cues to the clinician during surgery. The incremental potential field (IPF) method is utilized to generate a guidance path based on the anatomy of the tissue and the surgical tool interaction. Haptic guidance is provided at the master end to assist the clinician during tele-operated surgical robotic tasks. The approach has been validated with path-following and virtual tumor-targeting experiments. The experimental results demonstrate that, compared with vision-only guidance, the shared control with vision and haptics improved the accuracy and efficiency of surgical robotic manipulation, reducing the tool-position error distance and execution time. The validation experiments demonstrate that the shared control approach could help the surgical robot system provide stable assistance and precise performance in executing the designated surgical task. The methodology could also be implemented with other surgical robots with different surgical tools and applications.
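The abstract names an incremental potential field (IPF) method for generating guidance cues; the sketch below shows a generic attractive/repulsive potential-field guidance force in the same spirit, not the IPF formulation itself. Gains and geometry are illustrative assumptions.

```python
# Generic potential-field guidance force: attractive pull toward a path point
# plus a repulsive push that ramps up near the tissue to be avoided.
import numpy as np

def guidance_force(tool_pos, path_point, obstacle_pos,
                   k_att=50.0, k_rep=5.0, rho0=0.01):
    # Attractive spring toward the desired path point.
    f = k_att * (path_point - tool_pos)
    # Repulsive push, ramping up linearly as the tool approaches the obstacle.
    away = tool_pos - obstacle_pos
    d = np.linalg.norm(away)
    if d < rho0:
        f += k_rep * (rho0 - d) / rho0 * (away / d)
    return f

tool = np.array([0.010, 0.002, 0.000])
target = np.array([0.012, 0.000, 0.000])
tissue = np.array([0.011, 0.003, 0.000])
print(guidance_force(tool, target, tissue))   # guidance force rendered at the master [N]
```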
Effects of kinesthetic and cutaneous stimulation during the learning of a viscous force field.
Rosati, Giulio; Oscari, Fabio; Pacchierotti, Claudio; Prattichizzo, Domenico
2014-01-01
Haptic stimulation can help humans learn perceptual motor skills, but the precise way in which it influences the learning process has not yet been clarified. This study investigates the role of the kinesthetic and cutaneous components of haptic feedback during the learning of a viscous curl field, also taking into account the influence of visual feedback. We present the results of an experiment in which 17 subjects were asked to make reaching movements while grasping a joystick and wearing a pair of cutaneous devices. Each device was able to provide cutaneous contact forces through a moving platform. The subjects received visual feedback about the joystick's position. During the experiment, the system delivered a perturbation through (1) full haptic stimulation, (2) kinesthetic stimulation alone, (3) cutaneous stimulation alone, (4) altered visual feedback, or (5) altered visual feedback plus cutaneous stimulation. Conditions 1, 2, and 3 were also tested with cancellation of the visual feedback of position error. Results indicate that kinesthetic stimuli played a primary role during motor adaptation to the viscous field, which is a fundamental premise for motor learning and rehabilitation. On the other hand, cutaneous stimulation alone appeared not to produce significant direct or adaptation effects, although it helped reduce direct effects when used in addition to kinesthetic stimulation. The experimental conditions with visual cancellation of position error showed slower adaptation rates, indicating that visual feedback actively contributes to the formation of internal models. However, modest learning effects were detected when the visual information was used to render the viscous field.
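The viscous curl field used in this class of adaptation experiments is typically a velocity-dependent force rotated 90 degrees from the hand velocity; a minimal sketch, with an assumed gain, is:

```python
# Viscous curl field: F = B * [[0, 1], [-1, 0]] * v, i.e. a force proportional
# to hand speed but perpendicular to the movement direction.
import numpy as np

B = 15.0                                   # curl-field gain [N·s/m], assumed
curl = np.array([[0.0, 1.0],
                 [-1.0, 0.0]])

def curl_field_force(velocity):
    # velocity: 2-D hand velocity [m/s]; returns the perturbing force [N].
    return B * curl @ velocity

v = np.array([0.25, 0.05])                 # example reaching velocity
print(curl_field_force(v))                 # force pushes perpendicular to the motion
```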
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony L. Crawford
2012-08-01
Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in remote and/or hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space, to name a few. To this end, the research presented in this paper has developed an admittance-type, exoskeleton-like, multi-fingered haptic hand user interface that secures the user's palm and provides 3-dimensional force feedback to the user's fingertips. Unlike conventional haptic hand user interfaces, which limit themselves to integrating the human hand's characteristics only into the system's mechanical design, this system also carries that inspiration into the user interface's controller. This is achieved by manifesting the property differences between manipulation and grasping activities, as they pertain to the human hand, in a nonlinear master-slave force relationship. The results presented in this paper show that the admittance-type system has sufficient bandwidth that it appears nearly transparent to the user when the user is in free motion, and that, when the system is subjected to a manipulation task, increased performance is achieved using the nonlinear force relationship compared to the traditional linear scaling techniques implemented in the vast majority of systems.
A haptic unit designed for magnetic-resonance-guided biopsy.
Tse, Z T H; Elhawary, H; Rea, M; Young, I; Davis, B L; Lamperth, M
2009-02-01
The magnetic fields present in the magnetic resonance (MR) environment impose severe constraints on any mechatronic device present in its midst, requiring alternative actuators, sensors, and materials to those conventionally used in traditional system engineering. In addition the spatial constraints of closed-bore scanners require a physical separation between the radiologist and the imaged region of the patient. This configuration produces a loss of the sense of touch from the target anatomy for the clinician, which often provides useful information. To recover the force feedback from the tissue, an MR-compatible haptic unit, designed to be integrated with a five-degrees-of-freedom mechatronic system for MR-guided prostate biopsy, has been developed which incorporates position control and force feedback to the operator. The haptic unit is designed to be located inside the scanner isocentre with the master console in the control room. MR compatibility of the device has been demonstrated, showing a negligible degradation of the signal-to-noise ratio and virtually no geometric distortion. By combining information from the position encoder and force sensor, tissue stiffness measurement along the needle trajectory is demonstrated in a lamb liver to aid diagnosis of suspected cancerous tissue.
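The stiffness measurement described above, combining the position encoder and force sensor along the needle trajectory, can be illustrated by taking the local slope of a force-displacement curve. The data, the piecewise tissue model, and the window size below are synthetic assumptions.

```python
# Sketch: estimate local tissue stiffness as the slope dF/dx of the
# force-displacement curve, fitted over a sliding window along the needle path.
import numpy as np

x = np.linspace(0.0, 0.06, 600)                                   # needle depth [m]
force = np.where(x < 0.03, 200.0 * x, 6.0 + 900.0 * (x - 0.03))   # softer then stiffer region
force += 0.05 * np.random.randn(x.size)                           # sensor noise

window = 25
stiffness = np.array([
    np.polyfit(x[i:i + window], force[i:i + window], 1)[0]        # straight-line slope [N/m]
    for i in range(x.size - window)
])
print(f"stiffness near 10 mm: {stiffness[np.searchsorted(x, 0.010)]:.0f} N/m")
print(f"stiffness near 50 mm: {stiffness[np.searchsorted(x, 0.050)]:.0f} N/m")
```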
Chen, Guan-Chun; Lin, Chia-Hung; Hsieh, Kai-Sheng; Du, Yi-Chun; Chen, Tainsong
2015-01-01
This study proposes a virtual-reality (VR) simulator system for double interventional cardiac catheterization (ICC) using a fractional-order vascular access tracker and a haptic force producer. Endoscopes and catheters for the diagnosis and surgery of cardiovascular disease are commonly used in minimally invasive surgery. Operating a Berman catheter and a pigtail catheter inside the human body while avoiding damage to vessels requires specific skills and experience from young surgeons and postgraduate year (PGY) students. To improve training in inserting catheters, a double-catheter mechanism is designed for the ICC procedures. A fractional-order vascular access tracker is used to trace the senior surgeons' console trajectories and to transmit frictional and visual feedback during the insertion of catheters. Based on the clinical feel of passing through the aortic arch or a vein into the ventricle, or through tortuous blood vessels, the haptic force producer is used to mimic the elasticity of the vessel wall using voice coil motors (VCMs). The VR environment, built from the surgeons' console vessel trajectories and hand feel, is achieved, and the experimental results show its effectiveness for the double ICC procedures. PMID:26171419
Haptic simulation framework for determining virtual dental occlusion.
Wu, Wen; Chen, Hui; Cen, Yuhai; Hong, Yang; Khambay, Balvinder; Heng, Pheng Ann
2017-04-01
The surgical treatment of many dentofacial deformities is often complex owing to its three-dimensional nature. Determining the dental occlusion in its most stable position is essential for the success of the treatment. Computer-aided virtual planning on an individualized, patient-specific 3D model can help formulate the surgical plan and predict the surgical change. However, in current computer-aided planning systems it is not possible to determine the dental occlusion of the digital models intuitively during virtual surgical planning because of the absence of haptic feedback. In this paper, a physically based haptic simulation framework is proposed which can provide surgeons with intuitive haptic feedback to determine the dental occlusion of the digital models in their most stable position. To provide physically realistic force feedback when the dental models contact each other during the searching process, a contact model is proposed to describe the dynamic and collision properties of the dental models during alignment. The simulated impulse/contact-based forces are integrated into the unified simulation framework. A validation study was conducted on fifteen sets of virtual dental models chosen at random and covering a wide range of the dental relationships found clinically. The dental occlusions obtained by an expert were employed as a benchmark against which to compare the virtual occlusion results. The mean translational and angular deviations of the virtual occlusion results from the benchmark were small. The experimental results show the validity of our method. The simulated forces can provide valuable insights for determining virtual dental occlusion. The findings of this work and the validation of the proposed concept lead the way towards full virtual surgical planning on patient-specific virtual models, allowing fully customized treatment plans for the surgical correction of dentofacial deformities.
Augmented kinematic feedback from haptic virtual reality for dental skill acquisition.
Suebnukarn, Siriwan; Haddawy, Peter; Rhienmora, Phattanapon; Jittimanee, Pannapa; Viratket, Piyanuch
2010-12-01
We have developed a haptic virtual reality system for dental skill training. In this study we examined whether several kinds of kinematic information about the movement, provided by the system to supplement knowledge of results (KR), aid dental skill acquisition. The kinematic variables examined involved force utilization (F) and mirror view (M). This created three experimental conditions that received augmented kinematic feedback (F, M, FM) and one control condition that did not (KR-only). Thirty-two dental students were randomly assigned to the four groups. Their task was to perform access opening on the upper first molar with the haptic virtual reality system. The acquisition session consisted of two days of ten practice trials, in which augmented kinematic feedback was provided for the appropriate experimental conditions after each trial. One week later, a retention test consisting of two trials without augmented feedback was completed. The results showed that the augmented kinematic feedback groups had larger mean performance scores than the KR-only group on Day 1 of the acquisition session and in the retention session (ANOVA, p<0.05). The apparent differences among feedback groups were not significant on Day 2 of the acquisition session (ANOVA, p>0.05). The trends in the acquisition and retention sessions suggest that augmented kinematic feedback can enhance performance early in skill acquisition and in retention.
Development of a novel haptic glove for improving finger dexterity in poststroke rehabilitation.
Lin, Chi-Ying; Tsai, Chia-Min; Shih, Pei-Cheng; Wu, Hsiao-Ching
2015-01-01
Almost all stroke patients experience a certain degree of fine motor impairment, and impeded finger movement may limit activities of daily life. Thus, to improve the quality of life of stroke patients, designing an efficient training device for fine motor rehabilitation is crucial. This study aimed to develop a novel fine motor training glove that integrates a virtual-reality-based interactive environment with vibrotactile feedback for more effective poststroke hand rehabilitation. The proposed haptic rehabilitation device is equipped with small DC vibration motors for vibrotactile feedback stimulation and piezoresistive thin-film force sensors for motor function evaluation. Two virtual-reality-based games, "gopher hitting" and "musical note hitting", were developed as the haptic interface. Following the designed rehabilitation program, patients intuitively push with and exercise their fingers to improve finger isolation function. Preliminary tests were conducted to assess the feasibility of the developed haptic rehabilitation system and to identify design concerns regarding practical use in future clinical testing.
NASA Astrophysics Data System (ADS)
Jones, M. G.; Andre, T.; Kubasko, D.; Bokinsky, A.; Tretter, T.; Negishi, A.; Taylor, R.; Superfine, R.
2004-01-01
This study examined hands-on experiences in the context of an investigation of viruses and explored how and why hands-on experiences may be effective. We sought to understand whether or not touching and manipulating materials and objects could lead to a deeper, more effective type of knowing than that we obtain from sight or sound alone. Four classes of high school biology students and four classes of seventh graders participated in the study that examined students' use of remote microscopy with a new scientific tool called the nanoManipulator, which enabled them to reach out and touch live viruses inside an atomic force microscope. Half of the students received full haptic (tactile and kinesthetic) feedback from a haptic joystick, whereas half of the students were able to use the haptic joystick to manipulate viruses but the tactile feedback was blocked. Results showed that there were significant gains from pre- to postinstruction across treatment groups for knowledge and attitudes. Students in both treatment groups developed conceptual models of viruses that were more consistent with current scientific research, including a move from a two-dimensional to a three-dimensional understanding of virus morphology. There were significant changes in students' understandings of scale; after instruction, students were more likely to identify examples of nanosized objects and be able to describe the degree to which a human would have to be shrunk to reach the size of a virus. Students who received full-haptic feedback had significantly better attitudes suggesting that the increased sensory feedback and stimulation may have made the experience more engaging and motivating to students.
A haptic sensing upgrade for the current EOD robotic fleet
NASA Astrophysics Data System (ADS)
Rowe, Patrick
2014-06-01
The past decade and a half has seen a tremendous rise in the use of mobile manipulator robotic platforms for bomb inspection and disposal, explosive ordnance disposal, and other extremely hazardous tasks in both military and civilian settings. Skilled operators are able to control these robotic vehicles in amazing ways given the very limited situational awareness obtained from a few on-board camera views. Future generations of robotic platforms will, no doubt, provide some sort of additional force or haptic sensor feedback to further enhance the operator's interaction with the robot, especially when dealing with fragile, unstable, and explosive objects. Unfortunately, the robot operators need this capability today. This paper discusses an approach to provide existing (and future) robotic mobile manipulator platforms, with which trained operators are already familiar and highly proficient, with the desired haptic and force feedback capability. The goals of this technology are to be rugged, reliable, and affordable. It should also be applicable to a wide range of existing robots with a wide variety of manipulator/gripper sizes and styles. Finally, the presentation of the haptic information to the operator is discussed, given that control devices that physically interact with the operator are not widely available and are still in the research stage.
Integration of Haptics in Agricultural Robotics
NASA Astrophysics Data System (ADS)
Kannan Megalingam, Rajesh; Sreekanth, M. M.; Sivanantham, Vinu; Sai Kumar, K.; Ghanta, Sriharsha; Surya Teja, P.; Reddy, Rajesh G.
2017-08-01
Robots can be categorized as open-loop or closed-loop systems, and the absence of feedback from the robot causes many problems. In this research paper, we discuss possibilities for achieving a complete closed-loop system for the multiple-DOF robotic arm used in a coconut-tree climbing and cutting robot by introducing a haptic device. We investigate various sensors, such as tactile, vibration, force, and proximity sensors, for obtaining feedback. The robotic arm is monitored through graphical user interface software that simulates its operation, returns the real-time analog values produced by the various sensors, and provides real-time graphs for estimating the efficiency of the robot.
Research of the master-slave robot surgical system with the function of force feedback.
Shi, Yunyong; Zhou, Chaozheng; Xie, Le; Chen, Yongjun; Jiang, Jun; Zhang, Zhenfeng; Deng, Ze
2017-12-01
Surgical robots lack force feedback, which may lead to operation errors. In order to improve surgical outcomes, this research developed a new master-slave surgical robot designed with an integrated force sensor. The new structure designed for the master-slave robot employs a force feedback mechanism. A six-dimensional force sensor was mounted on the tip of the slave robot's actuator. Sliding mode control was adopted to control the slave robot. According to the movement of the master system manipulated by the surgeon, the slave's movement and the force feedback function were validated. The motion was completed, the standard deviation was calculated, and the force data were recorded. Hence, force feedback was realized in the experiment. The surgical robot can help surgeons complete trajectory motions with haptic sensation. Copyright © 2017 John Wiley & Sons, Ltd.
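Sliding mode control of the slave is only named above; the following single-joint sketch shows the general technique, with an assumed joint inertia, gains, and a tanh boundary layer in place of the discontinuous switching term. It is not the authors' controller.

```python
# Single-joint sliding mode control sketch: the slave tracks a sinusoidal master
# command despite an unmodelled friction disturbance.
import numpy as np

dt, steps = 0.001, 3000
lam, K, phi = 20.0, 2.0, 0.1       # surface slope, switching gain, boundary layer
m_eff = 0.05                       # assumed effective joint inertia [kg·m^2]

q, qd = 0.0, 0.0                   # slave joint angle [rad] and velocity [rad/s]
errs = []
for k in range(steps):
    t = k * dt
    w = 2 * np.pi * 0.5
    q_des = 0.5 * np.sin(w * t)                       # master command
    qd_des = 0.5 * w * np.cos(w * t)
    qdd_des = -0.5 * w**2 * np.sin(w * t)
    e, ed = q_des - q, qd_des - qd
    s = ed + lam * e                                  # sliding variable
    tau = m_eff * (qdd_des + lam * ed) + K * np.tanh(s / phi)  # equivalent + switching term
    disturbance = 0.02 * np.sign(qd)                  # unmodelled friction on the slave
    qdd = (tau - disturbance) / m_eff
    qd += qdd * dt
    q += qd * dt
    errs.append(abs(e))
print(f"mean |tracking error| over the last second: {np.mean(errs[-1000:]):.4f} rad")
```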
Haptics-based dynamic implicit solid modeling.
Hua, Jing; Qin, Hong
2004-01-01
This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.
A Review of Simulators with Haptic Devices for Medical Training.
Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich
2016-04-01
Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative and they have been considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state of the art review of recent medical simulators that use haptic devices. In particular we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of used technology, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.
Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.
Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk
2013-08-01
Virtual reality simulation is a promising alternative for training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality simulation has three main components: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.
Bornhoft, J M; Strabala, K W; Wortman, T D; Lehman, A C; Oleynikov, D; Farritor, S M
2011-01-01
The objective of this research is to study the effectiveness of using a stereoscopic visualization system for performing remote surgery. The use of stereoscopic vision has become common with the advent of the da Vinci® system (Intuitive, Sunnyvale, CA). This system creates a virtual environment consisting of a 3-D display for visual feedback and haptic tactile feedback, together providing an intuitive environment for remote surgical applications. This study uses simple in vivo robotic surgical devices and compares the performance of surgeons using the stereoscopic interfacing system to that of surgeons using standard two-dimensional monitors. The stereoscopic viewing system consists of two cameras, two monitors, and four mirrors. The cameras are mounted on a multi-functional miniature in vivo robot and mimic the depth perception of the human eyes; this is done by placing the cameras at a calculated angle and distance apart. Live video streams from the left and right cameras are displayed on the left and right monitors, respectively. A system of angled mirrors allows the left and right eyes to see the video stream from the left and right monitor, respectively, creating the illusion of depth. The haptic interface consists of two PHANTOM Omni® (SensAble, Woburn, MA) controllers. These controllers measure the position and orientation of a pen-like end effector with three degrees of freedom. As surgeons use this interface, they see a 3-D image and feel force feedback for collisions and workspace limits. The stereoscopic viewing system has been used in several surgical training tests and shows a potential improvement in depth perception and 3-D vision. The haptic system accurately provides force feedback that aids in surgery. Both have been used in non-survival animal surgeries and have successfully been used in suturing and gallbladder removal. Bench-top experiments using the interfacing system have also been conducted. A group of participants completed two different surgical training tasks using both a two-dimensional visual system and the stereoscopic visual system. Results suggest that the stereoscopic visual system decreased the amount of time taken to complete the tasks. All participants also reported that the stereoscopic system was easier to use than the two-dimensional system. Haptic controllers combined with stereoscopic vision provide a more intuitive virtual environment. This system provides the surgeon with 3-D vision, depth perception, and the ability to receive feedback through forces applied in the haptic controller while performing surgery. These capabilities potentially enable the performance of more complex surgeries with a higher level of precision.
NASA Astrophysics Data System (ADS)
Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok
2014-01-01
This paper presents the control performance of a new type of four-degrees-of-freedom (4-DOF) haptic master that can be used for robot-assisted minimally invasive surgery (RMIS). By adopting a controllable electrorheological (ER) fluid, the proposed master provides haptic feedback as well as remote manipulation. In order to verify the efficacy of the proposed master and method, an experiment is conducted with deformable objects representing human organs. Since real human organs are difficult to use in control experiments because of high cost and ethical concerns, an excellent alternative, a virtual reality environment, is used for control in this work. In order to embody a human organ in the virtual space, the experiment adopts a volumetric deformable object represented by a shape-retaining chain linked (S-chain) model, which has salient properties such as fast and realistic deformation of elastic objects. In the haptic architecture for RMIS, the desired torque/force originating from the object in the virtual slave and the desired position originating from the operator of the haptic master are transferred to each other. In order to achieve the desired torque/force trajectories, a sliding mode controller (SMC), which is known to be robust to uncertainties, is designed and empirically implemented. Tracking control performance for various torque/force trajectories from the virtual slave is evaluated and presented in the time domain.
Control of a Robot Dancer for Enhancing Haptic Human-Robot Interaction in Waltz.
Hongbo Wang; Kosuge, K
2012-01-01
Haptic interaction between a human leader and a robot follower in waltz is studied in this paper. An inverted pendulum model is used to approximate the human's body dynamics. With feedback from the force sensor and laser range finders, the robot is able to estimate the human leader's state using an extended Kalman filter (EKF). To reduce the interaction force, two robot controllers, namely an admittance-with-virtual-force controller and an inverted pendulum controller, are proposed and evaluated in experiments. The former controller failed the experiment, and the reasons for the failure are explained. The use of the latter controller is validated by the experimental results.
Force, Torque and Stiffness: Interactions in Perceptual Discrimination
Wu, Bing; Klatzky, Roberta L.; Hollis, Ralph L.
2011-01-01
Three experiments investigated whether force and torque cues interact in haptic discrimination of force, torque and stiffness, and if so, how. The statistical relation between force and torque was manipulated across four experimental conditions: Either one type of cue varied while the other was constant, or both varied so as to be positively correlated, negatively correlated, or uncorrelated. Experiment 1 showed that the subjects’ ability to discriminate force was improved by positively correlated torque but impaired with uncorrelated torque, as compared to the constant torque condition. Corresponding effects were found in Experiment 2 for the influence of force on torque discrimination. These findings indicate that force and torque are integrated in perception, rather than being processed as separate dimensions. A further experiment demonstrated facilitation of stiffness discrimination by correlated force and torque, whether the correlation was positive or negative. The findings suggest new means of augmenting haptic feedback to facilitate perception of the properties of soft objects. PMID:21359137
Augmented reality and haptic interfaces for robot-assisted surgery.
Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N
2012-03-01
Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation. Copyright © 2011 John Wiley & Sons, Ltd.
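The forbidden-region virtual fixtures mentioned above can be illustrated with a simple planar fixture that renders a spring-damper force when the instrument tip crosses the boundary; the plane, stiffness, and damping values below are assumptions, not the system's actual parameters.

```python
# Sketch of a forbidden-region virtual fixture: penetration past a planar
# boundary produces a restoring spring-damper force on the instrument tip.
import numpy as np

K_VF, B_VF = 800.0, 5.0                      # fixture stiffness [N/m], damping [N·s/m]
n = np.array([0.0, 0.0, 1.0])                # plane normal (forbidden side: z < 0)
p0 = np.zeros(3)                             # a point on the boundary plane

def fixture_force(tip_pos, tip_vel):
    penetration = np.dot(tip_pos - p0, n)    # signed distance to the plane
    if penetration >= 0.0:
        return np.zeros(3)                   # outside the forbidden region: no force
    # Spring pushes the tip back out; damping acts on the normal velocity component.
    return (-K_VF * penetration) * n - B_VF * np.dot(tip_vel, n) * n

print(fixture_force(np.array([0.0, 0.0, -0.002]), np.array([0.0, 0.0, -0.01])))
```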
Electrorheological Fluid Based Force Feedback Device
NASA Technical Reports Server (NTRS)
Pfeiffer, Charles; Bar-Cohen, Yoseph; Mavroidis, Constantinos; Dolgin, Benjamin
1999-01-01
Parallel to the efforts to develop fully autonomous robots, it is increasingly being realized that there are applications where it is essential to have a fully controlled robot and to "feel" its operating conditions, i.e. telepresence. This trend is a result of the increasing efforts to address tasks where humans can perform significantly better but, due to associated hazards, distance, physical limitations, and other causes, only robots can be employed. Such robots need to be assisted by a human who remotely controls the operation. To address the goal of operating robots as human surrogates, the authors launched a study of mechanisms that provide mechanical feedback. For this purpose, electrorheological fluids (ERF) are being investigated for potential application in miniature haptic devices. This family of electroactive fluids has the property of changing viscosity under electrical stimulation. Consequently, ERF can be used to produce force-feedback haptic devices for tele-operated control of medical and space robotic systems. Forces applied at the robot end-effector due to a compliant environment are reflected to the user using an ERF device in which a change in the system viscosity occurs proportionally to the transmitted force. An analytical model and control algorithms are being developed that take into account the nonlinearities of this type of device. This paper describes the concept and the developed mechanism of ERF-based force feedback. The test process and the physical properties of this device are described, and the results of preliminary tests are presented.
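A minimal sketch of the ERF force-reflection idea follows, assuming a linear mapping from applied electric field to effective viscous damping; the constants, units, and control law are illustrative assumptions, not measured properties of the device described above.

```python
# Sketch: choose the ERF excitation so the resistive force felt at the stylus
# matches the force sensed at the remote end-effector.
C0 = 0.5         # zero-field damping coefficient [N·s/m], assumed
C_E = 10.0       # damping increase per unit field [N·s/m per kV/mm], assumed

def erf_resistive_force(stylus_velocity, applied_field):
    # applied_field in kV/mm; effective damping rises with the field.
    c_eff = C0 + C_E * applied_field
    return -c_eff * stylus_velocity

def field_for_reflected_force(measured_tip_force, stylus_velocity, eps=1e-3):
    # Pick the field so the resistive force magnitude matches the sensed force.
    c_req = abs(measured_tip_force) / max(abs(stylus_velocity), eps)
    return max((c_req - C0) / C_E, 0.0)

E = field_for_reflected_force(measured_tip_force=2.0, stylus_velocity=0.1)
print(f"commanded field: {E:.2f} kV/mm, felt force: {erf_resistive_force(0.1, E):.2f} N")
```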
Haptic augmentation of science instruction: Does touch matter?
NASA Astrophysics Data System (ADS)
Jones, M. Gail; Minogue, James; Tretter, Thomas R.; Negishi, Atsuko; Taylor, Russell
2006-01-01
This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback) combined with computer visualizations influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated haptic desktop device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. The levels of engagement in the instruction and students' attitudes about the instructional program were assessed using a combination of constructed response and Likert scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic-gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also influence the way in which the students construct their understandings about abstract science concepts.
Villard, P F; Vidal, F P; Hunt, C; Bello, F; John, N W; Johnson, S; Gould, D A
2009-11-01
We present here a simulator for interventional radiology focusing on percutaneous transhepatic cholangiography (PTC). This procedure consists of inserting a needle into the biliary tree using fluoroscopy for guidance. The requirements of the simulator have been driven by a task analysis. Three main components have been identified: the respiration, the real-time X-ray display (fluoroscopy), and the haptic rendering (sense of touch). The framework for modelling the respiratory motion is based on kinematics laws and on the Chainmail algorithm. The fluoroscopic simulation is performed on the graphics card and makes use of the Beer-Lambert law to compute the X-ray attenuation. Finally, the haptic rendering is integrated into the virtual environment and takes into account the soft-tissue reaction force feedback and maintenance of the initial direction of the needle during the insertion. Five training scenarios have been created using patient-specific data. Each of these provides the user with variable breathing behaviour, a fluoroscopic display tuneable to any device parameters, and needle force feedback. A detailed task analysis has been used to design and build the PTC simulator described in this paper. The simulator includes real-time respiratory motion with two independent parameters (rib kinematics and diaphragm action), on-line fluoroscopy implemented on the Graphics Processing Unit, and haptic feedback to feel the soft-tissue behaviour of the organs during the needle insertion.
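The Beer-Lambert attenuation used for the fluoroscopic rendering can be sketched per ray as I = I0 * exp(-sum(mu_i * d_i)); the material coefficients and path lengths below are rough illustrative values, not the simulator's calibrated data.

```python
# Beer-Lambert attenuation along one simulated X-ray: the transmitted intensity
# falls exponentially with the sum of attenuation-coefficient * path-length terms.
import numpy as np

I0 = 1.0                                        # incident beam intensity (normalized)
# (mu [1/cm], path length [cm]) for the materials a ray crosses, e.g. soft tissue,
# liver, and bone; values are rough, textbook-order assumptions.
segments = [(0.20, 8.0), (0.22, 4.0), (0.50, 1.5)]

attenuation = sum(mu * d for mu, d in segments)
I = I0 * np.exp(-attenuation)
print(f"transmitted fraction: {I:.4f}")         # drives the simulated fluoroscopic pixel
```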
Haptic-STM: a human-in-the-loop interface to a scanning tunneling microscope.
Perdigão, Luís M A; Saywell, Alex
2011-07-01
The operation of a haptic device interfaced with a scanning tunneling microscope (STM) is presented here. The user moves the STM tip in three dimensions by means of a stylus attached to the haptic instrument. The tunneling current measured by the STM is converted to a vertical force, applied to the stylus and felt by the user, with the user being incorporated into the feedback loop that controls the tip-surface distance. A haptic-STM interface of this nature allows the user to feel atomic features on the surface and facilitates the tactile manipulation of the adsorbate/substrate system. The operation of this device is demonstrated via the room temperature STM imaging of C(60) molecules adsorbed on an Au(111) surface in ultra-high vacuum.
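The current-to-force conversion described above can be sketched with the usual exponential dependence of tunneling current on tip-sample gap and an assumed logarithmic mapping from current to vertical haptic force; the constants and the mapping choice are illustrative, not the instrument's implementation.

```python
# Sketch: map a measured tunneling current to a vertical force on the haptic stylus.
import numpy as np

KAPPA = 1.0e10          # decay constant [1/m], typical order for vacuum tunneling
I_SET = 1.0e-9          # current setpoint [A]
F_GAIN = 2.0            # haptic gain [N per decade of current], assumed

def tunneling_current(gap):
    # Exponential distance dependence, referenced to a 0.5 nm gap at the setpoint.
    return I_SET * np.exp(-2.0 * KAPPA * (gap - 5.0e-10))

def haptic_force(current):
    # Push the stylus up more strongly as the current (i.e. proximity) increases.
    return F_GAIN * np.log10(current / I_SET)

for gap in (4.0e-10, 5.0e-10, 6.0e-10):
    i = tunneling_current(gap)
    print(f"gap {gap*1e9:.1f} nm -> I = {i:.2e} A, force = {haptic_force(i):+.2f} N")
```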
NASA Astrophysics Data System (ADS)
Do, T. N.; Tjahjowidodo, T.; Lau, M. W. S.; Phee, S. J.
2015-08-01
Natural Orifice Transluminal Endoscopic Surgery (NOTES) is a method that allows surgical operations via natural orifices such as the mouth, anus, and vagina, without leaving visible scars. The flexible tendon-sheath mechanism (TSM) is commonly used in these systems because of its lightweight structure, flexibility, and ease of power transmission. However, nonlinear friction and backlash hysteresis pose many challenges to the control of such systems; in addition, these systems do not provide haptic feedback to assist the surgeon. In this paper, we propose a new dynamic friction model and a backlash hysteresis model for a pair of TSMs to deal with these problems. The proposed friction model, unlike current approaches in the literature, is smooth and able to capture the force near zero velocity when the system is stationary or operates with small motions. This model can be used to estimate the friction force for haptic feedback purposes. To improve the system tracking performance, a backlash hysteresis model is introduced, which can be used in a feedforward controller scheme. The controller involves a simple computation of the inverse hysteresis model. The proposed models are configuration independent and able to capture the nonlinearities for arbitrary tendon-sheath shapes. A representative experimental setup is used to validate the proposed models and to demonstrate the improvement in position tracking accuracy and the possibility of providing desired force information at the distal end of a pair of TSM slave manipulators for haptic feedback to the surgeons.
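The abstract does not give the friction model's form; as a generic stand-in, the sketch below shows a smooth Coulomb-plus-viscous friction law that remains well defined near zero velocity, which is the property emphasized above. It is not the authors' tendon-sheath model, and all parameters are assumptions.

```python
# Smooth friction sketch: Coulomb term smoothed with tanh plus a viscous term,
# so the force stays continuous and bounded as the velocity passes through zero.
import numpy as np

F_C = 1.5        # Coulomb friction level [N]
SIGMA = 0.8      # viscous coefficient [N·s/m]
V0 = 1e-3        # velocity scale controlling smoothness near v = 0 [m/s]

def friction_force(v):
    return F_C * np.tanh(v / V0) + SIGMA * v

for v in (-0.01, -1e-4, 0.0, 1e-4, 0.01):
    print(f"v = {v:+.4f} m/s -> friction = {friction_force(v):+.3f} N")
```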
Friction Compensation for Enhancing Transparency of a Teleoperator with Compliant Transmission
Mahvash, Mohsen; Okamura, Allison
2009-01-01
This article presents a model-based compensator for canceling friction in the tendon-driven joints of a haptic-feedback teleoperator. Unlike position-tracking systems, a teleoperator involves an unknown environment force that prevents the use of tracking position error as a feedback to the compensator. Thus, we use a model-based feedforward friction compensator to cancel the friction forces. We provide conditions for selecting compensator parameters to ensure passivity of the teleoperator and demonstrate performance experimentally. PMID:20514151
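Model-based feedforward friction compensation of this sort typically adds an estimate of Coulomb plus viscous friction to the commanded joint torque, so no environment-force measurement is needed. A minimal sketch under that assumption; the parameter values and the smooth tanh() approximation of sign() are illustrative, not the article's identified model:

    # Hedged sketch of model-based feedforward friction compensation for a
    # tendon-driven joint: the commanded torque is augmented with an estimate of
    # Coulomb plus viscous friction, based only on the joint velocity.
    import math

    def friction_feedforward(velocity, f_coulomb=0.15, b_viscous=0.02, v_eps=1e-3):
        """Estimate friction torque to add to the motor command (N*m)."""
        # tanh() avoids the chattering of sign() around zero velocity.
        return f_coulomb * math.tanh(velocity / v_eps) + b_viscous * velocity

    def motor_command(tau_desired, velocity):
        return tau_desired + friction_feedforward(velocity)

    print(motor_command(0.5, velocity=0.8))   # adds the Coulomb + viscous estimate
    print(motor_command(0.5, velocity=-0.8))  # friction estimate flips sign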
An implementation of sensor-based force feedback in a compact laparoscopic surgery robot.
Lee, Duk-Hee; Choi, Jaesoon; Park, Jun-Woo; Bach, Du-Jin; Song, Seung-Jun; Kim, Yoon-Ho; Jo, Yungho; Sun, Kyung
2009-01-01
Despite the rapid progress in the clinical application of laparoscopic surgery robots, many shortcomings have not yet been fully overcome, one of which is the lack of reliable haptic feedback. This study implemented a force-feedback structure in our compact laparoscopic surgery robot. The surgery robot is a master-slave configuration robot with 5 DOF (degrees of freedom) corresponding to laparoscopic surgical motion. The force-feedback implementation was made with torque sensors and controllers installed in the pitch joint of the master and slave robots. A simple dynamic model of the action-reaction force in the slave robot was used, through which the reflective force was estimated and fed back to the master robot. The results showed that the system model could be identified with significant fidelity and that force feedback at the master robot was feasible. However, qualitative human assessment of the fed-back force showed only a limited level of object discrimination ability. Further developments are underway with this result as a framework.
Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto
2013-01-01
In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680
Fisher, J Brian; Porter, Susan M
2002-01-01
This paper describes an application of a display approach which uses chromakey techniques to composite real and computer-generated images allowing a user to see his hands and medical instruments collocated with the display of virtual objects during a medical training simulation. Haptic feedback is provided through the use of a PHANTOM force feedback device in addition to tactile augmentation, which allows the user to touch virtual objects by introducing corresponding real objects in the workspace. A simplified catheter introducer insertion simulation was developed to demonstrate the capabilities of this approach.
Review of Designs for Haptic Data Visualization.
Paneels, Sabrina; Roberts, Jonathan C
2010-01-01
There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the amount of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.
A simulator for surgery training: optimal sensory stimuli in a bone pinning simulation
NASA Astrophysics Data System (ADS)
Daenzer, Stefan; Fritzsche, Klaus
2008-03-01
Currently available low-cost haptic devices allow inexpensive surgical training with no risk to patients. Major drawbacks of lower-cost devices include a limited maximum feedback force and the inability to display torques (moments). The aim of this work was the design and implementation of a surgical simulator that allows the evaluation of multi-sensory stimuli in order to overcome these drawbacks. The simulator was built following a modular architecture to allow flexible combinations and thorough evaluation of different multi-sensory feedback modules. A Kirschner-wire (K-wire) tibial fracture fixation procedure was defined and implemented as a first test scenario. A set of computational metrics has been derived from the clinical requirements of the task to objectively assess the trainee's performance during simulation. Sensory feedback modules for haptic and visual feedback have been developed, each in a basic and an enhanced form. First tests have shown that specific visual concepts can overcome some of the drawbacks associated with low-cost haptic devices. The simulator, the metrics and the surgery scenario together represent an important step towards a better understanding of the perception of multi-sensory feedback in complex surgical training tasks. Field studies built on this architecture can open the way to risk-free and inexpensive surgical simulations that can keep up with traditional surgical training.
A haptic device for guide wire in interventional radiology procedures.
Moix, Thomas; Ilic, Dejan; Bleuler, Hannes; Zoethout, Jurjen
2006-01-01
Interventional Radiology (IR) is a minimally invasive procedure where thin tubular instruments, guide wires and catheters, are steered through the patient's vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be trained to master hand-eye coordination, instrument manipulation and procedure protocols. The existing simulation systems all have major drawbacks: the use of modified instruments, unrealistic insertion lengths, high inertia of the haptic device that creates a noticeably degraded dynamic behavior or excessive friction that is not properly compensated for. In this paper we propose a quality training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the patient's anatomy linked to a robotic interface providing haptic force feedback. This paper focuses on the requirements, design and prototyping of a specific haptic interface for guide wires.
Force modeling for incisions into various tissues with MRF haptic master
NASA Astrophysics Data System (ADS)
Kim, Pyunghwa; Kim, Soomin; Park, Young-Dai; Choi, Seung-Bok
2016-03-01
This study proposes a new model to predict the reaction force that occurs in incisions during robot-assisted minimally invasive surgery. The reaction force is fed back to the manipulator by a magneto-rheological fluid (MRF) haptic master, which features a bi-directional clutch actuator. The reaction force feedback provides sensations similar to open surgery (laparotomy) that cannot be provided by a conventional surgical master. This advantage shortens the training period for robot-assisted minimally invasive surgery and can improve the accuracy of operations. The reaction force model of incisions can also be utilized in a surgical simulator that provides a virtual reaction force. In this work, in order to model the reaction force during incisions, the energy aspect of the incision process is adopted and analyzed. Each mode of the incision process is classified by the tendency of the energy change and modeled for realistic real-time application. The reaction force model uses actual reaction force information from three types of real tissue: hard, medium, and soft. This modeled force is realized by the MRF haptic master through an algorithm based on the position and velocity of the scalpel, using two different control methods: an open-loop algorithm and a closed-loop algorithm. The reaction forces obtained from the proposed model are compared with a desired force in the time domain.
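The mode-switched idea can be illustrated with a toy piecewise model: an elastic pre-puncture phase driven by scalpel depth, followed by a cutting/friction phase driven by velocity. The stiffness, rupture depth and friction constants for the three "tissue types" below are placeholders, not the values identified in the study:

    # Hedged sketch of a piecewise incision reaction-force model driven by scalpel
    # position and velocity: an elastic deformation mode before rupture, then a
    # cutting/friction mode. All constants are illustrative placeholders.
    TISSUES = {
        "soft":   dict(k=0.3, rupture_mm=4.0, f_cut=0.4, b=0.05),
        "medium": dict(k=0.8, rupture_mm=3.0, f_cut=0.9, b=0.08),
        "hard":   dict(k=1.5, rupture_mm=2.0, f_cut=1.6, b=0.12),
    }

    def incision_force(depth_mm, velocity_mm_s, tissue="medium"):
        p = TISSUES[tissue]
        if depth_mm <= 0.0:
            return 0.0
        if depth_mm < p["rupture_mm"]:
            return p["k"] * depth_mm                      # elastic deformation mode
        return p["f_cut"] + p["b"] * abs(velocity_mm_s)   # cutting/friction mode

    for tissue in TISSUES:
        print(tissue, incision_force(5.0, 20.0, tissue))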
NASA Technical Reports Server (NTRS)
Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)
2001-01-01
We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models of high-resolution 3D data (>10,000 nodes), haptic real time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused in the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real time computations, we propose parallel processing of a Jacobi preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can effectively be achieved using reconfigurable computing systems such as field programmable gate arrays (FPGA), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
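The solver proposed for parallelization is a Jacobi-preconditioned conjugate gradient applied to the reduced stiffness system K u = f. A serial sketch of that iteration, with a small random symmetric positive definite matrix standing in for a real finite-element system:

    # Compact Jacobi-preconditioned conjugate gradient solver for K u = f.
    # The random SPD test matrix is only a stand-in for a real FE stiffness matrix.
    import numpy as np

    def jacobi_pcg(K, f, tol=1e-8, max_iter=500):
        M_inv = 1.0 / np.diag(K)          # Jacobi preconditioner: inverse diagonal
        u = np.zeros_like(f)
        r = f - K @ u
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Kp = K @ p
            alpha = rz / (p @ Kp)
            u += alpha * p
            r -= alpha * Kp
            if np.linalg.norm(r) < tol:
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return u

    A = np.random.rand(50, 50)
    K = A @ A.T + 50 * np.eye(50)         # symmetric positive definite test matrix
    f = np.random.rand(50)
    u = jacobi_pcg(K, f)
    print(np.linalg.norm(K @ u - f))      # residual should be small (~1e-8)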
Sensing interactions in the microworld with optical tweezers
NASA Astrophysics Data System (ADS)
Pacoret, Cécile; Bowman, Richard; Gibson, Graham; Sinan, Haliyo D.; Bergander, Arvid; Carberry, David; Régnier, Stéphane; Padgett, Miles
2009-08-01
Optical tweezers have become a widespread tool in cell biology, microengineering and other fields requiring delicate micromanipulation. But for these sensitive tasks, it remains difficult to handle objects without damaging them. As the precision of position and force measurement increases, the richness of the information cannot be fully exploited with simple interfaces such as a mouse or a common joystick. For this reason, we propose a haptic force-feedback command interface for optical tweezers, controlled with one hand. The system combines accurate force measurement using a fast camera with the coupling of these measured forces to a human operator. The overall transparency even allows the user to feel Brownian motion.
Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.
Li, Min; Sareh, Sina; Xu, Guanghua; Ridzuan, Maisarah Binti; Luo, Shan; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar
2016-01-01
This paper proposes a pseudo-haptic feedback method conveying simulated soft surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft surface deformation and control of the indenter avatar speed, to convey stiffness information of a simulated surface of a soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended creating a multi-point pseudo-haptic feedback system. A comparison with regards to (a) nodule detection sensitivity and (b) elapsed time as performance indicators in hard nodule detection experiments to a tablet computer incorporating vibration feedback was conducted. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction. It is noted that multi-point pseudo-haptic feedback performs similarly well when compared to a vibration-based feedback method based on both performance measures elapsed time and nodule detection sensitivity. This proves that the proposed method can be used to convey detailed haptic information for virtual environmental tasks, even subtle ones, using either a computer mouse or a pressure sensitive device as an input device. This pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as, for example, occurring in ever more realistic video games with increasing emphasis on interaction with the physical environment and minimally invasive surgery in the form of soft tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, either using desktop computers or portable devices, showing reasonably high fidelity in conveying stiffness perception to the user.
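A minimal sketch of the avatar speed-control technique described above: the control/display ratio drops over stiffer regions, which the user reads as resistance. The gain law and the toy stiffness map with a hard nodule are assumptions for illustration:

    # Minimal sketch of pseudo-haptic speed control: the stiffer the simulated
    # region under the cursor, the slower the indenter avatar moves per unit of
    # input displacement, which is perceived as resistance.
    def avatar_gain(stiffness, k_ref=1.0):
        """Control/display ratio: 1.0 in free space, smaller over stiff regions."""
        return k_ref / (k_ref + stiffness)

    def stiffness_at(x_mm):
        """Toy stiffness map: a soft body with a hard nodule around x = 10 mm."""
        return 5.0 if 9.0 <= x_mm <= 11.0 else 0.5

    avatar_x, input_step = 0.0, 0.5
    for _ in range(40):                       # drag the cursor across the body
        avatar_x += input_step * avatar_gain(stiffness_at(avatar_x))
    print(f"avatar travelled {avatar_x:.1f} mm for {40 * input_step:.1f} mm of input")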
Rauter, Georg; Sigrist, Roland; Riener, Robert; Wolf, Peter
2015-01-01
In the literature, the effectiveness of haptics for motor learning is controversially discussed. Haptics is believed to be effective for motor learning in general; however, different types of haptic control enhance different movement aspects. Thus, depending on the movement aspects of interest, one type of haptic control may be effective whereas another is not. Therefore, in the current work, it was investigated if and how different types of haptic controllers affect learning of spatial and temporal movement aspects. In particular, haptic controllers that enforce active participation of the participants were expected to improve spatial aspects. Only haptic controllers that provide feedback about the task's velocity profile were expected to improve temporal aspects. In a study on learning a complex trunk-arm rowing task, the effect of training with four different types of haptic control was investigated: position control, path control, adaptive path control, and reactive path control. A fifth group (control) trained with concurrent visual augmented feedback. As hypothesized, the position controller was most effective for learning temporal movement aspects, while the path controller was most effective in teaching spatial movement aspects of the rowing task. Visual feedback was also effective for learning temporal and spatial movement aspects.
Visual-perceptual mismatch in robotic surgery.
Abiri, Ahmad; Tao, Anna; LaRocca, Meg; Guan, Xingmin; Askari, Syed J; Bisley, James W; Dutson, Erik P; Grundfest, Warren S
2017-08-01
The principal objective of the experiment was to analyze the effects of the clutch operation of robotic surgical systems on the performance of the operator. The relative coordinate system introduced by the clutch operation can create a visual-perceptual mismatch, which can potentially have a negative impact on a surgeon's performance. We also assess whether introducing additional tactile sensory information reduces the effect of this visual-perceptual mismatch on operator performance. We asked 45 novice subjects to complete peg transfers using the da Vinci IS 1200 system with grasper-mounted, normal force sensors. The task involves picking up a peg with one of the robotic arms, passing it to the other arm, and then placing it on the opposite side of the view. Subjects were divided into three groups: the aligned group (no mismatch), the misaligned group (10 cm z-axis mismatch), and the haptics-misaligned group (haptic feedback and z-axis mismatch). Each subject performed the task five times, during which the grip force, time to completion, and number of faults were recorded. Compared to the subjects who performed the tasks using a properly aligned controller/arm configuration, subjects with a single-axis misalignment showed significantly more peg drops (p = 0.011) and a longer time to completion (p < 0.001). Additionally, it was observed that the addition of tactile feedback helps reduce the negative effects of visual-perceptual mismatch in some cases. Grip force data recorded from the grasper-mounted sensors showed no difference between the groups. The visual-perceptual mismatch created by the misalignment of the robotic controls relative to the robotic arms has a negative impact on the operator of a robotic surgical system. Introduction of other sensory information and haptic feedback systems can potentially help to reduce this effect.
Afzal, Muhammad Raheel; Byun, Ha-Young; Oh, Min-Kyun; Yoon, Jungwon
2015-03-13
Haptic control is a useful therapeutic option in rehabilitation featuring virtual reality interaction. As with visual and vibrotactile biofeedback, kinesthetic haptic feedback may assist in postural control and can help achieve balance control. Kinesthetic haptic feedback of body sway can be delivered via a commercially available haptic device and can enhance the balance stability of both young healthy subjects and stroke patients. Our system features a waist-attached smartphone, software running on a computer (PC), and a dedicated Phantom Omni® device. Young healthy participants performed balance tasks, assuming each of four distinct postures for 30 s (one foot on the ground; the Tandem Romberg stance; one foot on foam; and the Tandem Romberg stance on foam) with eyes closed. Patients kept their eyes open and were tested only in the Romberg stance, during a balance task 25 s in duration. An Android application running continuously on the smartphone sent mediolateral (ML) and anteroposterior (AP) tilt angles to the PC, which generated kinesthetic haptic feedback via the Phantom Omni®. A total of 16 subjects participated in the study, 8 of whom were young and healthy and 8 of whom had suffered a stroke. Post-experiment data analysis was performed using MATLAB®. Mean Velocity Displacement (MVD), Planar Deviation (PD), Mediolateral Trajectory (MLT) and Anteroposterior Trajectory (APT) parameters were analyzed to measure the reduction in body sway. Our kinesthetic haptic feedback system was effective in reducing postural sway in young healthy subjects, regardless of posture and the condition of the substrate (the ground), and in improving MVD and PD in stroke patients in the Romberg stance. Analysis of Variance (ANOVA) revealed that kinesthetic haptic feedback significantly reduced body sway in both categories of subjects. Kinesthetic haptic feedback can be implemented using a commercial haptic device and a smartphone. Intuitive balance cues were created using the handle of a haptic device, rendering the approach very simple yet efficient in practice. This novel form of biofeedback will be a useful rehabilitation tool for improving the balance of stroke patients.
An augmented reality haptic training simulator for spinal needle procedures.
Sutherland, Colin; Hashtrudi-Zaad, Keyvan; Sellens, Rick; Abolmaesumi, Purang; Mousavi, Parvin
2013-11-01
This paper presents the prototype for an augmented reality haptic simulation system with potential for spinal needle insertion training. The proposed system is composed of a torso mannequin, a MicronTracker2 optical tracking system, a PHANToM haptic device, and a graphical user interface to provide visual feedback. The system allows users to perform simulated needle insertions on a physical mannequin overlaid with an augmented reality cutaway of patient anatomy. A tissue model based on a finite-element model provides force during the insertion. The system allows for training without the need for the presence of a trained clinician or access to live patients or cadavers. A pilot user study demonstrates the potential and functionality of the system.
A review of haptic simulator for oral and maxillofacial surgery based on virtual reality.
Chen, Xiaojun; Hu, Junlei
2018-06-01
Traditional medical training in oral and maxillofacial surgery (OMFS) may be limited by its low efficiency and high price due to the shortage of cadaver resources. With the combination of visual rendering and feedback force, surgery simulators are becoming increasingly popular in hospitals and medical schools as an alternative to traditional training. Areas covered: The major goal of this review is to provide a comprehensive reference source on current and future developments of haptic OMFS simulators based on virtual reality (VR) for relevant researchers. Expert commentary: Visual rendering, haptic rendering, tissue deformation, and evaluation are key components of a VR-based haptic surgery simulator. Compared with traditional medical training, the fusion of visual and tactile cues in the virtual environment enables a considerably more vivid sensation, and operators have more opportunities to practice surgical skills and receive objective evaluation as reference.
A survey of telerobotic surface finishing
NASA Astrophysics Data System (ADS)
Höglund, Thomas; Alander, Jarmo; Mantere, Timo
2018-05-01
This is a survey of research published on the subjects of telerobotics, haptic feedback, and mixed reality applied to surface finishing. The survey especially focuses on how visuo-haptic feedback can be used to improve a grinding process using a remote manipulator or robot. The benefits of teleoperation and reasons for using haptic feedback are presented. The use of genetic algorithms for optimizing haptic sensing is briefly discussed. Ways of augmenting the operator's vision are described. Visual feedback can be used to find defects and analyze the quality of the surface resulting from the surface finishing process. Visual cues can also be used to aid a human operator in manipulating a robot precisely and avoiding collisions.
The effect of haptic guidance and visual feedback on learning a complex tennis task.
Marchal-Crespo, Laura; van Raai, Mark; Rauter, Georg; Wolf, Peter; Riener, Robert
2013-11-01
While haptic guidance can improve ongoing performance of a motor task, several studies have found that it ultimately impairs motor learning. However, some recent studies suggest that the haptic demonstration of optimal timing, rather than movement magnitude, enhances learning in subjects trained with haptic guidance. Timing of an action plays a crucial role in the proper accomplishment of many motor skills, such as hitting a moving object (discrete timing task) or learning a velocity profile (time-critical tracking task). The aim of the present study is to evaluate which feedback conditions-visual or haptic guidance-optimize learning of the discrete and continuous elements of a timing task. The experiment consisted in performing a fast tennis forehand stroke in a virtual environment. A tendon-based parallel robot connected to the end of a racket was used to apply haptic guidance during training. In two different experiments, we evaluated which feedback condition was more adequate for learning: (1) a time-dependent discrete task-learning to start a tennis stroke and (2) a tracking task-learning to follow a velocity profile. The effect that the task difficulty and subject's initial skill level have on the selection of the optimal training condition was further evaluated. Results showed that the training condition that maximizes learning of the discrete time-dependent motor task depends on the subjects' initial skill level. Haptic guidance was especially suitable for less-skilled subjects and in especially difficult discrete tasks, while visual feedback seems to benefit more skilled subjects. Additionally, haptic guidance seemed to promote learning in a time-critical tracking task, while visual feedback tended to deteriorate the performance independently of the task difficulty and subjects' initial skill level. Haptic guidance outperformed visual feedback, although additional studies are needed to further analyze the effect of other types of feedback visualization on motor learning of time-critical tasks.
Zenner, André; Krüger, Antonio
2017-04-01
We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing the weight-shifting physical DPHF proxy object Shifty. This concept combines actuators known from active haptics and physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. We then investigate how Shifty can, by automatically changing its internal weight distribution, enhance the user's perception of virtual objects interacted with in two experiments. In a first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness. Here, Shifty was shown to increase the user's fun and perceived realism significantly, compared to an equivalent passive haptic proxy. In a second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight and thus the perceived realism by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for visual-haptic mismatch perceived during the shifting process.
Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device
NASA Astrophysics Data System (ADS)
Färber, Matthias; Heller, Julika; Handels, Heinz
2007-03-01
The lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medicaments or to extract liquor. Training of this procedure is usually done on the patient, guided by experienced supervisors. A virtual reality lumbar puncture simulator has been developed in order to minimize the training costs and the patient's risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures such as skin, bone, muscles or fat, and of the original CT data, which contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region. 2D visualizations of orthogonal slices enable a detailed impression of the anatomical context. The input data, consisting of CT and label data and surface models of relevant structures, is defined in an XML file together with haptic rendering and visualization parameters. In a first evaluation the Visible Human male dataset has been used to generate a virtual training body. Several users with different medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement together with the rotation constraints enabled by the 6DOF device facilitate a realistic puncture simulation.
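A toy version of such label-plus-intensity force computation might combine a per-tissue resistance coefficient for segmented structures with a term taken from the raw CT intensity for unsegmented ones. The coefficients, blending rule and damping term below are assumptions, not the simulator's haptic volume rendering model:

    # Hedged sketch of a label-plus-intensity force lookup: the axial resistance at
    # the needle tip combines a per-tissue coefficient for segmented structures with
    # a contribution from raw CT intensity for unsegmented structures.
    import numpy as np

    LABEL_RESISTANCE = {0: 0.0, 1: 0.2, 2: 0.5, 3: 0.8, 4: 3.0}  # air, skin, fat, muscle, bone

    def needle_resistance(label_volume, ct_volume, tip_index, velocity_mm_s,
                          ct_weight=0.001, damping=0.02):
        ix, iy, iz = tip_index
        base = LABEL_RESISTANCE.get(int(label_volume[ix, iy, iz]), 0.0)
        unsegmented = ct_weight * max(float(ct_volume[ix, iy, iz]), 0.0)
        return -(base + unsegmented + damping * abs(velocity_mm_s))  # opposes insertion

    labels = np.zeros((32, 32, 32), dtype=int)
    labels[10:20, :, :] = 3                                  # a slab of "muscle"
    ct = np.random.randint(-100, 400, size=(32, 32, 32))     # toy CT intensities
    print(needle_resistance(labels, ct, (15, 16, 16), velocity_mm_s=5.0))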
Three Dimensional Projection Environment for Molecular Design and Surgical Simulation
2011-08-01
…bypasses the cumbersome meshing process. The deformation model is comprised only of mass nodes, which are generated by sampling the object volume before… force should minimize the penetration volume, the haptic feedback force is derived directly. Additionally, a post-processing technique is developed to… render distinct physical tissue properties across different interaction areas. The proposed approach does not require any pre-processing and is…
Gurari, Netta; Baud-Bovy, Gabriel
2014-09-30
The emergence of commercial haptic devices offers new research opportunities to enhance our understanding of the human sensory-motor system. Yet, commercial device capabilities have limitations which need to be addressed. This paper describes the customization of a commercial force feedback device for displaying forces with a precision that exceeds the human force perception threshold. The device was outfitted with a multi-axis force sensor and closed-loop controlled to improve its transparency. Additionally, two force sensing resistors were attached to the device to measure grip force. Force errors were modeled in the frequency- and time-domain to identify contributions from the mass, viscous friction, and Coulomb friction during open- and closed-loop control. The effect of user interaction on system stability was assessed in the context of a user study which aimed to measure force perceptual thresholds. Findings based on 15 participants demonstrate that the system maintains stability when rendering forces ranging from 0-0.20 N, with an average maximum absolute force error of 0.041 ± 0.013 N. Modeling the force errors revealed that Coulomb friction and inertia were the main contributors to force distortions during respectively slow and fast motions. Existing commercial force feedback devices cannot render forces with the required precision for certain testing scenarios. Building on existing robotics work, this paper shows how a device can be customized to make it reliable for studying the perception of weak forces. The customized and closed-loop controlled device is suitable for measuring force perceptual thresholds. Copyright © 2014 Elsevier B.V. All rights reserved.
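One way to picture the closed-loop transparency improvement is a proportional-integral loop wrapped around the measured handle force, so that friction and inertia are compensated when rendering weak target forces. The gains, timestep and the instantaneous Coulomb-friction device model below are illustrative assumptions, not the customized device's controller:

    # Hedged sketch of closed-loop force rendering: a force sensor at the handle
    # measures the force actually delivered, and a PI loop corrects the motor
    # command so weak target forces are rendered despite friction.
    def render_force(target_n, measured_n, integral, kp=0.5, ki=40.0, dt=0.001):
        error = target_n - measured_n
        integral += error * dt
        command = target_n + kp * error + ki * integral   # feedforward + PI correction
        return command, integral

    def device(command_n, friction_n=0.03):
        """Toy device: the commanded force minus Coulomb friction reaches the handle."""
        return max(command_n - friction_n, 0.0)

    integral, measured = 0.0, 0.0
    for _ in range(500):
        command, integral = render_force(0.10, measured, integral)
        measured = device(command)
    print(f"delivered {measured:.3f} N toward the 0.100 N target")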
Deformation of Soft Tissue and Force Feedback Using the Smoothed Particle Hydrodynamics
Liu, Xuemei; Wang, Ruiyi; Li, Yunhua; Song, Dongdong
2015-01-01
We study the deformation and haptic feedback of soft tissue in virtual surgery, based on a liver model, using a force feedback device named PHANTOM OMNI developed by SensAble in the USA. Although a significant amount of research effort has been dedicated to simulating the behavior of soft tissue and implementing force feedback, it is still a challenging problem. This paper introduces a meshfree method for deformation simulation of soft tissue and force computation based on a viscoelastic mechanical model and smoothed particle hydrodynamics (SPH). Firstly, the viscoelastic model captures the mechanical characteristics of soft tissue, which greatly promotes realism. Secondly, SPH is a meshless and self-adaptive technique, which provides higher precision for force feedback computation than mesh-based methods. Finally, an SPH method based on a dynamic interaction area is proposed to improve the real-time performance of the simulation. The results reveal that the SPH methodology is suitable for simulating soft tissue deformation and calculating force feedback, and that SPH based on a dynamic local interaction area has significantly higher computational efficiency than standard SPH. Our algorithm has a bright prospect in the area of virtual surgery. PMID:26417380
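An SPH interaction of this kind sums particle contributions through a smoothing kernel and converts the local density excess into a repulsive force at the tool, restricted to particles inside the kernel support (a crude analogue of the dynamic local interaction area). The poly6 kernel, constants and linear state equation below are illustrative choices, not the paper's formulation:

    # Hedged sketch of an SPH-style tool force: densities are summed with a poly6
    # smoothing kernel over nearby particles only, and a simple repulsive pressure
    # force is returned for the tool position.
    import numpy as np

    H = 0.1                                     # smoothing length (support radius)
    POLY6 = 315.0 / (64.0 * np.pi * H**9)

    def w_poly6(r2):
        return POLY6 * (H * H - r2) ** 3 if r2 < H * H else 0.0

    def tool_force(tool_pos, particles, mass=0.02, rest_density=1000.0, k=0.5):
        diffs = particles - tool_pos
        r2 = np.einsum("ij,ij->i", diffs, diffs)
        near = r2 < H * H                       # dynamic local interaction area
        density = rest_density + mass * sum(w_poly6(d2) for d2 in r2[near])
        pressure = k * (density - rest_density) # linear state equation
        direction = -diffs[near].sum(axis=0)    # push the tool away from the tissue
        norm = np.linalg.norm(direction)
        return pressure * direction / norm if norm > 0 else np.zeros(3)

    tissue = np.random.rand(2000, 3) * 0.5      # toy particle cloud standing in for "liver"
    print(tool_force(np.array([0.25, 0.25, 0.25]), tissue))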
Real-time, haptics-enabled simulator for probing ex vivo liver tissue.
Lister, Kevin; Gao, Zhan; Desai, Jaydev P
2009-01-01
The advent of complex surgical procedures has driven the need for realistic surgical training simulators. Comprehensive simulators that provide realistic visual and haptic feedback during surgical tasks are required to familiarize surgeons with the procedures they are to perform. Complex organ geometry inherent to biological tissues and intricate material properties drive the need for finite element methods to ensure accurate tissue displacement and force calculations. Advances in real-time finite element methods have not yet reached the state where they are applicable to soft tissue surgical simulation. Therefore, a real-time, haptics-enabled simulator for probing soft tissue has been developed which utilizes preprocessed finite element data (derived from an accurate constitutive model of the soft tissue, obtained from carefully collected experimental data) to accurately replicate the probing task in real time.
Roughness Perception of Haptically Displayed Fractal Surfaces
NASA Technical Reports Server (NTRS)
Costa, Michael A.; Cutkosky, Mark R.; Lau, Sonie (Technical Monitor)
2000-01-01
Surface profiles were generated by a fractal algorithm and haptically rendered on a force feedback joystick. Subjects were asked to use the joystick to explore pairs of surfaces and report to the experimenter which of the surfaces they felt was rougher. Surfaces were characterized by their root mean square (RMS) amplitude and their fractal dimension. The most important factor affecting the perceived roughness of the fractal surfaces was the RMS amplitude of the surface. When comparing surfaces of fractal dimension 1.2-1.35, it was found that the fractal dimension was negatively correlated with perceived roughness.
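A profile with the two parameters varied in the study (fractal dimension and RMS amplitude) can be generated by midpoint displacement, using the relation H = 2 - D between the Hurst exponent and the fractal dimension of a 1-D profile. A minimal sketch; the level count is arbitrary and the rendering onto the joystick is not shown:

    # Hedged sketch of a 1-D fractal surface profile generated by midpoint
    # displacement and scaled to a target RMS amplitude.
    import numpy as np

    def fractal_profile(levels=10, fractal_dim=1.3, rms_target=1.0, seed=0):
        rng = np.random.default_rng(seed)
        hurst = 2.0 - fractal_dim
        profile = np.array([0.0, 0.0])
        scale = 1.0
        for _ in range(levels):
            midpoints = 0.5 * (profile[:-1] + profile[1:])
            midpoints += rng.normal(0.0, scale, size=midpoints.size)
            merged = np.empty(profile.size + midpoints.size)
            merged[0::2], merged[1::2] = profile, midpoints  # interleave old and new points
            profile = merged
            scale *= 0.5 ** hurst               # roughness decays with spatial scale
        profile -= profile.mean()
        return profile * (rms_target / np.sqrt(np.mean(profile ** 2)))

    rough = fractal_profile(fractal_dim=1.35, rms_target=0.5)
    print(rough.size, np.sqrt(np.mean(rough ** 2)))   # sample count and RMS amplitude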
Yu, Ningbo; Xu, Chang; Li, Huanshuai; Wang, Kui; Wang, Liancheng; Liu, Jingtai
2016-03-18
Disabilities after neural injury, such as stroke, bring tremendous burden to patients, families and society. Besides conventional constraint-induced training of the paretic arm, bilateral rehabilitation training involves both the ipsilateral and contralateral sides of the neural injury, fits well with the fact that both arms are needed in common activities of daily living (ADLs), and can promote good functional recovery. In this work, the fusion of a gesture sensor and a haptic device with force feedback capabilities has enabled a bilateral rehabilitation training therapy. The Leap Motion gesture sensor detects the motion of the healthy hand, and the omega.7 device can detect and assist the paretic hand, according to the designed cooperative task paradigm, as much as needed, with active force feedback to accomplish the manipulation task. A virtual scenario has been built up, and the motion and force data facilitate instantaneous visual and audio feedback, as well as further analysis of the functional capabilities of the patient. This task-oriented bimanual training paradigm recruits the sensory, motor and cognitive aspects of the patient into one loop, encourages the active involvement of the patients in rehabilitation training, strengthens the cooperation of both the healthy and impaired hands, challenges the dexterous manipulation capability of the paretic hand, is easy to use at home or in centralized institutions and, thus, shows promising potential for rehabilitation training.
Vision-Based Haptic Feedback for Remote Micromanipulation in-SEM Environment
NASA Astrophysics Data System (ADS)
Bolopion, Aude; Dahmen, Christian; Stolle, Christian; Haliyo, Sinan; Régnier, Stéphane; Fatikow, Sergej
2012-07-01
This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable non-expert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive the object positions from scanning electron microscope images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen, since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specific equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation involving melamine microspheres with a diameter of less than 2 μm, between Paris, France and Oldenburg, Germany.
Sensing and Force-Feedback Exoskeleton (SAFE) Robotic Glove.
Ben-Tzvi, Pinhas; Ma, Zhou
2015-11-01
This paper presents the design, implementation and experimental validation of a novel robotic haptic exoskeleton device to measure the user's hand motion and assist hand motion while remaining portable and lightweight. The device consists of a five-finger mechanism actuated with miniature DC motors through antagonistically routed cables at each finger, which act as both active and passive force actuators. The SAFE Glove is a wireless and self-contained mechatronic system that mounts over the dorsum of a bare hand and provides haptic force feedback to each finger. The glove is adaptable to a wide variety of finger sizes without constraining the range of motion. This makes it possible to accurately and comfortably track the complex motion of the finger and thumb joints associated with common movements of hand functions, including grip and release patterns. The glove can be wirelessly linked to a computer for displaying and recording the hand status through 3D Graphical User Interface (GUI) in real-time. The experimental results demonstrate that the SAFE Glove is capable of reliably modeling hand kinematics, measuring finger motion and assisting hand grasping motion. Simulation and experimental results show the potential of the proposed system in rehabilitation therapy and virtual reality applications.
Prasad, Raghu; Muniyandi, Manivannan; Manoharan, Govindan; Chandramohan, Servarayan M
2018-05-01
The purpose of this study was to examine the face and construct validity of a custom-developed bimanual laparoscopic force-skills trainer with haptics feedback. The study also examined the effect of handedness on fundamental and complex tasks. Residents (n = 25) and surgeons (n = 25) performed virtual reality-based bimanual fundamental and complex tasks. Tool-tissue reaction forces were summed, recorded, and analysed. Seven different force-based measures and a 1-time measure were used as metrics. Subsequently, participants filled out face validity and demographic questionnaires. Residents and surgeons were positive on the design, workspace, and usefulness of the simulator. Construct validity results showed significant differences between residents and experts during the execution of fundamental and complex tasks. In both tasks, residents applied large forces with higher coefficient of variation and force jerks (P < .001). Experts, with their dominant hand, applied lower forces in complex tasks and higher forces in fundamental tasks (P < .001). The coefficients of force variation (CoV) of residents and experts were higher in complex tasks (P < .001). Strong correlations were observed between CoV and task time for fundamental (r = 0.70) and complex tasks (r = 0.85). Range of smoothness of force was higher for the non-dominant hand in both fundamental and complex tasks. The simulator was able to differentiate the force-skills of residents and surgeons, and objectively evaluate the effects of handedness on laparoscopic force-skills. Competency-based laparoscopic skills assessment curriculum should be updated to meet the requirements of bimanual force-based training.
Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T
2007-07-01
Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization, and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof-of-concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.
Merians, Alma S; Fluet, Gerard G; Qiu, Qinyin; Lafond, Ian; Adamovich, Sergei V
2011-01-01
Robotic systems that are interfaced with virtual reality gaming and task simulations are increasingly being developed to provide repetitive intensive practice to promote increased compliance and facilitate better outcomes in rehabilitation post-stroke. A major development in the use of virtual environments (VEs) has been to incorporate tactile information and interaction forces into what was previously an essentially visual experience. Robots of varying complexity are being interfaced with more traditional virtual presentations to provide haptic feedback that enriches the sensory experience and adds physical task parameters. This provides forces that produce biomechanical and neuromuscular interactions with the VE that approximate real-world movement more accurately than visual-only VEs, simulating the weight and force found in upper extremity tasks. The purpose of this article is to present an overview of several systems that are commercially available for ambulation training and for training movement of the upper extremity. We will also report on the system that we have developed (NJIT-RAVR system) that incorporates motivating and challenging haptic feedback effects into VE simulations to facilitate motor recovery of the upper extremity post-stroke. The NJIT-RAVR system trains both the upper arm and the hand. The robotic arm acts as an interface between the participants and the VEs, enabling multiplanar movements against gravity in a three-dimensional workspace. The ultimate question is whether this medium can provide a motivating, challenging, gaming experience with dramatically decreased physical difficulty levels, which would allow for participation by an obese person and facilitate greater adherence to exercise regimes. PMID:21527097
Prototype tactile feedback system for examination by skin touch.
Lee, O; Lee, K; Oh, C; Kim, K; Kim, M
2014-08-01
Diagnosis of conditions such as psoriasis and atopic dermatitis, in the case of induration, involves palpating the affected area by hand and then selecting a rating score. However, the score is determined based on the tester's experience and standards, making it subjective. To provide tactile feedback on the skin, we developed a prototype tactile feedback system that simulates skin wrinkles with the PHANToM OMNI. To provide the user with tactile feedback on skin wrinkles, a visual and haptic augmented reality system was developed. First, a pair of stereo skin images obtained by a stereo camera generates a disparity map of skin wrinkles. Second, the generated disparity map is sent to an implemented tactile rendering algorithm that computes a reaction force according to the user's interaction with the skin image. We first obtained a stereo image of skin wrinkles from the in vivo stereo imaging system, which has a baseline of 50.8 μm, and obtained the disparity map with a graph cuts algorithm. The left image is displayed on the monitor to enable the user to recognize the location visually. The disparity map of the skin wrinkle image sends skin wrinkle information as a tactile response to the user through a haptic device. We successfully developed a tactile feedback system for virtual skin wrinkle simulation by means of a commercialized haptic device that provides the user with a single point of contact to feel the surface roughness of a virtual skin sample. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Haptic feedback can provide an objective assessment of arthroscopic skills.
Chami, George; Ward, James W; Phillips, Roger; Sherman, Kevin P
2008-04-01
The outcome of arthroscopic procedures is related to the surgeon's skills in arthroscopy. Currently, evaluation of such skills relies on direct observation by a surgeon trainer. This type of assessment, by its nature, is subjective and time-consuming. The aim of our study was to identify whether haptic information generated from arthroscopic tools could distinguish between skilled and less skilled surgeons. A standard arthroscopic probe was fitted with a force/torque sensor. The probe was used by five surgeons with different levels of experience in knee arthroscopy performing 11 different tasks in 10 standard knee arthroscopies. The force/torque data from the hand and tool interface were recorded and synchronized with a video recording of the procedure. The torque magnitude and patterns generated were analyzed and compared. A computerized system was used to analyze the force/torque signature based on general principles for quality of performance using such measures as economy in movement, time efficiency, and consistency in performance. The results showed a considerable correlation between three haptic parameters and the surgeon's experience, which could be used in an automated objective assessment system for arthroscopic surgery. Level II, diagnostic study. See the Guidelines for Authors for a complete description of levels of evidence.
Development of a StandAlone Surgical Haptic Arm.
Jones, Daniel; Lewis, Andrew; Fischer, Gregory S
2011-01-01
When performing telesurgery with current commercially available Minimally Invasive Robotic Surgery (MIRS) systems, a surgeon cannot feel the tool interactions that are inherent in traditional laparoscopy. It is proposed that haptic feedback in the control of MIRS systems could improve the speed, safety and learning curve of robotic surgery. To test this hypothesis, a standalone surgical haptic arm (SASHA) capable of manipulating da Vinci tools has been designed and fabricated with the additional ability of providing information for haptic feedback. This arm was developed as a research platform for developing and evaluating approaches to telesurgery, including various haptic mappings between master and slave and evaluating the effects of latency.
Rajanna, Vijay; Vo, Patrick; Barth, Jerry; Mjelde, Matthew; Grey, Trevor; Oduola, Cassandra; Hammond, Tracy
2016-03-01
A carefully planned, structured, and supervised physiotherapy program following surgery is crucial for successful recovery from physical injuries. Nearly 50% of surgeries fail due to unsupervised and erroneous physiotherapy. Retaining a physiotherapist for an extended period is expensive and sometimes inaccessible. Researchers have tried to leverage advancements in wearable sensors and motion tracking by building affordable, automated physio-therapeutic systems that direct a physiotherapy session by providing audio-visual feedback on the patient's performance. There are many aspects of automated physiotherapy programs which are yet to be addressed by existing systems: a wide classification of patients' physiological conditions to be diagnosed, multiple demographics of the patients (blind, deaf, etc.), and the need to persuade patients to adopt the system for an extended period for self-care. In our research, we have tried to address these aspects by building a health behavior change support system called KinoHaptics, for post-surgery rehabilitation. KinoHaptics is an automated, wearable, haptic-assisted, physio-therapeutic system that can be used by a wide variety of demographics and for various physiological conditions of the patients. The system provides rich and accurate vibro-haptic feedback that can be felt by the user, irrespective of physiological limitations. KinoHaptics is built to ensure that no injuries are induced during the rehabilitation period. The persuasive nature of the system allows for personal goal-setting, progress tracking, and, most importantly, life-style compatibility. The system was evaluated under laboratory conditions, involving 14 users. Results show that KinoHaptics is highly convenient to use, and that the vibro-haptic feedback is intuitive, accurate, and has been shown to prevent accidental injuries. Also, results show that KinoHaptics is persuasive in nature, as it supports behavior change and habit building. The successful acceptance of KinoHaptics, an automated, wearable, haptic-assisted, physio-therapeutic system, proves the need for and future scope of automated physio-therapeutic systems for self-care and behavior change. It also proves that such systems, when incorporating vibro-haptic feedback, encourage strong adherence to the physiotherapy program and can have a profound impact on the physiotherapy experience, resulting in a higher acceptance rate.
Prototyping a Hybrid Cooperative and Tele-robotic Surgical System for Retinal Microsurgery.
Balicki, Marcin; Xia, Tian; Jung, Min Yang; Deguet, Anton; Vagvolgyi, Balazs; Kazanzides, Peter; Taylor, Russell
2011-06-01
This paper presents the design of a tele-robotic microsurgical platform designed for development of cooperative and tele-operative control schemes, sensor based smart instruments, user interfaces and new surgical techniques with eye surgery as the driving application. The system is built using the distributed component-based cisst libraries and the Surgical Assistant Workstation framework. It includes a cooperatively controlled EyeRobot2, a da Vinci Master manipulator, and a remote stereo visualization system. We use constrained optimization based virtual fixture control to provide Virtual Remote-Center-of-Motion (vRCM) and haptic feedback. Such system can be used in a hybrid setup, combining local cooperative control with remote tele-operation, where an experienced surgeon can provide hand-over-hand tutoring to a novice user. In another scheme, the system can provide haptic feedback based on virtual fixtures constructed from real-time force and proximity sensor information.
Smart glove: hand master using magnetorheological fluid actuators
NASA Astrophysics Data System (ADS)
Nam, Y. J.; Park, M. K.; Yamane, R.
2007-12-01
In this study, a hand master using five miniature magneto-rheological (MR) actuators, called 'the smart glove', is introduced. This hand master is intended to display haptic feedback to the fingertips of a human user interacting with virtual objects in a virtual environment. For the smart glove, two effective approaches are proposed: (i) by using MR actuators, which can be considered passive actuators, the smart glove is made simple in structure, high in power, low in inertia, safe in interfacing and stable in haptic feedback, and (ii) with a novel flexible link mechanism designed for position-force transmission between the fingertips and the actuators, the number of actuators and the weight of the smart glove can be reduced. These features improve the manipulability and portability of the smart glove. The feasibility of the constructed smart glove is verified through a basic performance evaluation.
A virtual reality based simulator for learning nasogastric tube placement.
Choi, Kup-Sze; He, Xuejian; Chiang, Vico Chung-Lim; Deng, Zhaohong
2015-02-01
Nasogastric tube (NGT) placement is a common clinical procedure where a plastic tube is inserted into the stomach through the nostril for feeding or drainage. However, the placement is a blind process in which the tube may be mistakenly inserted into other locations, leading to unexpected complications or fatal incidents. The placement techniques are conventionally acquired by practising on unrealistic rubber mannequins or on humans. In this paper, a virtual reality based training simulation system is proposed to facilitate the training of NGT placement. It focuses on the simulation of tube insertion and the rendering of the feedback forces with a haptic device. A hybrid force model is developed to compute the forces analytically or numerically under different conditions, including the situations when the patient is swallowing or when the tube is buckled at the nostril. To ensure real-time interactive simulations, an offline simulation approach is adopted to obtain the relationship between the insertion depth and insertion force using a non-linear finite element method. The offline dataset is then used to generate real-time feedback forces by interpolation. The virtual training process is logged quantitatively with metrics that can be used for assessing objective performance and tracking progress. The system has been evaluated by nursing professionals. They found that the haptic feeling produced by the simulated forces is similar to their experience during real NGT insertion. The proposed system provides a new educational tool to enhance conventional training in NGT placement. Copyright © 2014 Elsevier Ltd. All rights reserved.
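The offline-simulation approach described above reduces, at its core, to a table lookup: the nonlinear finite-element runs produce a depth-force relationship that is then interpolated at haptic rates. A minimal sketch of that idea follows; the sample data, the interpolation call, and the loop rate are illustrative assumptions, not values from the paper.

    import numpy as np

    # Illustrative offline dataset: insertion depth (mm) vs. resistance force (N),
    # standing in for the depth-force relationship precomputed by the FE solver.
    depth_samples = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])   # mm
    force_samples = np.array([0.0, 0.3, 0.8,  1.2,  1.6,  2.5])    # N

    def feedback_force(insertion_depth_mm: float) -> float:
        """Interpolate the precomputed curve to obtain a real-time feedback force."""
        return float(np.interp(insertion_depth_mm, depth_samples, force_samples))

    # Inside an (assumed ~1 kHz) haptic loop one would call, e.g.:
    # f = feedback_force(current_depth); haptic_device.apply_force(f)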
Training Toddlers Seated on Mobile Robots to Steer Using Force-Feedback Joystick.
Agrawal, S K; Xi Chen; Ragonesi, C; Galloway, J C
2012-01-01
The broader goal of our research is to train infants with special needs to safely and purposefully drive a mobile robot to explore the environment. The hypothesis is that these impaired infants will benefit from mobility in their early years and attain childhood milestones similar to their healthy peers. In this paper, we present an algorithm and training method using a force-feedback joystick with an "assist-as-needed" paradigm for driving training. In this "assist-as-needed" approach, if the child steers the joystick outside a force tunnel centered on the desired direction, the driver experiences a bias force on the hand. We show, through a group study on typically developing toddlers, that such a haptic guidance algorithm is superior to training with a conventional joystick. We also provide a case study on two special-needs children, under three years old, who learned to make sharp turns during driving when trained over a five-day period with the force-feedback joystick using the algorithm.
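As a rough illustration of an "assist-as-needed" force tunnel of the kind described above: the joystick is left free inside a deadband around the desired heading, and a corrective bias force grows only once the deviation leaves the tunnel. The tunnel width, gain, and 2-D joystick representation below are assumptions for the sketch, not the authors' parameters.

    import numpy as np

    def tunnel_bias_force(stick_dir, desired_dir, half_width_rad=0.2, gain=2.0):
        """Return a signed corrective force for a 2-D joystick heading error.

        stick_dir, desired_dir: unit vectors of the current and desired headings.
        No force inside the tunnel; outside, the force grows with the excess angle.
        """
        cross = stick_dir[0] * desired_dir[1] - stick_dir[1] * desired_dir[0]
        angle = np.arctan2(cross, np.dot(stick_dir, desired_dir))   # signed error
        excess = max(abs(angle) - half_width_rad, 0.0)
        return -np.sign(angle) * gain * excess                      # pushes back toward the tunnel

    # Example: stick pointing about 30 degrees off a straight-ahead desired direction.
    f = tunnel_bias_force(np.array([np.sin(0.52), np.cos(0.52)]), np.array([0.0, 1.0]))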
Adaptive space warping to enhance passive haptics in an arthroscopy surgical simulator.
Spillmann, Jonas; Tuchschmid, Stefan; Harders, Matthias
2013-04-01
Passive haptics, also known as tactile augmentation, denotes the use of a physical counterpart to a virtual environment to provide tactile feedback. Employing passive haptics can result in more realistic touch sensations than those from active force feedback, especially for rigid contacts. However, changes in the virtual environment would necessitate modifications of the physical counterparts. In recent work space warping has been proposed as one solution to overcome this limitation. In this technique virtual space is distorted such that a variety of virtual models can be mapped onto one single physical object. In this paper, we propose as an extension adaptive space warping; we show how this technique can be employed in a mixed-reality surgical training simulator in order to map different virtual patients onto one physical anatomical model. We developed methods to warp different organ geometries onto one physical mock-up, to handle different mechanical behaviors of the virtual patients, and to allow interactive modifications of the virtual structures, while the physical counterparts remain unchanged. Various practical examples underline the wide applicability of our approach. To the best of our knowledge this is the first practical usage of such a technique in the specific context of interactive medical training.
Cuppone, Anna Vera; Squeri, Valentina; Semprini, Marianna; Masia, Lorenzo; Konczak, Jürgen
2016-01-01
This study examined the trainability of the proprioceptive sense and explored the relationship between proprioception and motor learning. With vision blocked, human learners had to perform goal-directed wrist movements relying solely on proprioceptive/haptic cues to reach several haptically specified targets. One group received additional somatosensory movement error feedback in the form of vibro-tactile cues applied to the skin of the forearm. We used a haptic robotic device for the wrist and implemented a 3-day training regimen that required learners to make spatially precise goal-directed wrist reaching movements without vision. We assessed whether training improved the acuity of the wrist joint position sense. In addition, we checked if sensory learning generalized to the motor domain and improved the spatial precision of wrist tracking movements that were not trained. The main findings of the study are as follows. First, proprioceptive acuity of the wrist joint position sense improved after training for the group that received the combined proprioceptive/haptic and vibro-tactile feedback (VTF). Second, training had no impact on the spatial accuracy of the untrained tracking task. However, learners who had received VTF significantly reduced their reliance on haptic guidance feedback when performing the untrained motor task. That is, concurrent VTF was highly salient movement feedback and obviated the need for haptic feedback. Third, VTF can also be provided to a limb not involved in the task: learners who received VTF on the contralateral limb benefitted equally. In conclusion, somatosensory training can significantly enhance proprioceptive acuity within days when learning is coupled with vibro-tactile sensory cues that provide feedback about movement errors. The observed sensory improvements in proprioception facilitate motor learning, and such learning may generalize to the sensorimotor control of untrained motor tasks. The implications of these findings for neurorehabilitation are discussed.
Patient DF's visual brain in action: Visual feedforward control in visual form agnosia.
Whitwell, Robert L; Milner, A David; Cavina-Pratesi, Cristiana; Barat, Masihullah; Goodale, Melvyn A
2015-05-01
Patient DF, who developed visual form agnosia following ventral-stream damage, is unable to discriminate the width of objects, performing at chance, for example, when asked to open her thumb and forefinger a matching amount. Remarkably, however, DF adjusts her hand aperture to accommodate the width of objects when reaching out to pick them up (grip scaling). While this spared ability to grasp objects is presumed to be mediated by visuomotor modules in her relatively intact dorsal stream, it is possible that it may rely abnormally on online visual or haptic feedback. We report here that DF's grip scaling remained intact when her vision was completely suppressed during grasp movements, and it still dissociated sharply from her poor perceptual estimates of target size. We then tested whether providing trial-by-trial haptic feedback after making such perceptual estimates might improve DF's performance, but found that they remained significantly impaired. In a final experiment, we re-examined whether DF's grip scaling depends on receiving veridical haptic feedback during grasping. In one condition, the haptic feedback was identical to the visual targets. In a second condition, the haptic feedback was of a constant intermediate width while the visual target varied trial by trial. Despite this incongruent feedback, DF still scaled her grip aperture to the visual widths of the target blocks, showing only normal adaptation to the false haptically-experienced width. Taken together, these results strengthen the view that DF's spared grasping relies on a normal mode of dorsal-stream functioning, based chiefly on visual feedforward processing. Copyright © 2014 Elsevier B.V. All rights reserved.
Merians, Alma S; Fluet, Gerard G; Qiu, Qinyin; Lafond, Ian; Adamovich, Sergei V
2011-03-01
Robotic systems that are interfaced with virtual reality gaming and task simulations are increasingly being developed to provide repetitive intensive practice to promote increased compliance and facilitate better outcomes in rehabilitation post-stroke. A major development in the use of virtual environments (VEs) has been to incorporate tactile information and interaction forces into what was previously an essentially visual experience. Robots of varying complexity are being interfaced with more traditional virtual presentations to provide haptic feedback that enriches the sensory experience and adds physical task parameters. This provides forces that produce biomechanical and neuromuscular interactions with the VE that approximate real-world movement more accurately than visual-only VEs, simulating the weight and force found in upper extremity tasks. The purpose of this article is to present an overview of several systems that are commercially available for ambulation training and for training movement of the upper extremity. We will also report on the system that we have developed (NJIT-RAVR system) that incorporates motivating and challenging haptic feedback effects into VE simulations to facilitate motor recovery of the upper extremity post-stroke. The NJIT-RAVR system trains both the upper arm and the hand. The robotic arm acts as an interface between the participants and the VEs, enabling multiplanar movements against gravity in a three-dimensional workspace. The ultimate question is whether this medium can provide a motivating, challenging, gaming experience with dramatically decreased physical difficulty levels, which would allow for participation by an obese person and facilitate greater adherence to exercise regimes. © 2011 Diabetes Technology Society.
Self-Control of Haptic Assistance for Motor Learning: Influences of Frequency and Opinion of Utility
Williams, Camille K.; Tseung, Victrine; Carnahan, Heather
2017-01-01
Studies of self-controlled practice have shown benefits when learners controlled the feedback schedule, the use of assistive devices, and task difficulty, with benefits attributed to the information-processing and motivational advantages of self-control. Although haptic assistance serves as feedback, aids task performance, and modifies task difficulty, researchers have yet to explore whether self-control over haptic assistance could be beneficial for learning. We explored whether self-control of haptic assistance would be beneficial for learning a tracing task. Self-controlled participants selected the practice blocks on which they would receive haptic assistance, while participants in a yoked group received haptic assistance on blocks determined by a matched self-controlled participant. We inferred learning from performance on retention tests without haptic assistance. From qualitative analysis of open-ended questions related to rationales for/experiences of the haptic assistance that was chosen/provided, themes emerged regarding participants' views of the utility of haptic assistance for performance and learning. Results showed that, for self-controlled participants only, learning was directly impacted by the frequency of haptic assistance and by their view of haptic assistance. Furthermore, self-controlled participants' views were significantly associated with their requested haptic assistance frequency. We discuss these findings as further support for the beneficial role of self-controlled practice for motor learning. PMID:29255438
Training Surgical Residents With a Haptic Robotic Central Venous Catheterization Simulator.
Pepley, David F; Gordon, Adam B; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z
Ultrasound-guided central venous catheterization (CVC) is a common surgical procedure with complication rates ranging from 5 to 21 percent. Training is typically performed using manikins that do not simulate anatomical variations such as obesity and abnormal vessel positioning. The goal of this study was to develop and validate the effectiveness of a new virtual reality and force-haptic based simulation platform for CVC of the right internal jugular vein. A CVC simulation platform was developed using a haptic robotic arm, a 3D position tracker, and computer visualization. The haptic robotic arm simulated needle insertion forces based on cadaver experiments. The 3D position tracker was used as a mock ultrasound device with realistic visualization on a computer screen. Upon completion of a practice simulation, performance feedback is given to the user through a graphical user interface, including scoring factors based on good CVC practice. The effectiveness of the system was evaluated by training 13 first-year surgical residents using the virtual reality haptic based training system over a 3-month period. The participants' performance increased from 52% to 96% on the baseline training scenario, approaching the average score of an expert surgeon: 98%. Training also resulted in improvements in positive CVC practices, including a 61% decrease in the distance between the final needle tip position and the vein center, a decrease in mean insertion attempts from 1.92 to 1.23, and a 12% increase in time spent aspirating the syringe throughout the procedure. A virtual reality haptic robotic simulator for CVC was successfully developed. Surgical residents training on the simulation improved to near-expert levels after three robotic training sessions. This suggests that the system could act as an effective training device for CVC. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Enhancing audiovisual experience with haptic feedback: a survey on HAV.
Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M
2013-01-01
Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.
Haptic interface of web-based training system for interventional radiology procedures
NASA Astrophysics Data System (ADS)
Ma, Xin; Lu, Yiping; Loe, KiaFock; Nowinski, Wieslaw L.
2004-05-01
The existing web-based medical training systems and surgical simulators can provide affordable and accessible medical training curricula, but they seldom offer the trainee realistic and affordable haptic feedback. Therefore, they cannot offer the trainee a suitable practicing environment. In this paper, a haptic solution for interventional radiology (IR) procedures is proposed. The system architecture of a web-based training system for IR procedures is briefly presented first. Then, the mechanical structure, working principle, and application of a haptic device are discussed in detail. The haptic device works as an interface between the training environment and the trainee and is placed at the end-user side. With the system, users can be trained on interventional radiology procedures - navigating catheters, inflating balloons, deploying coils, and placing stents - over the web while receiving surgical haptic feedback in real time.
Development of a Haptic Interface for Natural Orifice Translumenal Endoscopic Surgery Simulation
Dargar, Saurabh; Sankaranarayanan, Ganesh
2016-01-01
Natural orifice translumenal endoscopic surgery (NOTES) is a minimally invasive procedure, which utilizes the body’s natural orifices to gain access to the peritoneal cavity. The NOTES procedure is designed to minimize external scarring and patient trauma, however flexible endoscopy based pure NOTES procedures require critical scope handling skills. The delicate nature of the NOTES procedure requires extensive training, thus to improve access to training while reducing risk to patients we have designed and developed the VTEST©, a virtual reality NOTES simulator. As part of the simulator, a novel decoupled 2-DOF haptic device was developed to provide realistic force feedback to the user in training. A series of experiments were performed to determine the behavioral characteristics of the device. The device was found capable of rendering up to 5.62N and 0.190Nm of continuous force and torque in the translational and rotational DOF, respectively. The device possesses 18.1Hz and 5.7Hz of force bandwidth in the translational and rotational DOF, respectively. A feedforward friction compensator was also successfully implemented to minimize the negative impact of friction during the interaction with the device. In this work we have presented the detailed development and evaluation of the haptic device for the VTEST©. PMID:27008674
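A feedforward friction compensator of the general Coulomb-plus-viscous type can be sketched as below; the friction parameters, the velocity deadband, and the function interface are placeholders, since the paper does not list its identified friction model.

    def friction_feedforward(desired_force, velocity,
                             coulomb=0.4, viscous=0.05, v_eps=1e-3):
        """Add a feedforward friction term to the commanded actuator force.

        coulomb [N] and viscous [N*s/m] are assumed identified friction parameters.
        A small velocity deadband avoids sign chatter near zero velocity.
        """
        if abs(velocity) < v_eps:
            friction = 0.0
        else:
            friction = coulomb * (1.0 if velocity > 0 else -1.0) + viscous * velocity
        return desired_force + friction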
Coaxial needle insertion assistant with enhanced force feedback.
De Lorenzo, Danilo; Koseki, Yoshihiko; De Momi, Elena; Chinzei, Kiyoyuki; Okamura, Allison M
2013-02-01
Many medical procedures involving needle insertion into soft tissues, such as anesthesia, biopsy, brachytherapy, and placement of electrodes, are performed without image guidance. In such procedures, haptic detection of changing tissue properties at different depths during needle insertion is important for needle localization and detection of subsurface structures. However, changes in tissue mechanical properties deep inside the tissue are difficult for human operators to sense, because the relatively large friction force between the needle shaft and the surrounding tissue masks the smaller tip forces. A novel robotic coaxial needle insertion assistant, which enhances operator force perception, is presented. This one-degree-of-freedom cable-driven robot provides to the operator a scaled version of the force applied by the needle tip to the tissue, using a novel design and sensors that separate the needle tip force from the shaft friction force. The ability of human operators to use the robot to detect membranes embedded in artificial soft tissue was tested under the conditions of 1) tip force and shaft force feedback, and 2) tip force only feedback. The ratio of successful to unsuccessful membrane detections was significantly higher (up to 50%) when only the needle tip force was provided to the user.
Haptic display for the VR arthroscopy training simulator
NASA Astrophysics Data System (ADS)
Ziegler, Rolf; Brandt, Christoph; Kunstmann, Christian; Mueller, Wolfgang; Werkhaeuser, Holger
1997-05-01
A specific desire to find new training methods arose from the new field of minimally invasive surgery. With technical advances, modern video arthroscopy became the standard procedure in the OR. Holding the optical system with the video camera in one hand and watching the operating field on the monitor, the surgeon's other hand is free to guide, e.g., a probe. As arthroscopy became a more common procedure, it became obvious that some sort of special training was necessary to guarantee a certain level of qualification of the surgeons. Therefore, a hospital in Frankfurt, Germany, approached the Fraunhofer Institute for Computer Graphics to develop a training system for arthroscopy based on VR techniques. The main drawback of the simulator developed so far is the lack of haptic perception, especially force feedback. In cooperation with the Department of Electro-Mechanical Construction at the Darmstadt Technical University, we have designed and built a haptic display for the VR arthroscopy training simulator. In parallel, we developed a concept for integrating the haptic display in a configurable way.
Soft tissue deformation modelling through neural dynamics-based reaction-diffusion mechanics.
Zhang, Jinao; Zhong, Yongmin; Gu, Chengfan
2018-05-30
Soft tissue deformation modelling forms the basis of the development of surgical simulation, surgical planning and robotic-assisted minimally invasive surgery. This paper presents a new methodology for the modelling of soft tissue deformation based on reaction-diffusion mechanics via neural dynamics. The potential energy stored in soft tissues due to a mechanical load applied to deform tissues away from their rest state is treated as the equivalent transmembrane potential energy, and it is distributed in the tissue masses in the manner of reaction-diffusion propagation of nonlinear electrical waves. The reaction-diffusion propagation of mechanical potential energy and the nonrigid mechanics of motion are combined to model soft tissue deformation and its dynamics, both of which are further formulated as the dynamics of cellular neural networks to achieve real-time computational performance. The proposed methodology is implemented with a haptic device for interactive soft tissue deformation with force feedback. Experimental results demonstrate that the proposed methodology exhibits a nonlinear force-displacement relationship for nonlinear soft tissue deformation. Homogeneous, anisotropic and heterogeneous soft tissue material properties can be modelled through the inherent physical properties of mass points.
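As a generic illustration only (a 1-D explicit reaction-diffusion update, not the paper's cellular-neural-network formulation), the idea of distributing a load-induced potential through the tissue mass by reaction-diffusion can be sketched as follows; the diffusion coefficient, reaction rate, time step, and point load are assumed values.

    import numpy as np

    def reaction_diffusion_step(u, D=0.1, k=0.05, dt=0.01, load=None):
        """One explicit update of a 1-D reaction-diffusion field u.

        u:    potential per mass point (arbitrary units)
        D:    diffusion coefficient, k: local reaction (decay) rate -- assumed values
        load: optional external input injected at loaded nodes
        """
        lap = np.zeros_like(u)
        lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]        # discrete Laplacian
        du = D * lap - k * u
        if load is not None:
            du += load
        return u + dt * du

    u = np.zeros(50)
    load = np.zeros(50); load[25] = 1.0                    # point load at the middle node
    for _ in range(200):
        u = reaction_diffusion_step(u, load=load)          # potential spreads from the load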
Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block.
Corrêa, Cléber Gimenez; Machado, Maria Aparecida de Andrade Moreira; Ranzini, Edith; Tori, Romero; Nunes, Fátima de Lourdes Santos
2017-01-01
This study presents the development and validation of a dental anesthesia-training simulator, specifically for the inferior alveolar nerve block (IANB). The system provides the tactile sensation of inserting a real needle into a human patient, using Virtual Reality (VR) techniques and a haptic device that provides perceived force feedback in the needle insertion task during the anesthesia procedure. To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to the haptic device. The Volere method was used to elicit requirements from users in the dentistry area; repeated-measures two-way ANOVA (analysis of variance), Tukey post-hoc tests, and averages were used for the analysis of results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire covered the participant profile, preferences (number of viewpoints, texture of the objects, and haptic device handler), as well as visual aspects (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). The visual aspect was considered appropriate, while the haptic feedback needs improvement, which users can achieve by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, significant at p<0.01). Users preferred the simulator with two viewpoints, objects textured from images, and the device with a syringe coupled to it. The simulation was considered thoroughly satisfactory for anesthesia training with respect to the needle insertion task, which includes the correct insertion point and depth as well as the perception of tissue resistance during insertion.
Robot-assisted training of the kinesthetic sense: enhancing proprioception after stroke.
De Santis, Dalia; Zenzeri, Jacopo; Casadio, Maura; Masia, Lorenzo; Riva, Assunta; Morasso, Pietro; Squeri, Valentina
2014-01-01
Proprioception has a crucial role in promoting or hindering motor learning. In particular, an intact position sense strongly correlates with the chances of recovery after stroke. A great majority of neurological patients present both motor dysfunctions and impairments in kinesthesia, but traditional robot and virtual reality training techniques focus either in recovering motor functions or in assessing proprioceptive deficits. An open challenge is to implement effective and reliable tests and training protocols for proprioception that go beyond the mere position sense evaluation and exploit the intrinsic bidirectionality of the kinesthetic sense, which refers to both sense of position and sense of movement. Modulated haptic interaction has a leading role in promoting sensorimotor integration, and it is a natural way to enhance volitional effort. Therefore, we designed a preliminary clinical study to test a new proprioception-based motor training technique for augmenting kinesthetic awareness via haptic feedback. The feedback was provided by a robotic manipulandum and the test involved seven chronic hemiparetic subjects over 3 weeks. The protocol included evaluation sessions that consisted of a psychometric estimate of the subject's kinesthetic sensation, and training sessions, in which the subject executed planar reaching movements in the absence of vision and under a minimally assistive haptic guidance made by sequences of graded force pulses. The bidirectional haptic interaction between the subject and the robot was optimally adapted to each participant in order to achieve a uniform task difficulty over the workspace. All the subjects consistently improved in the perceptual scores as a consequence of training. Moreover, they could minimize the level of haptic guidance in time. Results suggest that the proposed method is effective in enhancing kinesthetic acuity, but the level of impairment may affect the ability of subjects to retain their improvement in time.
Olsson, Pontus; Nysjö, Fredrik; Hirsch, Jan-Michaél; Carlbom, Ingrid B
2013-11-01
Cranio-maxillofacial (CMF) surgery to restore normal skeletal anatomy in patients with serious trauma to the face can be both complex and time-consuming. However, it is generally accepted that careful pre-operative planning leads to a better outcome with a higher degree of function and reduced morbidity, in addition to reduced time in the operating room. Today's surgery planning systems are nevertheless primitive, relying mostly on the user's ability to plan complex tasks with a two-dimensional graphical interface. We present a system for planning the restoration of skeletal anatomy in facial trauma patients using a virtual model derived from patient-specific CT data. The system combines stereo visualization with six degrees-of-freedom, high-fidelity haptic feedback that enables analysis, planning, and preoperative testing of alternative solutions for restoring bone fragments to their proper positions. The stereo display provides accurate visual spatial perception, and the haptics system provides intuitive haptic feedback when bone fragments are in contact, as well as six degrees-of-freedom attraction forces for precise bone fragment alignment. A senior surgeon without prior experience of the system received 45 min of system training. Following the training session, he completed a virtual reconstruction of a complex mandibular fracture in 22 min, with an adequately reduced result. Preliminary testing with one surgeon indicates that our surgery planning system, which combines stereo visualization with sophisticated haptics, has the potential to become a powerful tool for CMF surgery planning. With little training, it allows a surgeon to complete a complex plan in a short amount of time.
Schvartzman, Sara C; Silva, Rebeka; Salisbury, Ken; Gaudilliere, Dyani; Girod, Sabine
2014-10-01
Computer-assisted surgical (CAS) planning tools have become widely available in craniomaxillofacial surgery, but are time consuming and often require professional technical assistance to simulate a case. An initial oral and maxillofacial (OM) surgical user experience was evaluated with a newly developed CAS system featuring a bimanual sense of touch (haptic). Three volunteer OM surgeons received a 5-minute verbal introduction to the use of a newly developed haptic-enabled planning system. The surgeons were instructed to simulate mandibular fracture reductions of 3 clinical cases, within a 15-minute time limit and without a time limit, and complete a questionnaire to assess their subjective experience with the system. Standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome were compared. After the 5-minute instruction, all 3 surgeons were able to use the system independently. The analysis of standardized anatomic measurements showed that the simulation results within a 15-minute time limit were not significantly different from those without a time limit. Mean differences between measurements of surgical and simulated fracture reductions were within current resolution limitations in collision detection, segmentation of computed tomographic scans, and haptic devices. All 3 surgeons reported that the system was easy to learn and use and that they would be comfortable integrating it into their daily clinical practice for trauma cases. A CAS system with a haptic interface that capitalizes on touch and force feedback experience similar to operative procedures is fast and easy for OM surgeons to learn and use. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. All rights reserved.
Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery.
Pacchierotti, Claudio; Prattichizzo, Domenico; Kuchenbecker, Katherine J
2016-02-01
Despite its expected clinical benefits, current teleoperated surgical robots do not provide the surgeon with haptic feedback largely because grounded forces can destabilize the system's closed-loop controller. This paper presents an alternative approach that enables the surgeon to feel fingertip contact deformations and vibrations while guaranteeing the teleoperator's stability. We implemented our cutaneous feedback solution on an Intuitive Surgical da Vinci Standard robot by mounting a SynTouch BioTac tactile sensor to the distal end of a surgical instrument and a custom cutaneous display to the corresponding master controller. As the user probes the remote environment, the contact deformations, dc pressure, and ac pressure (vibrations) sensed by the BioTac are directly mapped to input commands for the cutaneous device's motors using a model-free algorithm based on look-up tables. The cutaneous display continually moves, tilts, and vibrates a flat plate at the operator's fingertip to optimally reproduce the tactile sensations experienced by the BioTac. We tested the proposed approach by having eighteen subjects use the augmented da Vinci robot to palpate a heart model with no haptic feedback, only deformation feedback, and deformation plus vibration feedback. Fingertip deformation feedback significantly improved palpation performance by reducing the task completion time, the pressure exerted on the heart model, and the subject's absolute error in detecting the orientation of the embedded plastic stick. Vibration feedback significantly improved palpation performance only for the seven subjects who dragged the BioTac across the model, rather than pressing straight into it.
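The model-free mapping described above is essentially a set of calibrated look-up tables. A minimal sketch follows; the sensor ranges, table contents, and motor-command ranges are invented for illustration and are not the calibration used in the paper.

    import numpy as np

    # Assumed calibration tables: sensed quantity -> display motor command (ticks).
    dc_pressure_in   = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # sensed DC pressure (a.u.)
    plate_height_out = np.array([0,   40,  90, 160, 255])    # normal-displacement command

    ac_amplitude_in  = np.array([0.0, 0.2, 0.5, 1.0])        # sensed vibration amplitude (a.u.)
    vibration_out    = np.array([0,   60, 150, 255])         # vibration-motor command

    def cutaneous_commands(dc_pressure, ac_amplitude):
        """Map sensed contact signals to display commands by table interpolation."""
        height = np.interp(dc_pressure, dc_pressure_in, plate_height_out)
        vib    = np.interp(ac_amplitude, ac_amplitude_in, vibration_out)
        return int(height), int(vib)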
Haptic feedback for virtual assembly
NASA Astrophysics Data System (ADS)
Luecke, Greg R.; Zafer, Naci
1998-12-01
Assembly operations require high speed and precision at low cost. The manufacturing industry has recently turned attention to the possibility of investigating assembly procedures using graphical displays of CAD parts. For these tasks, some form of feedback to the person is invaluable in providing a real sense of interaction with virtual parts. This research develops the use of a commercial assembly robot as the haptic display in such tasks. For demonstration, a peg-hole insertion task is studied. Kane's method is employed to derive the dynamics of the peg and the contact motions between the peg and the hole. A handle modeled as a cylindrical peg is attached to the end effector of a PUMA 560 robotic arm, which is equipped with a six-axis force/torque transducer. The user grabs the handle and the user-applied forces are recorded. A 300 MHz Pentium computer is used to simulate the dynamics of the virtual peg and its interactions as it is inserted into the virtual hole. Computed torque control is then employed to render the full dynamics of the task to the user's hand. Visual feedback is also incorporated to help the user in the process of inserting the peg into the hole. Experimental results are presented for several contact configurations of this virtually simulated task.
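In simplified form, this style of rendering can be pictured as an admittance loop: the measured user force drives a simulated virtual peg, and the robot is servoed so the handle follows the simulated motion. The single-axis dynamics, gains, and PD tracking law below are assumptions for the sketch and stand in for the full Kane's-method model and computed-torque controller of the paper.

    def simulate_virtual_peg(x, v, f_user, f_contact, m=0.5, dt=0.001):
        """Integrate single-axis virtual-peg dynamics under user and contact forces."""
        a = (f_user + f_contact) / m
        v_new = v + a * dt
        x_new = x + v_new * dt
        return x_new, v_new

    def servo_force(x_robot, v_robot, x_peg, v_peg, kp=2000.0, kd=40.0):
        """PD tracking command so the handle follows the simulated peg motion."""
        return kp * (x_peg - x_robot) + kd * (v_peg - v_robot)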
Using haptic feedback to increase seat belt use of service vehicle drivers.
DOT National Transportation Integrated Search
2011-01-01
This study pilot-tested a new application of a technology-based intervention to increase seat belt use. The technology was based on a contingency in which unbelted drivers experienced sustained haptic feedback to the gas pedal when they exceeded 25...
Shadow-driven 4D haptic visualization.
Zhang, Hui; Hanson, Andrew
2007-01-01
Just as we can work with two-dimensional floor plans to communicate 3D architectural design, we can exploit reduced-dimension shadows to manipulate the higher-dimensional objects generating the shadows. In particular, by taking advantage of physically reactive 3D shadow-space controllers, we can transform the task of interacting with 4D objects to a new level of physical reality. We begin with a teaching tool that uses 2D knot diagrams to manipulate the geometry of 3D mathematical knots via their projections; our unique 2D haptic interface allows the user to become familiar with sketching, editing, exploration, and manipulation of 3D knots rendered as projected images on a 2D shadow space. By combining graphics and collision-sensing haptics, we can enhance the 2D shadow-driven editing protocol to successfully leverage 2D pen-and-paper or blackboard skills. Building on the reduced-dimension 2D editing tool for manipulating 3D shapes, we develop the natural analogy to produce a reduced-dimension 3D tool for manipulating 4D shapes. By physically modeling the correct properties of 4D surfaces, their bending forces, and their collisions in the 3D haptic controller interface, we can support full-featured physical exploration of 4D mathematical objects in a manner that is otherwise far beyond the experience accessible to human beings. As far as we are aware, this paper reports the first interactive system with force-feedback that provides "4D haptic visualization" permitting the user to model and interact with 4D cloth-like objects.
Assisting Movement Training and Execution With Visual and Haptic Feedback.
Ewerton, Marco; Rother, David; Weimar, Jakob; Kollegger, Gerrit; Wiemeyer, Josef; Peters, Jan; Maeda, Guilherme
2018-01-01
In the practice of motor skills in general, errors in the execution of movements may go unnoticed when a human instructor is not available. In this case, a computer system or robotic device able to detect movement errors and propose corrections would be of great help. This paper addresses the problem of how to detect such execution errors and how to provide feedback to the human to correct his/her motor skill using a general, principled methodology based on imitation learning. The core idea is to compare the observed skill with a probabilistic model learned from expert demonstrations. The intensity of the feedback is regulated by the likelihood of the model given the observed skill. Based on demonstrations, our system can, for example, detect errors in the writing of characters with multiple strokes. Moreover, by using a haptic device, the Haption Virtuose 6D, we demonstrate a method to generate haptic feedback based on a distribution over trajectories, which could be used as an auxiliary means of communication between an instructor and an apprentice. Additionally, given a performance measurement, the haptic device can help the human discover and perform better movements to solve a given task. In this case, the human first tries a few times to solve the task without assistance. Our framework, in turn, uses a reinforcement learning algorithm to compute haptic feedback, which guides the human toward better solutions.
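As a rough sketch of regulating feedback intensity by how well an observed movement matches a model learned from demonstrations: a per-time-step Gaussian model and a deviation-based gain are assumed here as a stand-in for the paper's richer probabilistic trajectory model and likelihood-based regulation.

    import numpy as np

    def fit_demo_model(demos):
        """demos: array (n_demos, n_steps). Fit a per-step Gaussian from expert data."""
        mean = demos.mean(axis=0)
        std = demos.std(axis=0) + 1e-6
        return mean, std

    def feedback_gain(observation, mean, std, max_gain=1.0):
        """Scale corrective feedback by how far the observed trajectory deviates."""
        z = np.abs(observation - mean) / std             # deviation in standard-deviation units
        return np.clip(z / 3.0, 0.0, 1.0) * max_gain     # saturate at roughly 3 sigma

    demos = np.random.normal(0.0, 0.1, size=(20, 100))   # toy expert demonstrations
    mean, std = fit_demo_model(demos)
    gains = feedback_gain(np.zeros(100) + 0.25, mean, std)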
Heterogeneous Deformable Modeling of Bio-Tissues and Haptic Force Rendering for Bio-Object Modeling
NASA Astrophysics Data System (ADS)
Lin, Shiyong; Lee, Yuan-Shin; Narayan, Roger J.
This paper presents a novel technique for modeling soft biological tissues as well as the development of an innovative interface for bio-manufacturing and medical applications. Heterogeneous deformable models may be used to represent the actual internal structures of deformable biological objects, which possess multiple components and nonuniform material properties. Both heterogeneous deformable object modeling and accurate haptic rendering can greatly enhance the realism and fidelity of virtual reality environments. In this paper, a tri-ray node snapping algorithm is proposed to generate a volumetric heterogeneous deformable model from a set of object interface surfaces between different materials. A constrained local static integration method is presented for simulating deformation and accurate force feedback based on the material properties of a heterogeneous structure. Biological soft tissue modeling is used as an example to demonstrate the proposed techniques. By integrating the heterogeneous deformable model into a virtual environment, users can both observe different materials inside a deformable object as well as interact with it by touching the deformable object using a haptic device. The presented techniques can be used for surgical simulation, bio-product design, bio-manufacturing, and medical applications.
Gal, Gilad Ben; Weiss, Ervin I; Gafni, Naomi; Ziv, Amitai
2011-04-01
Virtual reality force feedback simulators provide a haptic (sense of touch) feedback through the device being held by the user. The simulator's goal is to provide a learning experience resembling reality. A newly developed haptic simulator (IDEA Dental, Las Vegas, NV, USA) was assessed in this study. Our objectives were to assess the simulator's ability to serve as a tool for dental instruction, self-practice, and student evaluation, as well as to evaluate the sensation it provides. A total of thirty-three evaluators were divided into two groups. The first group consisted of twenty-one experienced dental educators; the second consisted of twelve fifth-year dental students. Each participant performed drilling tasks using the simulator and filled out a questionnaire regarding the simulator and potential ways of using it in dental education. The results show that experienced dental faculty members as well as advanced dental students found that the simulator could provide significant potential benefits in the teaching and self-learning of manual dental skills. Development of the simulator's tactile sensation is needed to attune it to genuine sensation. Further studies relating to aspects of the simulator's structure and its predictive validity, its scoring system, and the nature of the performed tasks should be conducted.
Takahashi, Chie; Watt, Simon J.
2014-01-01
When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the “weight” given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different “gains” between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modeled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimizing the design of visual-haptic devices. PMID:24592245
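For reference, the "statistically optimal fashion" referred to above is the standard maximum-likelihood cue-combination rule, in which each cue is weighted by its relative reliability; this is background rather than the paper's contribution, and the tool-gain dependence of the haptic variance is the specific question the study tests.

    \[
    \hat{S}_{VH} = w_V\,\hat{S}_V + w_H\,\hat{S}_H, \qquad
    w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_H^{2}}, \qquad
    w_H = 1 - w_V, \qquad
    \sigma_{VH}^{2} = \frac{\sigma_V^{2}\,\sigma_H^{2}}{\sigma_V^{2} + \sigma_H^{2}}
    \]

Here \(\sigma_V^2\) and \(\sigma_H^2\) are the variances of the visual and haptic size estimates; if a tool with a different gain changes \(\sigma_H^2\), an optimal integrator should shift the weights accordingly.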
Husman, M A B; Maqbool, H F; Awad, M I; Abouhossein, A; Dehghani-Sanij, A A
2016-08-01
Haptic feedback to lower limb amputees is essential to maximize the functionality of a prosthetic device by providing information to the user about the interaction with the environment and the position of the prosthesis in space. The severed sensory pathway and the absence of a connection between the prosthesis and the central nervous system (CNS) after lower limb amputation reduce balance control, increase visual dependency and increase the risk of falls among amputees. This work describes the design of a wearable haptic feedback device for lower limb amputees using a lateral skin-stretch modality intended to serve as a feedback cue during ambulation. A feedback scheme was proposed based on gait event detection for possible real-time postural adjustment. A preliminary perceptual test with healthy subjects in a static condition was carried out, and the results indicated over 98% accuracy in determining stimulus location around the upper leg region, suggesting good perceptibility of the delivered stimuli.
Barmpoutis, Angelos; Alzate, Jose; Beekhuizen, Samantha; Delgado, Horacio; Donaldson, Preston; Hall, Andrew; Lago, Charlie; Vidal, Kevin; Fox, Emily J
2016-01-01
In this paper a prototype system is presented for home-based physical tele-therapy using a wearable device for haptic feedback. The haptic feedback is generated as a sequence of vibratory cues from 8 vibrator motors equally spaced along an elastic wearable band. The motors guide the patients' movement as they perform a prescribed exercise routine in a way that replaces the physical therapists' haptic guidance in an unsupervised or remotely supervised home-based therapy session. A pilot study of 25 human subjects was performed that focused on: a) testing the capability of the system to guide the users in arbitrary motion paths in the space and b) comparing the motion of the users during typical physical therapy exercises with and without haptic-based guidance. The results demonstrate the efficacy of the proposed system.
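A minimal sketch of how a guidance cue might be routed to one of eight equally spaced vibration motors on such a band follows; the motor indexing, zero-angle reference, and cueing threshold are assumptions, and the paper's actual cueing scheme may differ.

    def select_motor(direction_deg, magnitude, n_motors=8, threshold=0.01):
        """Pick the vibration motor closest to the direction of the needed correction.

        direction_deg: direction of the corrective cue around the band, 0-360 degrees,
                       measured from an assumed zero reference on the band.
        magnitude:     size of the movement error (arbitrary units).
        Returns a motor index in [0, n_motors), or None below the cueing threshold.
        """
        if magnitude < threshold:
            return None
        spacing = 360.0 / n_motors                       # 45 degrees between motors here
        return int(round((direction_deg % 360.0) / spacing)) % n_motors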
Dynamics modeling for parallel haptic interfaces with force sensing and control.
Bernstein, Nicholas; Lawrence, Dale; Pao, Lucy
2013-01-01
Closed-loop force control can be used on haptic interfaces (HIs) to mitigate the effects of mechanism dynamics. A single multidimensional force-torque sensor is often employed to measure the interaction force between the haptic device and the user's hand. The parallel haptic interface at the University of Colorado (CU) instead employs smaller 1D force sensors oriented along each of the five actuating rods to build up a 5D force vector. This paper shows that a particular manipulandum/hand partition in the system dynamics is induced by the placement and type of force sensing, and discusses the implications on force and impedance control for parallel haptic interfaces. The details of a "squaring down" process are also discussed, showing how to obtain reduced degree-of-freedom models from the general six degree-of-freedom dynamics formulation.
Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.
Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T
2015-03-01
With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.
Three-axis force sensor with fiber Bragg grating.
Hyundo Choi; Yoan Lim; Junhyung Kim
2017-07-01
Haptic feedback is critical for many surgical tasks, as it replicates force reflections at the surgical site. To meet the force reflection requirements, we propose a force sensor based on optical fiber Bragg gratings (FBGs) for robotic surgery. The force sensor can calculate the three directional forces on an instrument from the strain of three FBGs, even under electromagnetic interference. A flexible ring-shaped structure connects the instrument tip and the fiber strain gauges to sense the three directional force components, and a stopper mechanism is added to the structure to avoid plastic deformation under unexpectedly large forces on the instrument tip. The proposed sensor is experimentally verified to have a sensing range from -12 N to 12 N, and its sensitivity was less than 0.06 N.
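In the simplest case, computing a three-axis force vector from three FBG strain readings is a linear calibration, F = C * Δλ, with C identified by least squares from known loads. The sketch below assumes such a linear model; the actual sensor structure may call for something more elaborate.

    import numpy as np

    def calibrate(delta_lambda, forces):
        """Least-squares fit of a 3x3 calibration matrix C such that F ~= C @ dL.

        delta_lambda: (n_samples, 3) wavelength shifts; forces: (n_samples, 3) known loads.
        """
        X, *_ = np.linalg.lstsq(delta_lambda, forces, rcond=None)
        return X.T                        # so that force = C @ dL for a single reading

    def force_from_fbg(C, dL):
        """Estimate the 3-axis force vector from one set of wavelength shifts."""
        return C @ np.asarray(dL)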
Coincidence avoidance principle in surface haptic interpretation
Manuel, Steven G.; Klatzky, Roberta L.; Peshkin, Michael A.; Colgate, James Edward
2015-01-01
When multiple fingertips experience force sensations, how does the brain interpret the combined sensation? In particular, under what conditions are the sensations perceived as separate or, alternatively, as an integrated whole? In this work, we used a custom force-feedback device to display force signals to two fingertips (index finger and thumb) as they traveled along collinear paths. Each finger experienced a pattern of forces that, taken individually, produced illusory virtual bumps, and subjects reported whether they felt zero, one, or two bumps. We varied the spatial separation between these bump-like force-feedback regions, from being much greater than the finger span to nearly exactly the finger span. When the bump spacing was the same as the finger span, subjects tended to report only one bump. We found that the results are consistent with a quantitative model of perception in which the brain selects a structural interpretation of force signals that relies on minimizing coincidence stemming from accidental alignments between fingertips and inferred surface structures. PMID:25675477
Magnetorheological fluid based automotive steer-by-wire systems
NASA Astrophysics Data System (ADS)
Ahmadkhanlou, Farzad; Washington, Gregory N.; Bechtel, Stephen E.; Wang, Yingru
2006-03-01
This paper presents the design of a magnetorheological (MR) fluid based damper for steer-by-wire systems to provide sensory feedback to the driver. The advantages of using MR fluids in haptic devices stem from the increase in transparency gained from the lightweight semi-active system and controller implementation. The performance of an MR fluid based steer-by-wire system depends on the MR fluid model and specifications, the MR damper geometry, and the control algorithm. All of these factors are addressed in this study. The experimental results show the improvements achieved in steer-by-wire by adding force feedback to the system.
Characteristic analysis and simulation for polysilicon comb micro-accelerometer
NASA Astrophysics Data System (ADS)
Liu, Fengli; Hao, Yongping
2008-10-01
A high force update rate is a key factor for achieving high-performance haptic rendering, which imposes a stringent real-time requirement on the execution environment of the haptic system. This requirement confines the haptic system to simplified environments in order to reduce the computational cost of haptic rendering algorithms. In this paper, we present a novel "hyper-threading" architecture consisting of several threads for haptic rendering. A high force update rate is achieved with a relatively large computation time interval for each haptic loop. The proposed method was tested and proved effective in experiments on a virtual-wall prototype haptic system using the Delta Haptic Device.
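The idea of decoupling a fast force-update loop from a slower simulation loop can be sketched generically as below; the rates, shared state, and placeholder device calls are assumptions for illustration, and a production haptic loop would not be written in Python.

    import threading, time

    latest_force = [0.0]                  # shared state: most recently computed force
    lock = threading.Lock()

    def compute_environment_force():      # placeholder for an expensive simulation step
        return 1.0

    def send_force_to_device(f):          # placeholder for the haptic device API call
        pass

    def simulation_thread():              # slow loop: heavy collision/deformation work
        while True:
            f = compute_environment_force()
            with lock:
                latest_force[0] = f
            time.sleep(0.01)              # ~100 Hz

    def haptic_thread():                  # fast loop: simply replays the latest force
        while True:
            with lock:
                f = latest_force[0]
            send_force_to_device(f)
            time.sleep(0.001)             # ~1 kHz

    # threading.Thread(target=simulation_thread, daemon=True).start()
    # threading.Thread(target=haptic_thread, daemon=True).start()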
Cappa, Paolo; Clerico, Andrea; Nov, Oded; Porfiri, Maurizio
2013-01-01
In this paper, we demonstrate that healthy adults respond differentially to the administration of force feedback and the presentation of scientific content in a virtual environment, where they interact with a low-cost haptic device. Subjects are tasked with controlling the movement of a cursor on a predefined trajectory that is superimposed on a map of New York City’s Bronx Zoo. The system is characterized in terms of a suite of objective indices quantifying the subjects’ dexterity in planning and generating the multijoint visuomotor tasks. We find that force feedback regulates the smoothness, accuracy, and duration of the subject’s movement, whereby converging or diverging force fields influence the range of variations of the hand speed. Finally, our findings provide preliminary evidence that using educational content increases subjects’ satisfaction. Improving the level of interest through the inclusion of learning elements can increase the time spent performing rehabilitation tasks and promote learning in a new context. PMID:24349562
Multimodal Interaction with Speech, Gestures and Haptic Feedback in a Media Center Application
NASA Astrophysics Data System (ADS)
Turunen, Markku; Hakulinen, Jaakko; Hella, Juho; Rajaniemi, Juha-Pekka; Melto, Aleksi; Mäkinen, Erno; Rantala, Jussi; Heimonen, Tomi; Laivo, Tuuli; Soronen, Hannu; Hansen, Mervi; Valkama, Pellervo; Miettinen, Toni; Raisamo, Roope
We demonstrate interaction with a multimodal media center application. The mobile phone-based interface includes speech and gesture input and haptic feedback. The setup resembles our long-term public pilot study, in which a living room environment containing the application was constructed inside a local media museum, allowing visitors to freely test the system.
Haptic seat for fuel economy feedback
Bobbitt, III, John Thomas
2016-08-30
A process of providing driver fuel-economy feedback is disclosed in which vehicle sensors provide haptic feedback on fuel usage. Such sensors may include one or more of speed sensors, global positioning satellite units, vehicle pitch/roll angle sensors, suspension displacement sensors, longitudinal accelerometer sensors, throttle position sensors, steering angle sensors, brake pressure sensors, and lateral accelerometer sensors. Sensors used singly or collectively can provide enhanced feedback on various environmental and operating conditions, such that a more accurate assessment of fuel economy information can be provided to the driver.
Teleoperated master-slave needle insertion.
Abolhassani, Niki; Patel, Rajni V
2009-12-01
Accuracy of needle tip placement and needle tracking in soft tissue are of particular importance in many medical procedures. In recent years, developing autonomous and teleoperated systems for needle insertion has become an active area of research. In this study, needle insertion was performed using a master-slave set-up with multi-degrees of freedom. The effect of force feedback on the accuracy of needle insertion was investigated. In addition, this study compared autonomous, teleoperated and semi-autonomous needle insertion. The results of this study show that incorporation of force feedback can improve teleoperated needle insertion. However, autonomous and semi-autonomous needle insertions, which use feedback from a deflection model, provide significantly better performance. Development of a haptic master-slave needle insertion system, which is capable of performing some autonomous tasks based on feedback from tissue deformation and needle deflection models, can improve the performance of autonomous robotics-based insertions as well as non-autonomous teleoperated manual insertions. Copyright (c) 2009 John Wiley & Sons, Ltd.
Shang, Weijian; Su, Hao; Li, Gang; Fischer, Gregory S.
2014-01-01
This paper presents a surgical master-slave tele-operation system for percutaneous interventional procedures under continuous magnetic resonance imaging (MRI) guidance. This system consists of a piezoelectrically actuated slave robot for needle placement with integrated fiber optic force sensor utilizing Fabry-Perot interferometry (FPI) sensing principle. The sensor flexure is optimized and embedded to the slave robot for measuring needle insertion force. A novel, compact opto-mechanical FPI sensor interface is integrated into an MRI robot control system. By leveraging the complementary features of pneumatic and piezoelectric actuation, a pneumatically actuated haptic master robot is also developed to render force associated with needle placement interventions to the clinician. An aluminum load cell is implemented and calibrated to close the impedance control loop of the master robot. A force-position control algorithm is developed to control the hybrid actuated system. Teleoperated needle insertion is demonstrated under live MR imaging, where the slave robot resides in the scanner bore and the user manipulates the master beside the patient outside the bore. Force and position tracking results of the master-slave robot are demonstrated to validate the tracking performance of the integrated system. It has a position tracking error of 0.318mm and sine wave force tracking error of 2.227N. PMID:25126446
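The force-position architecture summarized above can be illustrated with a toy control cycle: the slave tracks the scaled master position while the measured needle force is reflected back to the master. This is a hedged sketch only; the PD law, gains, and scaling are illustrative assumptions, not the authors' hybrid pneumatic/piezoelectric controller.

```python
def teleop_step(x_master, x_slave, v_slave, f_needle, kp=800.0, kd=10.0, scale=1.0):
    """One cycle of a generic force-position teleoperation scheme: the slave
    tracks the (scaled) master position with a PD law, and the measured needle
    insertion force is reflected back to the master. Gains and scaling are
    illustrative placeholders, not the tuned values of the hybrid system."""
    x_desired = scale * x_master
    slave_cmd = kp * (x_desired - x_slave) - kd * v_slave   # command to slave actuator
    master_force_cmd = f_needle / scale                     # force rendered at the master
    return slave_cmd, master_force_cmd

# example cycle: master at 10 mm, slave lagging at 8 mm, 1.5 N sensed at the needle
slave_cmd, master_force = teleop_step(0.010, 0.008, v_slave=0.02, f_needle=1.5)
```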
Performance of a novel micro force vector sensor and outlook into its biomedical applications
NASA Astrophysics Data System (ADS)
Meiss, Thorsten; Rossner, Tim; Minamisava Faria, Carlos; Völlmeke, Stefan; Opitz, Thomas; Werthschützky, Roland
2011-05-01
For the HapCath system, which provides haptic feedback of the forces acting on a guide wire's tip during vascular catheterization, very small piezoresistive force sensors of 200 × 200 × 640 μm³ have been developed. This paper focuses on the characterization of the measurement performance and on possible new applications. Besides the determination of the dynamic measurement performance, special focus is put on the results of the three-component force vector calibration. The article discusses the particularly advantageous characteristics of the sensor as well as the limits of its applicability. Building on these characteristics, the second part of the article demonstrates new applications opened up by the novel force sensor, such as automatic navigation of medical or biological instruments without impacting surrounding tissue, surface roughness evaluation in biomedical systems, needle insertion with tactile or higher-level feedback, or even building tactile hairs for artificial organisms.
Force modeling for incision surgery into tissue with haptic application
NASA Astrophysics Data System (ADS)
Kim, Pyunghwa; Kim, Soomin; Choi, Seung-Hyun; Oh, Jong-Seok; Choi, Seung-Bok
2015-04-01
This paper presents a novel force model for incision surgery into tissue and its haptic application for the surgeon. In robot-assisted incision surgery, a haptic system that conveys the sense of touch in the surgical area is urgently needed because surgeons cannot directly feel the tissue. To achieve this goal, a model of the reaction force of biological tissue is proposed from an energy perspective. The model describes the reaction force arising from the elastic behavior of the tissue during the incision. The force computed from the model is then reproduced by a haptic device based on magnetorheological fluid (MRF). Finally, the performance of the reproduced force, regulated by a PID controller together with open-loop control, is evaluated.
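To make the control idea concrete, the sketch below tracks a desired reaction force, computed from a placeholder elastic model, with a discrete PID loop. The gains, the exponent of the elastic model, and the crude plant update are assumptions for illustration, not the identified parameters of the MR device.

```python
def elastic_reaction_force(depth, k=120.0, n=1.5):
    """Placeholder elastic reaction-force model f = k * depth**n (assumed form,
    standing in for the paper's energy-based tissue model)."""
    return k * max(depth, 0.0) ** n

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, target, measured):
        err = target - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID(kp=2.0, ki=5.0, kd=0.01, dt=0.001)
measured_force = 0.0
for step in range(5):
    target = elastic_reaction_force(depth=0.002 * step)   # desired force at this cutting depth
    command = pid.step(target, measured_force)            # drive signal to the MRF actuator
    measured_force += 0.5 * command                       # crude stand-in for the device response
```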
Haptic control with environment force estimation for telesurgery.
Bhattacharjee, Tapomayukh; Son, Hyoung Il; Lee, Doo Yong
2008-01-01
The success of telesurgical operations depends on the position-tracking ability of the slave device: improved tracking can lead to safer and less strenuous procedures. The two-channel force-position control architecture is widely used to achieve better position tracking, but it requires force sensors for direct force feedback. Force sensors may not be a good choice in the telesurgical environment because of their inherent noise and the limitations on where they can be deployed. Hence, environment force estimation is developed using the concept of the robot function parameter matrix and a recursive least squares method. Simulation results show the efficacy of the proposed method: the slave device successfully tracks the position of the master device, and the estimation error quickly becomes negligible.
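A recursive least squares update of the kind referenced above can be written compactly. The sketch below estimates a simple stiffness-plus-damping environment model from position, velocity, and force samples; the regressor used in the paper is built from the robot function parameter matrix, so the parameterization and values here are illustrative assumptions.

```python
import numpy as np

class RLSForceEstimator:
    """Generic recursive least squares estimator for an environment model
    f = k*x + b*v (stiffness plus damping). A sketch only; the paper instead
    builds its regressor from the robot function parameter matrix."""

    def __init__(self, n_params=2, forgetting=0.98):
        self.theta = np.zeros(n_params)        # parameter estimates, here [k, b]
        self.P = np.eye(n_params) * 1e3        # covariance of the estimates
        self.lam = forgetting

    def update(self, phi, f_measured):
        phi = np.asarray(phi, dtype=float)
        err = f_measured - phi @ self.theta
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta = self.theta + gain * err
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return self.theta, err

# feed slave position, velocity, and measured contact force each control cycle
est = RLSForceEstimator()
for x, v, f in [(0.001, 0.010, 0.12), (0.002, 0.012, 0.25)]:
    theta, err = est.update([x, v], f)
```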
Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.
Danion, Frederic; Mathew, James; Flanagan, J Randall
2017-01-01
Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.
Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback
Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.
2014-01-01
Background With the decrease in the number of cerebral aneurysms treated surgically and the increase in the complexity of those that are treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training, especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200
Lack of transfer of skills after virtual reality simulator training with haptic feedback.
Våpenstad, Cecilie; Hofstad, Erlend Fagertun; Bø, Lars Eirik; Kuhry, Esther; Johnsen, Gjermund; Mårvik, Ronald; Langø, Thomas; Hernes, Toril Nagelhus
2017-12-01
Virtual reality (VR) simulators enrich surgical training and offer training possibilities outside of the operating room (OR). In this study, we created a criterion-based training program on a VR simulator with haptic feedback and tested it by comparing the performances of a simulator group against a control group. Medical students with no experience in laparoscopy were randomly assigned to a simulator group or a control group. In the simulator group, the candidates trained until they reached predefined criteria on the LapSim ® VR simulator (Surgical Science AB, Göteborg, Sweden) with haptic feedback (Xitact TM IHP, Mentice AB, Göteborg, Sweden). All candidates performed a cholecystectomy on a porcine organ model in a box trainer (the clinical setting). The performances were video rated by two surgeons blinded to subject training status. In total, 30 students performed the cholecystectomy and had their videos rated (N = 16 simulator group, N = 14 control group). The control group achieved better video rating scores than the simulator group (p < .05). The criterion-based training program did not transfer skills to the clinical setting. Poor mechanical performance of the simulated haptic feedback is believed to have resulted in a negative training effect.
NASA Astrophysics Data System (ADS)
Erickson, David; Lacheray, Hervé; Lambert, Jason Michel; Mantegh, Iraj; Crymble, Derry; Daly, John; Zhao, Yan
2012-06-01
State-of-the-art explosive ordnance disposal robots have not, in general, adopted recent advances in control technology and man-machine interfaces, and they lag many years behind academia. This paper describes the Haptics-based Immersive Telerobotic System project, which investigates an immersive telepresence environment incorporating advanced vehicle control systems, augmented immersive sensory feedback, dynamic 3D visual information, and haptic feedback for explosive ordnance disposal operators. The project aim is to provide operators with a more sophisticated interface and expanded sensory input so that complex tasks to defeat improvised explosive devices can be performed successfully. The introduction of haptics and immersive telepresence has the potential to shift the way telepresence systems work for explosive ordnance disposal tasks or, more widely, for first-responder scenarios involving remote unmanned ground vehicles.
Note: Hybrid active/passive force feedback actuator using hydrostatic transmission.
Park, Yea-Seok; Lee, Juwon; Kim, Kyung-Soo; Kim, Soohyun
2017-12-01
A hybrid actuator for haptic devices is proposed in this paper. The actuator is composed of a DC motor and a magneto-rheological (MR) brake to realize transparency and stable force control. Two piston cylinders are connected with a flexible tube to lighten the weight of the structures on the endpoint that interacts with an operator. Also, the MR brake is designed to be suitable for hydraulic transmission. For the proposed hybrid actuator, a cooperative force control method using a pressure sensor instead of a force sensor is proposed. To verify the proposed control algorithm, a virtual wall collision experiment was conducted using a developed prototype of the hybrid actuator.
[Visual cuing effect for haptic angle judgment].
Era, Ataru; Yokosawa, Kazuhiko
2009-08-01
We investigated whether visual cues are useful for judging haptic angles. Participants explored three-dimensional angles with a virtual haptic feedback device. Two visual cues were used: a location cue, synchronized with the haptic exploration, and a space cue, which specified the haptic space. In Experiment 1, angles were judged more correctly with both cues present, but were overestimated with the location cue alone. In Experiment 2, the visual cues emphasized depth; overestimation with location cues again occurred, while space cues had no influence. The results showed that (a) when both cues are presented, haptic angles are judged more correctly, (b) location cues convey only motion information, not depth information, and (c) haptic angles tend to be overestimated when both haptic and visual information are available.
Investigating Students' Ideas about Buoyancy and the Influence of Haptic Feedback
ERIC Educational Resources Information Center
Minogue, James; Borland, David
2016-01-01
While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of…
Modeling and Compensation of the Internal Friction Torque of a Travelling Wave Ultrasonic Motor.
Giraud, F; Sandulescu, P; Amberg, M; Lemaire-Semail, B; Ionescu, F
2011-01-01
This paper deals with the control and experimentation of a one-degree-of-freedom haptic stick, actuated by a travelling wave ultrasonic motor. This type of actuator has many interesting properties such as low-speed operation capabilities and a high torque-to-weight ratio, making it appropriate for haptic applications. However, the motor used in this application displays nonlinear behavior due to the necessary contact between its rotor and stator. Moreover, due to its energy conversion process, the torque applied to the end-effector is not a straightforward function of the supply current or voltage. This is why a force-feedback control strategy is presented, which includes an online parameter estimator. Experimental runs are then presented to examine the fidelity of the interface.
Vibrotactile Compliance Feedback for Tangential Force Interaction.
Heo, Seongkook; Lee, Geehyuk
2017-01-01
This paper presents a method to generate a haptic illusion of compliance using a vibrotactile actuator when a tangential force is applied to a rigid surface. The novel method builds on a conceptual compliance model where a physical object moves on a textured surface in response to a tangential force. The method plays vibration patterns simulating friction-induced vibrations as an applied tangential force changes. We built a prototype consisting of a two-dimensional tangential force sensor and a surface transducer to test the effectiveness of the model. Participants in user experiments with the prototype perceived the rigid surface of the prototype as a moving, rubber-like plate. The main findings of the experiments are: 1) the perceived stiffness of a simulated material can be controlled by controlling the force-playback transfer function, 2) its perceptual properties such as softness and pleasantness can be controlled by changing friction grain parameters, and 3) the use of the vibrotactile compliance feedback reduces participants' workload including physical demand and frustration while performing a force repetition task.
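The rendering idea can be sketched as follows: the applied tangential force is converted to a virtual displacement through an assumed compliance, and a short friction-like vibration grain is emitted for each increment of simulated travel. Grain frequency, decay, spacing, and the force-to-amplitude mapping are placeholder values, not the identified parameters of the prototype.

```python
import numpy as np

def friction_grain(freq_hz=180.0, decay=60.0, amp=1.0, fs=8000, dur=0.02):
    """One short decaying-sinusoid 'grain' imitating a friction-induced vibration.
    Frequency, decay, and duration are illustrative placeholders."""
    t = np.arange(int(fs * dur)) / fs
    return amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq_hz * t)

def grains_for_force(f_tangential, stiffness=2000.0, grain_spacing=0.0005):
    """Map an applied tangential force to a virtual displacement through an
    assumed compliance (x = f / k) and emit one grain per grain_spacing of
    simulated travel; a softer transfer function yields more grains."""
    x = f_tangential / stiffness
    n_grains = int(x / grain_spacing)
    amp = min(1.0, f_tangential / 5.0)
    return [friction_grain(amp=amp) for _ in range(n_grains)]

bursts = grains_for_force(f_tangential=3.0)   # vibration bursts sent to the surface transducer
```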
Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.
Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea
2017-09-29
A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
Faster acquisition of laparoscopic skills in virtual reality with haptic feedback and 3D vision.
Hagelsteen, Kristine; Langegård, Anders; Lantz, Adam; Ekelund, Mikael; Anderberg, Magnus; Bergenfelz, Anders
2017-10-01
The study investigated whether 3D vision and haptic feedback in combination in a virtual reality environment leads to more efficient learning of laparoscopic skills in novices. Twenty novices were allocated to two groups. All completed a training course in the LapSim ® virtual reality trainer consisting of four tasks: 'instrument navigation', 'grasping', 'fine dissection' and 'suturing'. The study group performed with haptic feedback and 3D vision and the control group without. Before and after the LapSim ® course, the participants' metrics were recorded when tying a laparoscopic knot in the 2D video box trainer Simball ® Box. The study group completed the training course in 146 (100-291) minutes compared to 215 (175-489) minutes in the control group (p = .002). The number of attempts to reach proficiency was significantly lower. The study group had significantly faster learning of skills in three out of four individual tasks: instrument navigation, grasping and suturing. Using the Simball ® Box, no difference in laparoscopic knot tying after the LapSim ® course was noted when comparing the groups. Laparoscopic training in virtual reality with 3D vision and haptic feedback made training more time efficient and did not negatively affect later video box-performance in 2D.
Design of a 7-DOF slave robot integrated with a magneto-rheological haptic master
NASA Astrophysics Data System (ADS)
Hwang, Yong-Hoon; Cha, Seung-Woo; Kang, Seok-Rae; Choi, Seung-Bok
2017-04-01
In this study, a 7-DOF slave robot integrated with a haptic master is designed and its dynamic motion is controlled. The haptic master is built around a controllable magneto-rheological (MR) clutch and brake and provides the surgeon with a sense of touch using both kinetic and kinesthetic information. Owing to the size constraints of the slave robot, the 3-DOF end-effector is driven by a wire actuation mechanism rather than by conventional direct-drive motors, while the 4-DOF link section uses direct-drive motors. For the overall system to work as a haptic device, the haptic master must receive information about the repulsive forces applied to the slave robot. Repulsive forces at the end-effector are therefore sensed by three uniaxial torque transducers inserted in the wire actuation system, and repulsive forces applied to the link section are sensed by a 6-axis transducer that measures both forces and torques. A second 6-axis transducer at the distal end of the slave robot is used to verify the reliability of the force information. Finally, with the MR haptic master integrated, a psychophysical test is conducted in which different operators feel the repulsive force or torque generated by the haptic master, equivalent to the force or torque occurring at the end-effector, demonstrating the effectiveness of the proposed system.
The simulation of the half-dry stroke based on the force feedback technology
NASA Astrophysics Data System (ADS)
Guo, Chao; Hou, Zeng-xuan; Zheng, Shuan-zhu; Yang, Guang-qing
2017-02-01
A novel stroke simulation method for the half-dry style of Chinese calligraphy, based on force feedback technology, is proposed for virtual painting. First, the brush footprint between the brush and the paper is calculated from the deformation of the brush under the applied force. The complete brush stroke is obtained by superimposing brush footprints along the painting direction, enabling dynamic rendering of the stroke. We then establish half-dry texture databases and propose the concept of a half-dry value by studying the main factors that affect the appearance of the half-dry stroke. During virtual painting, the half-dry texture is mapped onto the stroke in real time according to the half-dry value and the painting technique, and a texture blending technique based on the KM model is applied to avoid seams during texture mapping. The proposed method has been successfully applied in a virtual painting system based on force feedback technology, in which users paint in real time with a Phantom Desktop haptic device, effectively enhancing the sense of realism.
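The footprint-superposition step can be illustrated with a minimal raster sketch in which each footprint is approximated by a filled disc whose size grows with the applied force. This is a hedged simplification: the real system derives the footprint from the modeled brush deformation and adds half-dry texture mapping on top, which is omitted here, and all numeric values are placeholders.

```python
import numpy as np

def stamp_stroke(canvas, path, forces, k_radius=6.0):
    """Build a stroke by superimposing brush footprints along the painting path.
    Each footprint is approximated by a filled disc whose radius grows with the
    applied force (a stand-in for the deformation-based footprint)."""
    h, w = canvas.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for (cx, cy), f in zip(path, forces):
        r = 1.0 + k_radius * f
        mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
        canvas[mask] = np.maximum(canvas[mask], min(1.0, f))   # simple ink deposit
    return canvas

canvas = np.zeros((64, 256))
path = [(10 + 3 * i, 32) for i in range(60)]                     # sampled pen positions
forces = [0.3 + 0.5 * np.sin(i / 10.0) ** 2 for i in range(60)]  # sampled pen forces
stroke = stamp_stroke(canvas, path, forces)
```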
A Model for Steering with Haptic-Force Guidance
NASA Astrophysics Data System (ADS)
Yang, Xing-Dong; Irani, Pourang; Boulanger, Pierre; Bischof, Walter F.
Trajectory-based tasks are common in many applications and have been widely studied. Recently, researchers have shown that even very simple tasks, such as selecting items from cascading menus, can benefit from haptic-force guidance. Haptic guidance is also of significant value in many applications such as medical training, handwriting learning, and in applications requiring precise manipulations. There are, however, only very few guiding principles for selecting parameters that are best suited for proper force guiding. In this paper, we present a model, derived from the steering law that relates movement time to the essential components of a tunneling task in the presence of haptic-force guidance. Results of an experiment show that our model is highly accurate for predicting performance times in force-enhanced tunneling tasks.
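For orientation, the sketch below computes the classic steering-law index of difficulty for a segmented tunnel and a movement-time prediction with an added guidance-dependent term. The form and coefficient of that guidance term are assumptions for illustration only; the paper derives and fits its own extended model.

```python
import numpy as np

def steering_id(segment_lengths, widths):
    """Steering-law index of difficulty for a tunnel approximated by segments:
    ID = sum(ds / W(s))."""
    return float(np.sum(np.asarray(segment_lengths) / np.asarray(widths)))

def movement_time(ID, a=0.2, b=0.08, c=0.0, guidance_force=0.0):
    """Baseline steering law MT = a + b*ID, extended here with an assumed
    guidance-dependent reduction c * guidance_force * ID (illustrative form)."""
    return a + b * ID - c * guidance_force * ID

ID = steering_id(segment_lengths=[0.05] * 20, widths=[0.01] * 20)  # 1 m tunnel, 1 cm wide
print(movement_time(ID, c=0.005, guidance_force=2.0))
```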
Saliba, Christopher M; Clouthier, Allison L; Brandon, Scott C E; Rainbow, Michael J; Deluzio, Kevin J
2018-05-29
Abnormal loading of the knee joint contributes to the pathogenesis of knee osteoarthritis. Gait retraining is a non-invasive intervention that aims to reduce knee loads by providing audible, visual, or haptic feedback of gait parameters. The computational expense of joint contact force prediction has limited real-time feedback to surrogate measures of the contact force, such as the knee adduction moment. We developed a method to predict knee joint contact forces using motion analysis and a statistical regression model that can be implemented in near real-time. Gait waveform variables were deconstructed using principal component analysis and a linear regression was used to predict the principal component scores of the contact force waveforms. Knee joint contact force waveforms were reconstructed using the predicted scores. We tested our method using a heterogeneous population of asymptomatic controls and subjects with knee osteoarthritis. The reconstructed contact force waveforms had mean (SD) RMS differences of 0.17 (0.05) bodyweight compared to the contact forces predicted by a musculoskeletal model. Our method successfully predicted subject-specific shape features of contact force waveforms and is a potentially powerful tool in biofeedback and clinical gait analysis.
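A minimal sketch of the described pipeline, deconstructing waveforms with principal component analysis, regressing contact-force scores on gait scores, and reconstructing the predicted waveform, is given below. The array shapes, component counts, and the use of scikit-learn are assumptions rather than the authors' implementation, and the data are random placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# X: gait waveform variables per trial (flattened), Y: knee contact force waveforms
# from a musculoskeletal model; random placeholders stand in for real data here
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 303))
Y = rng.normal(size=(60, 101))

pca_x, pca_y = PCA(n_components=10), PCA(n_components=5)
Xs = pca_x.fit_transform(X)                   # gait principal component scores
Ys = pca_y.fit_transform(Y)                   # contact-force principal component scores

reg = LinearRegression().fit(Xs, Ys)          # regression from gait scores to force scores

# near real-time prediction for a new trial: project, predict scores, reconstruct waveform
x_new = X[:1]
y_scores = reg.predict(pca_x.transform(x_new))
y_waveform = pca_y.inverse_transform(y_scores)
```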
NASA Astrophysics Data System (ADS)
Fenz, Wolfgang; Dirnberger, Johannes
2015-03-01
Providing suitable training for aspiring neurosurgeons is becoming more and more problematic. The increasing popularity of the endovascular treatment of intracranial aneurysms leads to a lack of simple surgical situations for clipping operations, leaving mainly the complex cases, which present even experienced surgeons with a challenge. To alleviate this situation, we have developed a training simulator with haptic interaction allowing trainees to practice virtual clipping surgeries on real patient-specific vessel geometries. By using specialized finite element (FEM) algorithms (fast finite element method, matrix condensation) combined with GPU acceleration, we can achieve the necessary frame rate for smooth real-time interaction with the detailed models needed for a realistic simulation of the vessel wall deformation caused by the clamping with surgical clips. Vessel wall geometries for typical training scenarios were obtained from 3D-reconstructed medical image data, while for the instruments (clipping forceps, various types of clips, suction tubes) we use models provided by manufacturer Aesculap AG. Collisions between vessel and instruments have to be continuously detected and transformed into corresponding boundary conditions and feedback forces, calculated using a contact plane method. After a training session, the achieved result can be assessed based on various criteria, including a simulation of the residual blood flow into the aneurysm. Rigid models of the surgical access and surrounding brain tissue, together with the coupling of a real forceps to the haptic input device, further increase the realism of the simulation.
Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.
Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka
2018-04-01
This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea associated with the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that the participants had a feeling of presence and felt a steep stair gradient under the condition of haptic feedback and viewpoint manipulation based on the characteristics of actual stair walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We then redesigned the shape of the small bumps, and evaluated the design in a second experiment. The results indicated that the best shape to present haptic stimuli is a right triangle cross section in both the ascending and descending cases. Although it is necessary to install small protrusions in the determined direction, by using this optimized shape the users' feeling of the presence of the stairs and the sensation of walking up and down were enhanced.
Leib, Raz; Rubin, Inbar; Nisky, Ilana
2018-05-16
Interaction with an object often requires the estimation of its mechanical properties. We examined whether the hand used to interact with the object and the person's handedness affect the estimation of these properties, using stiffness estimation as a test case. We recorded participants' responses on a stiffness discrimination of a virtual elastic force field and the grip force applied on the robotic device during the interaction. In half of the trials, the robotic device delayed the participants' force feedback. Consistent with previous studies, delayed force feedback biased the perceived stiffness of the force field. Interestingly, in both left-handed and right-handed participants, the perceived stiffness of the delayed force field was even lower when participants used their left hand than when they used their right hand. This result supports the idea that haptic processing is affected by laterality in the brain, not by handedness. Consistent with previous studies, participants adjusted their applied grip force according to the correct size and timing of the load force regardless of the hand that was used, the handedness, or the delay. This suggests that in all these conditions, participants were able to form an accurate internal representation of the anticipated trajectory of the load force (size and timing) and that this representation was used for accurate control of grip force independently of the perceptual bias. Thus, these results provide additional evidence for the dissociation between action and perception in the processing of delayed information.
The impact of haptic feedback on students' conceptions of the cell
NASA Astrophysics Data System (ADS)
Minogue, James
2005-07-01
The purpose of this study was to investigate the efficacy of adding haptic (sense of touch) feedback to computer generated visualizations for use in middle school science instruction. Current technology allows for the simulation of tactile and kinesthetic sensations via haptic devices and a computer interface. This study, conducted with middle school students (n = 80), explored the cognitive and affective impacts of this innovative technology on students' conceptions of the cell and the process of passive transport. A pretest-posttest control group design was used and participants were randomly assigned to one of two treatment groups (n = 40 for each). Both groups experienced the same core computer-mediated instructional program. This Cell Exploration program engaged students in a 3-D immersive environment that allowed them to actively investigate the form and function of a typical animal cell including its major organelles. The program also engaged students in a study of the structure and function of the cell membrane as it pertains to the process of passive transport and the mechanisms behind the membrane's selective permeability. As they conducted their investigations, students in the experimental group received bi-modal visual and haptic (simulated tactile and kinesthetic) feedback whereas the control group students experienced the program with only visual stimuli. A battery of assessments, including objective and open-ended written response items as well as a haptic performance assessment, was used to gather quantitative and qualitative data regarding changes in students' understandings of the cell concepts prior to and following their completion of the instructional program. Additionally, the impact of haptics on the affective domain of students' learning was assessed using a post-experience semi-structured interview and an attitudinal survey. Results showed that students from both conditions (Visual-Only and Visual + Haptic) found the instructional program interesting and engaging. Additionally, the vast majority of the students reported that they learned a lot about and were more interested in the topic due to their participation. Moreover, students who received the bi-modal (Visual + Haptic) feedback indicated that they experienced lower levels of frustration and spatial disorientation as they conducted their investigations when compared to individuals that relied solely on vision. There were no significant differences measured across the treatment groups on the cognitive assessment items. Despite this finding, the study provided valuable insight into the theoretical and practical considerations involved in the development of multimodal instructional programs.
Leib, Raz; Karniel, Amir; Nisky, Ilana
2015-05-01
During interaction with objects, we form an internal representation of their mechanical properties. This representation is used for perception and for guiding actions, such as in precision grip, where grip force is modulated with the predicted load forces. In this study, we explored the relationship between grip force adjustment and perception of stiffness during interaction with linear elastic force fields. In a forced-choice paradigm, participants probed pairs of virtual force fields while grasping a force sensor that was attached to a haptic device. For each pair, they were asked which field had higher level of stiffness. In half of the pairs, the force feedback of one of the fields was delayed. Participants underestimated the stiffness of the delayed field relatively to the nondelayed, but their grip force characteristics were similar in both conditions. We analyzed the magnitude of the grip force and the lag between the grip force and the load force in the exploratory probing movements within each trial. Right before answering which force field had higher level of stiffness, both magnitude and lag were similar between delayed and nondelayed force fields. These results suggest that an accurate internal representation of environment stiffness and time delay was used for adjusting the grip force. However, this representation did not help in eliminating the bias in stiffness perception. We argue that during performance of a perceptual task that is based on proprioceptive feedback, separate neural mechanisms are responsible for perception and action-related computations in the brain. Copyright © 2015 the American Physiological Society.
Cyber integrated MEMS microhand for biological applications
NASA Astrophysics Data System (ADS)
Weissman, Adam; Frazier, Athena; Pepen, Michael; Lu, Yen-Wen; Yang, Shanchieh Jay
2009-05-01
Anthropomorphous robotic hands at microscales have been developed to receive information and perform tasks for biological applications. To emulate a human hand's dexterity, the microhand requires a master-slave interface with a wearable controller, force sensors, and perception displays for tele-manipulation. Recognizing the constraints and complexity involved in developing a feedback interface during miniaturization, this project addresses the need by creating an integrated cyber environment that incorporates sensors with a microhand, a haptic/visual display, and an object model to emulate the human hand's psychophysical perception at the microscale.
Passive haptics in a knee arthroscopy simulator: is it valid for core skills training?
McCarthy, Avril D; Moody, Louise; Waterworth, Alan R; Bickerstaff, Derek R
2006-01-01
Previous investigation of a cost-effective virtual reality arthroscopic training system, the Sheffield Knee Arthroscopy Training System (SKATS), indicated the desirability of including haptic feedback. A formal task analysis confirmed the importance of knee positioning as a core skill for trainees learning to navigate the knee arthroscopically. The system cost and existing limb interface, which permits knee positioning, would be compromised by the addition of commercial active haptic devices available currently. The validation results obtained when passive haptic feedback (resistance provided by physical structures) is provided indicate that SKATS has construct, predictive and face validity for navigation and triangulation training. When tested using SKATS, experienced surgeons (n = 11) performed significantly faster, located significantly more pathologies, and showed significantly shorter arthroscope path lengths than a less experienced surgeon cohort (n = 12). After SKATS training sessions, novices (n = 3) showed significant improvements in: task completion time, shorter arthroscope path lengths, shorter probe path lengths, and fewer arthroscope tip contacts. Main improvements occurred after the first two practice sessions, indicating rapid familiarization and a training effect. Feedback from questionnaires completed by orthopaedic surgeons indicates that the system has face validity for its remit of basic arthroscopic training.
Evaluation of Motor Control Using Haptic Device
NASA Astrophysics Data System (ADS)
Nuruki, Atsuo; Kawabata, Takuro; Shimozono, Tomoyuki; Yamada, Masafumi; Yunokuchi, Kazutomo
When kinesthesia and touch act at the same time, the resulting perception is called haptic perception. This sense plays a key role in motor information for force and position control, and haptic perception is therefore important wherever the evaluation of motor control is needed. The purpose of this paper is to evaluate motor control, specifically the perception of heaviness and distance, under normal and fatigued conditions using psychophysical experiments. We used a haptic device to generate precise forces and distances; however, there are few precedents for evaluation systems based on haptic devices, so a further purpose is to examine whether the haptic device is useful as an evaluation system for motor control. The psychophysical quantities of force and distance were measured in two kinds of experiments with eight healthy subjects. Stimuli were presented by a haptic device [PHANTOM Omni: SensAble Company]. The subjects compared standard and test stimuli and reported which stimulus felt stronger. For the psychophysical quantity of force, the just noticeable difference (JND) showed a significant difference between the normal and muscle-fatigue conditions, whereas the point of subjective equality (PSE) did not differ. For the psychophysical quantity of distance, neither JND nor PSE differed between the conditions. These results show that the control of force was influenced by muscle fatigue, but the control of distance was not. Moreover, they suggest that the haptic device is useful as an evaluation system for motor control.
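The JND and PSE reported above are standard psychometric quantities. As a hedged illustration, the sketch below fits a cumulative Gaussian to forced-choice responses and reads off both values; the stimulus levels, response proportions, and the 75%-correct JND criterion are placeholders, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, sigma):
    """Cumulative Gaussian: probability of judging the test stimulus stronger."""
    return norm.cdf(x, loc=pse, scale=sigma)

# test-stimulus force levels (N) and proportion of "test felt stronger" responses
levels = np.array([0.8, 0.9, 1.0, 1.1, 1.2])
p_stronger = np.array([0.10, 0.30, 0.55, 0.80, 0.95])

(pse, sigma), _ = curve_fit(psychometric, levels, p_stronger, p0=[1.0, 0.1])
jnd = sigma * norm.ppf(0.75)   # JND at the conventional 75%-correct criterion
print(f"PSE = {pse:.3f} N, JND = {jnd:.3f} N")
```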
Williams, Camille K.; Tremblay, Luc; Carnahan, Heather
2016-01-01
Researchers in the domain of haptic training are now entering the long-standing debate regarding whether or not it is best to learn a skill by experiencing errors. Haptic training paradigms provide fertile ground for exploring how various theories about feedback, errors and physical guidance intersect during motor learning. Our objective was to determine how error minimizing, error augmenting and no haptic feedback while learning a self-paced curve-tracing task impact performance on delayed (1 day) retention and transfer tests, which indicate learning. We assessed performance using movement time and tracing error to calculate a measure of overall performance – the speed accuracy cost function. Our results showed that despite exhibiting the worst performance during skill acquisition, the error augmentation group had significantly better accuracy (but not overall performance) than the error minimization group on delayed retention and transfer tests. The control group’s performance fell between that of the two experimental groups but was not significantly different from either on the delayed retention test. We propose that the nature of the task (requiring online feedback to guide performance) coupled with the error augmentation group’s frequent off-target experience and rich experience of error-correction promoted information processing related to error-detection and error-correction that are essential for motor learning. PMID:28082937
Brown, Jeremy D; Shelley, Mackenzie K; Gardner, Duane; Gansallo, Emmanuel A; Gillespie, R Brent
2016-01-01
An important goal of haptic display is to make available the action/reaction relationships that define interactions between the body and the physical world. While in physical world interactions reaction cues invariably impinge on the same part of the body involved in action (reaction and action are colocated), a haptic interface is quite capable of rendering feedback to a separate body part than that used for producing exploratory actions (non-colocated action and reaction). This most commonly occurs with the use of vibrotactile display, in which a cutaneous cue has been substituted for a kinesthetic cue (a kind of sensory substitution). In this paper, we investigate whether non-colocated force and displacement cues degrade the perception of compliance. Using a custom non-colocated kinesthetic display in which one hand controls displacement and the other senses force, we ask participants to discriminate between two virtual springs with matched terminal force and adjustable non-linearity. An additional condition includes one hand controlling displacement while the other senses force encoded in a vibrotactile cue. Results show that when the terminal force cue is unavailable, and even when sensory substitution is not involved, non-colocated kinesthetic displays degrade compliance discrimination relative to colocated kinesthetic displays. Compliance discrimination is also degraded with vibrotactile display of force. These findings suggest that non-colocated kinesthetic displays and, likewise, cutaneous sensory substitution displays should be avoided when discrimination of compliance is necessary for task success.
A three-axis force sensor for dual finger haptic interfaces.
Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo
2012-10-10
In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.
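A common way to obtain the characteristic (calibration) matrix mentioned above is a least-squares fit between gauge outputs and known reference loads; the sketch below shows that step with synthetic data. Channel counts, noise levels, and the overall procedure are assumptions for illustration, not the sensor's actual calibration routine.

```python
import numpy as np

# V: gauge outputs (n_samples x n_channels), F: applied reference forces (n_samples x 3);
# synthetic placeholders stand in for the real calibration measurements
rng = np.random.default_rng(1)
C_true = rng.normal(size=(3, 4))
V = rng.normal(size=(200, 4))
F = V @ C_true.T + 0.01 * rng.normal(size=(200, 3))

# least-squares estimate of the characteristic matrix C such that F ≈ V @ C.T
C_hat = np.linalg.lstsq(V, F, rcond=None)[0].T

def force_from_gauges(v):
    """Three-axis force vector recovered from gauge readings via the calibrated matrix."""
    return C_hat @ np.asarray(v)

print(force_from_gauges(V[0]))
```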
Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback
NASA Astrophysics Data System (ADS)
Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve
2011-03-01
Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.
A systematic review of phacoemulsification cataract surgery in virtual reality simulators.
Lam, Chee Kiang; Sundaraj, Kenneth; Sulaiman, Mohd Nazri
2013-01-01
The aim of this study was to review the capability of virtual reality simulators in the application of phacoemulsification cataract surgery training. Our review included the scientific publications on cataract surgery simulators that had been developed by different groups of researchers along with commercialized surgical training products, such as EYESI® and PhacoVision®. The review covers the simulation of the main cataract surgery procedures, i.e., corneal incision, capsulorrhexis, phacosculpting, and intraocular lens implantation in various virtual reality surgery simulators. Haptics realism and visual realism of the procedures are the main elements in imitating the actual surgical environment. The involvement of ophthalmology in research on virtual reality since the early 1990s has made a great impact on the development of surgical simulators. Most of the latest cataract surgery training systems are able to offer high fidelity in visual feedback and haptics feedback, but visual realism, such as the rotational movements of an eyeball with response to the force applied by surgical instruments, is still lacking in some of them. The assessment of the surgical tasks carried out on the simulators showed a significant difference in the performance before and after the training.
Force feedback in a piezoelectric linear actuator for neurosurgery.
De Lorenzo, Danilo; De Momi, Elena; Dyagilev, Ilya; Manganelli, Rudy; Formaglio, Alessandro; Prattichizzo, Domenico; Shoham, Moshe; Ferrigno, Giancarlo
2011-09-01
Force feedback in robotic minimally invasive surgery allows the human operator to manipulate tissues as if his/her hands were in contact with the patient organs. A force sensor mounted on the probe raises problems with sterilization of the overall surgical tool. Also, the use of off-axis gauges introduces a moment that increases the friction force on the bearing, which can easily mask the signal, given the small force to be measured. This work aims at designing and testing two methods for estimating the resistance to advancement (force) experienced by a standard probe for brain biopsies within a brain-like material. A further goal is to provide a neurosurgeon using a master-slave tele-operated driver with direct feedback on the tissue mechanical characteristics. Two possible sensing methods, an in-axis strain gauge force sensor and a position-position error (control-based) method, were implemented and tested, both aimed at device miniaturization. The analysis carried out was aimed at fulfilment of the psychophysical requirements for force detection and delay tolerance, also taking into account safety, which is directly related to the last two issues. The definition of the controller parameters is addressed, and consideration is given to the development of the device with an integrated haptic interface. Results show better performance of the control-based method (RMSE < 0.1 N), which is also best in terms of reliability, sterilizability, and material dimensions for the application addressed. The control-based method developed for force estimation is compatible with the neurosurgical application and is capable of measuring tissue resistance without any additional sensors. Copyright © 2011 John Wiley & Sons, Ltd.
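The control-based estimate can be caricatured as reading the load force off the position-tracking error of the drive loop. The sketch below assumes a linear error-to-force gain and a first-order smoothing filter; both are placeholders that would in practice come from identification of the actuator, not values from the paper.

```python
class ControlBasedForceEstimator:
    """Estimate the probe's resistance to advancement from the position-tracking
    error of its drive loop, with no force sensor. The linear error-to-force gain
    and the first-order smoothing filter are assumptions for illustration."""

    def __init__(self, gain_n_per_m=5000.0, alpha=0.1):
        self.k = gain_n_per_m      # would come from identification of the actuator
        self.alpha = alpha         # smoothing factor for the raw estimate
        self.f_est = 0.0

    def update(self, x_commanded, x_measured):
        raw = self.k * (x_commanded - x_measured)
        self.f_est += self.alpha * (raw - self.f_est)
        return self.f_est

est = ControlBasedForceEstimator()
for xc, xm in [(0.0010, 0.00098), (0.0020, 0.00196)]:
    f = est.update(xc, xm)         # estimated tissue resistance in newtons
```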
New tools for sculpting cranial implants in a shared haptic augmented reality environment.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2006-01-01
New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.
Surgical scissors extension adds the 7th axis of force feedback to the Freedom 6S.
Powers, Marilyn J; Sinclair, Ian P W; Brouwer, Iman; Laroche, Denis
2007-01-01
A virtual reality surgical simulator ideally allows seamless transition between the real and virtual world. In that respect, all of a surgeon's motions and tools must be simulated. Until now researchers have been limited to using a pen-like tool in six degrees of freedom. This paper presents the addition of haptically enabled scissors to the end effector of a 6-DOF haptic device, the Freedom 6S. The scissors can apply a maximum pinching torque of 460 mN·m with low inertia and low back-drive friction. The device has a balanced design so that the user feels as if they are holding nothing more than actual scissors, although with some added inertia at the load end. The system is interchangeable between the 6-DOF and 7-DOF configurations to allow switching tools quickly.
Liu, Juan; Ando, Hiroshi
2016-01-01
Most real-world events stimulate multiple sensory modalities simultaneously. Usually, the stiffness of an object is perceived haptically. However, auditory signals also contain stiffness-related information, and people can form impressions of stiffness from the different impact sounds of metal, wood, or glass. To understand whether there is any interaction between auditory and haptic stiffness perception, and if so, whether the inferred material category is the most relevant auditory information, we conducted experiments using a force-feedback device and the modal synthesis method to present haptic stimuli and impact sound in accordance with participants’ actions, and to modulate low-level acoustic parameters, i.e., frequency and damping, without changing the inferred material categories of sound sources. We found that metal sounds consistently induced an impression of stiffer surfaces than did drum sounds in the audio-only condition, but participants haptically perceived surfaces with modulated metal sounds as significantly softer than the same surfaces with modulated drum sounds, which directly opposes the impression induced by these sounds alone. This result indicates that, although the inferred material category is strongly associated with audio-only stiffness perception, low-level acoustic parameters, especially damping, are more tightly integrated with haptic signals than the material category is. Frequency played an important role in both audio-only and audio-haptic conditions. Our study provides evidence that auditory information influences stiffness perception differently in unisensory and multisensory tasks. Furthermore, the data demonstrated that sounds with higher frequency and/or shorter decay time tended to be judged as stiffer, and contact sounds of stiff objects had no effect on the haptic perception of soft surfaces. We argue that the intrinsic physical relationship between object stiffness and acoustic parameters may be applied as prior knowledge to achieve robust estimation of stiffness in multisensory perception. PMID:27902718
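The modal synthesis mentioned above represents an impact sound as a sum of exponentially decaying sinusoids whose frequencies and damping can be modulated independently of the inferred material category. The sketch below uses that general formulation; the mode frequencies, dampings, and amplitudes are made-up illustrative values, not the stimuli used in the study.

import numpy as np

def modal_impact(freqs_hz, dampings, amps, fs=44100, duration=0.5):
    """Synthesize an impact sound as a sum of damped sinusoids (modal synthesis).

    freqs_hz  : modal frequencies [Hz]
    dampings  : exponential decay rates [1/s]; larger = faster decay
    amps      : modal amplitudes
    """
    t = np.arange(int(fs * duration)) / fs
    sound = np.zeros_like(t)
    for f, d, a in zip(freqs_hz, dampings, amps):
        sound += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    peak = np.max(np.abs(sound))
    return sound / peak if peak > 0 else sound

# Illustrative "metal-like" (higher frequency, slower decay) versus "drum-like" modes.
metal = modal_impact([800, 1900, 3100], dampings=[8, 12, 15], amps=[1.0, 0.6, 0.4])
drum = modal_impact([150, 320, 510], dampings=[40, 60, 80], amps=[1.0, 0.5, 0.3])

Varying only the frequencies and decay rates, as above, is what allows the acoustic parameters to be manipulated without changing the inferred material category.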
Forces on intraocular lens haptics induced by capsular fibrosis. An experimental study.
Guthoff, R; Abramo, F; Draeger, J; Chumbley, L C; Lang, G K; Neumann, W
1990-01-01
Electronic dynamometry measurements, performed upon intraocular lens (IOL) haptics of prototype one-piece three-loop silicone lenses, accurately defined the relationships between elastic force and haptic displacement. Lens implantations in the capsular bag of dogs (loop span equal to capsular bag diameter, loops undeformed immediately after the operation) were evaluated macrophotographically 5-8 months postoperatively. The highly constant elastic property of silicone rubber permitted quantitative correlation of subsequent in vivo haptic displacement with the resultant force vectors responsible for tissue contraction. The lens optics were well centered in 17 (85%) and slightly off-center in 3 (15%) of 20 implanted eyes. Of the 60 supporting loops, 28 could be visualized sufficiently well to permit reliable haptic measurement. Of these 28, 20 (71%) were clearly displaced, ranging from 0.45 mm away from to 1.4 mm towards the lens' optic center. These extremes represented resultant vector forces of 0.20 and 1.23 mN respectively. Quantitative vector analysis permits better understanding of IOL-capsular interactions.
Perceptualization of geometry using intelligent haptic and visual sensing
NASA Astrophysics Data System (ADS)
Weng, Jianguang; Zhang, Hui
2013-01-01
We present a set of paradigms for investigating geometric structures using haptic and visual sensing. Our principal test cases include smoothly embedded geometric shapes such as knotted curves embedded in 3D and knotted surfaces in 4D, which contain many self-intersections when projected to one lower dimension. One can exploit a touch-responsive 3D interactive probe to haptically override this conflicting evidence in the rendered images, by forcing continuity in the haptic representation to emphasize the true topology. In our work, we exploit predictive haptic guidance, a "computer-simulated hand" with supplementary force suggestions, to support intelligent exploration of geometric shapes, smoothing the exploration and maximizing the probability of recognition. The cognitive load can be reduced further by enabling attention-driven visual sensing during the haptic exploration. Our methods combine to reveal the full richness of the haptic exploration of geometric structures, and to overcome the limitations of traditional 4D visualization.
Haptograph Representation of Real-World Haptic Information by Wideband Force Control
NASA Astrophysics Data System (ADS)
Katsura, Seiichiro; Irie, Kouhei; Ohishi, Kiyoshi
Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means. Furthermore, a video camera and a television make it possible to transmit visual sensation by broadcasting. On the contrary, since tactile or haptic information is subject to Newton's "law of action and reaction" in the real world, a device which acquires, transmits, and reproduces this information has not been established. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel acquisition method of haptic information named the "haptograph". The haptograph visualizes haptic information much as a photograph does. The proposed haptograph is applied to haptic recognition of the contact environment. A linear motor contacts the surface of the environment and its reaction force is used to make a haptograph. Robust contact motion and sensorless sensing of the reaction force are attained by using a disturbance observer. As a result, an encyclopedia of contact environments is obtained. Since temporal and spatial analyses are conducted to represent the haptic information as a haptograph, it can be recognized and evaluated intuitively.
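The sensorless force measurement described above relies on a disturbance observer: the reaction force is estimated by low-pass filtering the difference between the force implied by the motor current and the force needed to accelerate the known moving mass. A minimal discrete-time sketch follows; the thrust constant, moving mass, and cutoff frequency are hypothetical, not values from the paper.

import numpy as np

# Sketch of a disturbance-observer-based, sensorless estimate of the reaction
# force acting on a linear motor. All parameters are illustrative assumptions.
KT = 20.0             # N/A, thrust constant (hypothetical)
M = 1.5               # kg, moving mass (hypothetical)
G = 2 * np.pi * 50.0  # rad/s, observer low-pass cutoff (hypothetical)
DT = 1e-4             # s, control period

def estimate_reaction_force(current, velocity):
    """Estimate the external reaction force from current and velocity samples."""
    z = 0.0                        # low-pass filter state
    f_ext = np.zeros(len(current))
    for k, (i_ref, v) in enumerate(zip(current, velocity)):
        u = KT * i_ref + G * M * v
        z += DT * G * (u - z)      # first-order low-pass: g / (s + g)
        f_ext[k] = z - G * M * v   # disturbance (reaction force) estimate
    return f_ext

Scanning the motor over the surface and storing the estimated force per position gives the kind of temporal and spatial map that the authors call a haptograph.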
Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor
NASA Astrophysics Data System (ADS)
Ponce Wong, Ruben D.; Hellman, Randall B.; Santos, Veronica J.
2014-06-01
Haptic perception remains a grand challenge for artificial hands. Dexterous manipulators could be enhanced by "haptic intelligence" that enables identification of objects and their features via touch alone. Haptic perception of local shape would be useful when vision is obstructed or when proprioceptive feedback is inadequate, as observed in this study. In this work, a robot hand outfitted with a deformable, bladder-type, multimodal tactile sensor was used to replay four human-inspired haptic "exploratory procedures" on fingertip-sized geometric features. The geometric features varied by type (bump, pit), curvature (planar, conical, spherical), and footprint dimension (1.25 - 20 mm). Tactile signals generated by active fingertip motions were used to extract key parameters for use as inputs to supervised learning models. A support vector classifier estimated order of curvature while support vector regression models estimated footprint dimension once curvature had been estimated. A distal-proximal stroke (along the long axis of the finger) enabled estimation of order of curvature with an accuracy of 97%. Best-performing, curvature-specific, support vector regression models yielded R2 values of at least 0.95. While a radial-ulnar stroke (along the short axis of the finger) was most helpful for estimating feature type and size for planar features, a rolling motion was most helpful for conical and spherical features. The ability to haptically perceive local shape could be used to advance robot autonomy and provide haptic feedback to human teleoperators of devices ranging from bomb defusal robots to neuroprostheses.
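The two-stage estimation described above, a support vector classifier for order of curvature followed by curvature-specific support vector regression for footprint dimension, can be sketched with standard tools. The features and labels below are random placeholders rather than the authors' tactile parameters.

import numpy as np
from sklearn.svm import SVC, SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder data: rows are feature vectors extracted from tactile signals
# during an exploratory stroke; labels are curvature class and footprint size.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))               # hypothetical tactile features
curvature = rng.integers(0, 3, size=300)     # 0=planar, 1=conical, 2=spherical
size_mm = rng.uniform(1.25, 20.0, size=300)  # footprint dimension labels [mm]

# Stage 1: classify order of curvature from the stroke features.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X, curvature)

# Stage 2: one regression model per curvature class estimates footprint size.
regressors = {}
for c in np.unique(curvature):
    mask = curvature == c
    reg = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
    reg.fit(X[mask], size_mm[mask])
    regressors[c] = reg

# Inference: classify curvature first, then use the matching regressor.
x_new = X[:1]
c_hat = int(clf.predict(x_new)[0])
d_hat = float(regressors[c_hat].predict(x_new)[0])
print(f"estimated curvature class {c_hat}, footprint ~ {d_hat:.1f} mm")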
Real-time haptic cutting of high-resolution soft tissues.
Wu, Jun; Westermann, Rüdiger; Dick, Christian
2014-01-01
We present our systematic efforts in advancing the computational performance of physically accurate soft tissue cutting simulation, which is at the core of surgery simulators in general. We demonstrate a real-time performance of 15 simulation frames per second for haptic soft tissue cutting of a deformable body at an effective resolution of 170,000 finite elements. This is achieved by the following innovative components: (1) a linked octree discretization of the deformable body, which allows for fast and robust topological modifications of the simulation domain, (2) a composite finite element formulation, which substantially reduces the number of simulation degrees of freedom and thus makes it possible to carefully balance simulation performance and accuracy, (3) a highly efficient geometric multigrid solver for solving the linear systems of equations arising from implicit time integration, (4) an efficient collision detection algorithm that effectively exploits the composition structure, and (5) a stable haptic rendering algorithm for computing the feedback forces. Considering that our method increases the finite element resolution for physically accurate real-time soft tissue cutting simulation by an order of magnitude, our technique has a high potential to significantly advance the realism of surgery simulators.
Soft Somatosensitive Actuators via Embedded 3D Printing.
Truby, Ryan L; Wehner, Michael; Grosskopf, Abigail K; Vogt, Daniel M; Uzel, Sebastien G M; Wood, Robert J; Lewis, Jennifer A
2018-04-01
Humans possess manual dexterity, motor skills, and other physical abilities that rely on feedback provided by the somatosensory system. Herein, a method is reported for creating soft somatosensitive actuators (SSAs) via embedded 3D printing, which are innervated with multiple conductive features that simultaneously enable haptic, proprioceptive, and thermoceptive sensing. This novel manufacturing approach enables the seamless integration of multiple ionically conductive and fluidic features within elastomeric matrices to produce SSAs with the desired bioinspired sensing and actuation capabilities. Each printed sensor is composed of an ionically conductive gel that exhibits both long-term stability and hysteresis-free performance. As an exemplar, multiple SSAs are combined into a soft robotic gripper that provides proprioceptive and haptic feedback via embedded curvature, inflation, and contact sensors, including deep and fine touch contact sensors. The multimaterial manufacturing platform enables complex sensing motifs to be easily integrated into soft actuating systems, which is a necessary step toward closed-loop feedback control of soft robots, machines, and haptic devices. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Human eye haptics-based multimedia.
Velandia, David; Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron
2014-01-01
Immersive and interactive multimedia applications offer complementary study tools in anatomy, as users can explore 3D models while obtaining information about the organ, tissue, or part being explored. Haptics increases the sense of interaction with virtual objects, improving the user experience in a more realistic manner. Common tools for studying the eye are books, illustrations, and assembly models; more recently, these are being complemented by mobile apps whose 3D capabilities, computing power, and user base are increasing. The goal of this project is to develop a complementary eye anatomy and pathology study tool using deformable models within a multimedia application, offering students the opportunity to explore the eye up close and from within, with relevant information. Validation of the tool provided feedback on the potential of the development, along with suggestions for improving haptic feedback and navigation.
Disposable soft 3 axis force sensor for biomedical applications.
Chathuranga, Damith Suresh; Zhongkui Wang; Yohan Noh; Nanayakkara, Thrishantha; Hirai, Shinichi
2015-08-01
This paper proposes a new disposable soft three-axis force sensor that can be used to calculate force or displacement and vibrations. It uses three Hall effect sensors orthogonally placed around a cylindrical beam made of silicone rubber. A niobium permanent magnet is embedded inside the silicone. When a force is applied to the end of the cylinder, the beam is compressed and bent away from the force, displacing the magnet. This displacement changes the magnetic flux around the ratiometric linear Hall effect sensors. By analysing these changes, we calculate the force or displacement in three directions using a lookup table. This sensor can be used in minimally invasive surgery and haptic feedback applications. Its cheap construction, biocompatibility, and ease of miniaturization are a few of its advantages. The sensor design and its characterization are presented in this work.
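A lookup-table mapping from the three Hall-effect readings to a three-axis force, as described above, can be sketched as a nearest-neighbour search over a calibration set. The calibration data below are synthetic stand-ins, and an interpolating scheme could replace the nearest-neighbour step in practice.

import numpy as np

# Sketch: map three Hall-effect sensor readings to a 3-axis force via a
# calibration lookup table (nearest neighbour). Calibration data are synthetic.
rng = np.random.default_rng(1)
calib_hall = rng.normal(size=(500, 3))               # recorded sensor triples
calib_force = rng.normal(scale=2.0, size=(500, 3))   # reference forces [N]

def force_from_hall(hall_reading):
    """Return the calibration force whose Hall reading is closest to the input."""
    d = np.linalg.norm(calib_hall - np.asarray(hall_reading), axis=1)
    return calib_force[np.argmin(d)]

fx, fy, fz = force_from_hall([0.1, -0.3, 0.8])
print(f"estimated force: ({fx:.2f}, {fy:.2f}, {fz:.2f}) N")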
A Three-Axis Force Sensor for Dual Finger Haptic Interfaces
Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo
2012-01-01
In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor. PMID:23202012
Haptic device development based on electro static force of cellulose electro active paper
NASA Astrophysics Data System (ADS)
Yun, Gyu-young; Kim, Sang-Youn; Jang, Sang-Dong; Kim, Dong-Gu; Kim, Jaehwan
2011-04-01
Haptics is widely considered a suitable technology for demanding virtual reality applications such as medical equipment, mobile devices, online marketing, and so on. Many haptic device concepts have been suggested to meet the demands of industry. Cellulose has received much attention as an emerging smart material, termed electro-active paper (EAPap). EAPap is attractive for mobile haptic devices due to its unique characteristics: low actuation power, suitability for thin devices, and transparency. In this paper, we suggest a new haptic actuator concept based on cellulose EAPap. Its performance is evaluated under various actuation conditions. As a result, the cellulose electrostatic force actuator shows a large output displacement and a fast response, making it suitable for mobile haptic devices.
Adaptation of a haptic robot in a 3T fMRI.
Snider, Joseph; Plank, Markus; May, Larry; Liu, Thomas T; Poizner, Howard
2011-10-04
Functional magnetic resonance imaging (fMRI) provides excellent functional brain imaging via the BOLD signal with advantages including non-ionizing radiation, millimeter spatial accuracy of anatomical and functional data, and nearly real-time analyses. Haptic robots provide precise measurement and control of position and force of a cursor in a reasonably confined space. Here we combine these two technologies to allow precision experiments involving motor control with haptic/tactile environment interaction such as reaching or grasping. The basic idea is to attach an 8-foot end effector, supported in the center, to the robot, allowing the subject to use the robot while shielding it and keeping it out of the most extreme part of the magnetic field from the fMRI machine (Figure 1). The Phantom Premium 3.0, 6DoF, high-force robot (SensAble Technologies, Inc.) is an excellent choice for providing force feedback in virtual reality experiments, but it is inherently non-MR safe, introduces significant noise to the sensitive fMRI equipment, and its electric motors may be affected by the fMRI's strongly varying magnetic field. We have constructed a table and shielding system that allows the robot to be safely introduced into the fMRI environment and limits both the degradation of the fMRI signal by the electrically noisy motors and the degradation of the electric motor performance by the strongly varying magnetic field of the fMRI. With the shield, the signal-to-noise ratio (SNR: mean signal/noise standard deviation) of the fMRI goes from a baseline of ~380 to ~330, compared with ~250 without the shielding. The remaining noise appears to be uncorrelated and does not add artifacts to the fMRI of a test sphere (Figure 2). The long, stiff handle allows placement of the robot out of range of the most strongly varying parts of the magnetic field, so there is no significant effect of the fMRI on the robot. The effect of the handle on the robot's kinematics is minimal since it is lightweight (~2.6 lbs) but extremely stiff 3/4" graphite and well balanced on the 3-DoF joint in the middle. The end result is an fMRI-compatible haptic system with about 1 cubic foot of working space, and, when combined with virtual reality, it allows a new set of experiments to be performed in the fMRI environment, including naturalistic reaching, passive displacement of the limb and haptic perception, adaptation learning in varying force fields, or texture identification.
Control of a haptic gear shifting assistance device utilizing a magnetorheological clutch
NASA Astrophysics Data System (ADS)
Han, Young-Min; Choi, Seung-Bok
2014-10-01
This paper proposes a haptic gear-shifting assistance device, driven by a haptic clutch, that helps the driver shift the gears of a transmission system. To achieve this goal, a magnetorheological (MR) fluid-based clutch is devised that can follow the rotary motion of the accelerator pedal into which it is integrated. The proposed MR clutch is then manufactured, and its transmission torque is experimentally evaluated according to the magnetic field intensity. The manufactured MR clutch is integrated with the accelerator pedal to transmit a haptic cue signal to the driver. The central control issue is to cue the driver to shift the gear via the haptic force. Therefore, a gear-shifting decision algorithm is constructed by considering the vehicle engine speed together with engine combustion dynamics, vehicle dynamics, and driving resistance. The algorithm is then integrated with a compensation strategy for attaining the desired haptic force. In this work, the compensator is also developed and implemented through a discrete version of the inverse hysteretic model. The control performances, such as the haptic force tracking responses and fuel consumption, are experimentally evaluated.
The Use of Haptic Display Technology in Education
ERIC Educational Resources Information Center
Barfield, Woodrow
2009-01-01
The experience of "virtual reality" can consist of head-tracked and stereoscopic virtual worlds, spatialized sound, haptic feedback, and to a lesser extent olfactory cues. Although virtual reality systems have been proposed for numerous applications, the field of education is one particular application that seems well-suited for virtual…
Yasuda, Kazuhiro; Saichi, Kenta; Iwata, Hiroyasu
2018-01-01
Falls and fall-induced injuries are major global public health problems, and sensory input impairment in older adults results in significant limitations in feedback-type postural control. A haptic-based biofeedback (BF) system can be used for augmenting somatosensory input in older adults, and the application of this BF system can increase the objectivity of the feedback and encourage comparison with that provided by a trainer. Nevertheless, an optimal BF system that focuses on interpersonal feedback for balance training in older adults has not been proposed. Thus, we proposed a haptic-based perception-empathy BF system that provides information regarding the older adult's center-of-foot pressure pattern to the trainee and trainer for refining the motor learning effect. The first objective of this study was to examine the effect of this balance training regimen in healthy older adults performing a postural learning task. Second, this study aimed to determine whether BF training required high cognitive load to clarify its practicability in real-life settings. Twenty older adults were assigned to two groups: BF and control groups. Participants in both groups tried balance training in the single-leg stance while performing a cognitive task (i.e., serial subtraction task). Retention was tested 24 h later. Testing comprised balance performance measures (i.e., 95% confidence ellipse area and mean velocity of sway) and dual-task performance (number of responses and correct answers). Measurements of postural control using a force plate revealed that the stability of the single-leg stance was significantly lower in the BF group than in the control group during the balance task. The BF group retained the improvement in the 95% confidence ellipse area 24 h after the retention test. Results of dual-task performance during the balance task were not different between the two groups. These results confirmed the potential benefit of the proposed balance training regimen in designing successful motor learning programs for preventing falls in older adults. PMID:29868597
NASA Astrophysics Data System (ADS)
Yin, Feilong; Hayashi, Ryuzo; Raksincharoensak, Pongsathorn; Nagai, Masao
This research proposes a haptic velocity guidance assistance system for realizing eco-driving as well as enhancing traffic capacity by cooperating with ITS (Intelligent Transportation Systems). The proposed guidance system generates the desired accelerator pedal (abbreviated as pedal) stroke with respect to the desired velocity obtained from ITS, considering vehicle dynamics, and provides the desired pedal stroke to the driver via a haptic pedal whose reaction force is controllable, guiding the driver to trace the desired velocity in real time. The main purpose of this paper is to discuss the feasibility of the haptic velocity guidance. A haptic velocity guidance system for research was developed on the Driving Simulator of TUAT (DS) by attaching a low-inertia, low-friction motor to the pedal, which does not change the characteristics of the original pedal when it is not actuated, and by implementing an algorithm for the desired pedal stroke calculation and the reaction force controller. The haptic guidance maneuver is designed based on human pedal-stepping experiments. A simple velocity profile with acceleration, deceleration, and cruising, synthesized according to naturalistic driving, is used for testing the proposed system. Experimental results from 9 drivers show that the haptic guidance provides high accuracy and quick response in velocity tracking. These results prove that haptic guidance is a promising velocity guidance method from the viewpoint of HMI (Human Machine Interface).
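The desired-pedal-stroke calculation above works backwards from the ITS-supplied velocity profile through a longitudinal vehicle model to a commanded pedal position. The sketch below uses a simple point-mass model with made-up vehicle and pedal-map parameters; it is not the mapping actually implemented on the TUAT simulator.

import numpy as np

# Sketch: turn a desired velocity profile into a desired accelerator-pedal
# stroke via a simple longitudinal model. All parameters are illustrative.
MASS = 1500.0   # kg
CD_A = 0.7      # m^2, drag coefficient times frontal area
RHO = 1.2       # kg/m^3, air density
C_RR = 0.012    # rolling-resistance coefficient
G = 9.81        # m/s^2
F_MAX = 4000.0  # N, traction force at full pedal stroke (hypothetical linear map)

def desired_pedal_stroke(v_des, dt):
    """Return a pedal stroke in [0, 1] that tracks the desired velocity profile."""
    v = np.asarray(v_des, dtype=float)
    a = np.gradient(v, dt)                          # required acceleration
    f_resist = 0.5 * RHO * CD_A * v**2 + C_RR * MASS * G
    f_traction = MASS * a + f_resist                # force the powertrain must provide
    return np.clip(f_traction / F_MAX, 0.0, 1.0)    # negative demand -> pedal released

# Accelerate-cruise-decelerate profile, similar in spirit to the test profile.
dt = 0.01
v_profile = np.concatenate([np.linspace(0, 20, 1000),
                            np.full(1000, 20.0),
                            np.linspace(20, 10, 1000)])
stroke = desired_pedal_stroke(v_profile, dt)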
Assessment of navigation cues with proximal force sensing during endovascular catheterization.
Rafii-Taril, Hedyeh; Payne, Christopher J; Riga, Celia; Bicknell, Colin; Lee, Su-Lin; Yang, Guang-Zhong
2012-01-01
Despite increased use of robotic catheter navigation systems for endovascular intervention procedures, current master-slave platforms have not yet taken into account dexterous manipulation skill used in traditional catheterization procedures. Information on tool forces applied by operators is often limited. A novel force/torque sensor is developed in this paper to obtain behavioural data across different experience levels and identify underlying factors that affect overall operator performance. The miniature device can be attached to any part of the proximal end of the catheter, together with a position sensor attached to the catheter tip, for relating tool forces to catheter dynamics and overall performance. The results show clear differences in manipulation skills between experience groups, thus providing insights into different patterns and range of forces applied during routine endovascular procedures. They also provide important design specifications for ergonomically optimized catheter manipulation platforms with added haptic feedback while maintaining natural skills of the operators.
Haptics using a smart material for eyes-free interaction in personal devices
NASA Astrophysics Data System (ADS)
Wang, Huihui; Lane, William Brian; Pappas, Devin; Duque, Bryam; Leong, John
2014-03-01
In this paper we present a prototype using a dry ionic polymer metal composite (IPMC) in interactive personal devices such as a bracelet, necklace, pocket key chain, or mobile device for haptic interaction when audio or visual feedback is not possible or practical. This prototype interface is an electro-mechanical system that realizes a shape-changing haptic display for information communication. A dry IPMC changes its dimensions due to the electrostatic effect when an electrical potential is applied to it. The IPMC can operate at a low voltage (less than 2.5 V), which is compatible with the requirements of personal electronic devices and mobile devices. The prototype consists of addressable arrays of IPMCs with different dimensions, which are deformable into different shapes with proper handling or customization. 3D printing technology will be used to form the supporting parts. Microcontrollers (about 3 cm square) from DigiKey will be embedded into this personal device. An Android-based mobile app will be developed to communicate with the microcontrollers to control the IPMCs. When the personal device receives an information signal, the original shape of the prototype will change to another shape related to the specific sender or type of information source. This interactive prototype can simultaneously realize multiple methods for conveying haptic information, such as dimension, force, and texture, thanks to the flexible array design. We conduct several user experience studies to explore how users respond to shape-change information.
Modeling and test of a kinaesthetic actuator based on MR fluid for haptic applications.
Yang, Tae-Heon; Koo, Jeong-Hoi; Kim, Sang-Youn; Kwon, Dong-Soo
2017-03-01
Haptic display units have been widely used for conveying button sensations to users, primarily employing vibrotactile actuators. However, the human sensation of pressing a button relies mainly on kinaesthetic cues (rather than vibrotactile ones), and few studies exist on small-scale kinaesthetic haptic units. Thus, the primary goals of this paper are to design a miniature kinaesthetic actuator based on magnetorheological (MR) fluid that can convey various button-clicking sensations and to experimentally evaluate its haptic performance. The design focus of the proposed actuator was to produce sufficiently large actuation (resistive) forces for human users within a given size constraint and to offer a wide range of actuation forces for conveying vivid haptic sensations to users. To this end, this study first performed a series of parametric studies using mathematical force models for multiple operating modes of MR fluid in conjunction with finite element electromagnetic analysis. After selecting design parameters based on these parametric studies, a prototype actuator was constructed, and its performance was evaluated using a dynamic mechanical analyzer. It measured the actuator's resistive force with a varying stroke (pressed depth) of up to 1 mm and a varying input current from 0 A to 200 mA. The results show that the proposed actuator creates a wide range of resistive forces, from around 2 N (off-state) to over 9.5 N at 200 mA. To assess the prototype's performance from the perspective of haptic applications, a maximum force rate was calculated to determine the just noticeable difference in force changes over the 1 mm stroke of the actuator. The results show that the force rate is sufficient to mimic various levels of button sensations, indicating that the proposed kinaesthetic actuator can offer a wide range of resistive force changes that can be conveyed to human operators.
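Parametric force models for MR-fluid devices of this kind are commonly written in a Bingham-plastic form, i.e., an off-state (viscous plus friction) force plus a field-dependent yield force. The sketch below uses that generic form with coefficients fitted only to the two endpoint forces reported in the abstract, so it is illustrative rather than the authors' model.

# Illustrative Bingham-style resistive-force model for an MR-fluid actuator.
# Coefficients are chosen only to match the two force values reported in the
# abstract (about 2 N at 0 A and about 9.5 N at 200 mA); not the paper's model.
F_OFF = 2.0      # N, off-state resistive force (viscous + friction)
K_YIELD = 37.5   # N/A, field-induced yield-force gain (hypothetical linear fit)

def resistive_force(current_a):
    """Resistive force of the actuator as a function of coil current [A]."""
    return F_OFF + K_YIELD * current_a

for i in (0.0, 0.1, 0.2):
    print(f"I = {i * 1000:.0f} mA -> F ~ {resistive_force(i):.1f} N")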
ERIC Educational Resources Information Center
Yeom, Soonja; Choi-Lundberg, Derek L.; Fluck, Andrew Edward; Sale, Arthur
2017-01-01
Purpose: This study aims to evaluate factors influencing undergraduate students' acceptance of a computer-aided learning resource using the Phantom Omni haptic stylus to enable rotation, touch and kinaesthetic feedback and display of names of three-dimensional (3D) human anatomical structures on a visual display. Design/methodology/approach: The…
Unpacking Students' Conceptualizations through Haptic Feedback
ERIC Educational Resources Information Center
Magana, A. J.; Balachandran, S.
2017-01-01
While it is clear that the use of computer simulations has a beneficial effect on learning when compared to instruction without computer simulations, there is still room for improvement to fully realize their benefits for learning. Haptic technologies can fulfill the educational potential of computer simulations by adding the sense of touch.…
Open Touch/Sound Maps: A system to convey street data through haptic and auditory feedback
NASA Astrophysics Data System (ADS)
Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios
2013-08-01
The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for visually impaired and blind people, owing to their inability to interpret graphical information. Thus, alternative ways of presenting a map have to be explored in order to improve the accessibility of maps. Multiple types of sensory perception, such as touch and hearing, may work as a substitute for vision in the exploration of maps. The use of multimodal virtual environments seems to be a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback using OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during the haptic exploration of the map.
Virtual Reality Robotic Operation Simulations Using MEMICA Haptic System
NASA Technical Reports Server (NTRS)
Bar-Cohen, Y.; Mavroidis, C.; Bouzit, M.; Dolgin, B.; Harm, D. L.; Kopchok, G. E.; White, R.
2000-01-01
There is an increasing realization that some tasks can be performed significantly better by humans than robots but, due to associated hazards, distance, etc., only a robot can be employed. Telemedicine is one area where remotely controlled robots can have a major impact by providing urgent care at remote sites. In recent years, remotely controlled robotics has been greatly advanced. The robotic astronaut, "Robonaut," at NASA Johnson Space Center is one such example. Unfortunately, due to the unavailability of force and tactile feedback capability, the operator must determine the required action using only visual feedback from the remote site, which limits the tasks that Robonaut can perform. There is a great need for dexterous, fast, accurate teleoperated robots that give the operator the ability to "feel" the environment at the robot's site. Recently, we conceived a haptic mechanism called MEMICA (Remote MEchanical MIrroring using Controlled stiffness and Actuators) that can enable the design of high-dexterity, rapid-response, and large-workspace systems. Our team is developing novel MEMICA gloves and virtual reality models to allow the simulation of telesurgery and other applications. The MEMICA gloves are designed to have high dexterity, rapid response, and a large workspace, and to intuitively mirror the conditions at a virtual site where a robot is simulating the presence of the human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of electro-rheological fluids (ERF). In this paper the design of the MEMICA system and initial experimental results are presented.
Virtual Cerebral Aneurysm Clipping with Real-Time Haptic Force Feedback in Neurosurgical Education.
Gmeiner, Matthias; Dirnberger, Johannes; Fenz, Wolfgang; Gollwitzer, Maria; Wurm, Gabriele; Trenkler, Johannes; Gruber, Andreas
2018-04-01
Realistic, safe, and efficient modalities for simulation-based training are highly warranted to enhance the quality of surgical education, and they should be incorporated in resident training. The aim of this study was to develop a patient-specific virtual cerebral aneurysm-clipping simulator with haptic force feedback and real-time deformation of the aneurysm and vessels. A prototype simulator was developed from 2012 to 2016. Evaluation of virtual clipping by blood flow simulation was integrated in this software, and the prototype was evaluated by 18 neurosurgeons. In 4 patients with different medial cerebral artery aneurysms, virtual clipping was performed after real-life surgery, and surgical results were compared regarding clip application, surgical trajectory, and blood flow. After head positioning and craniotomy, bimanual virtual aneurysm clipping with an original forceps was performed. Blood flow simulation demonstrated residual aneurysm filling or branch stenosis. The simulator improved anatomic understanding for 89% of neurosurgeons. Simulation of head positioning and craniotomy was considered realistic by 89% and 94% of users, respectively. Most participants agreed that this simulator should be integrated into neurosurgical education (94%). Our illustrative cases demonstrated that virtual aneurysm surgery was possible using the same trajectory as in real-life cases. Both virtual clipping and blood flow simulation were realistic in broad-based but not calcified aneurysms. Virtual clipping of a calcified aneurysm could be performed using the same surgical trajectory, but not the same clip type. We have successfully developed a virtual aneurysm-clipping simulator. Next, we will prospectively evaluate this device for surgical procedure planning and education. Copyright © 2018 Elsevier Inc. All rights reserved.
Closing the sensorimotor loop: haptic feedback facilitates decoding of motor imagery
NASA Astrophysics Data System (ADS)
Gomez-Rodriguez, M.; Peters, J.; Hill, J.; Schölkopf, B.; Gharabaghi, A.; Grosse-Wentrup, M.
2011-06-01
The combination of brain-computer interfaces (BCIs) with robot-assisted physical therapy constitutes a promising approach to neurorehabilitation of patients with severe hemiparetic syndromes caused by cerebrovascular brain damage (e.g. stroke) and other neurological conditions. In such a scenario, a key aspect is how to re-establish the disrupted sensorimotor feedback loop. However, to date it is an open question how artificially closing the sensorimotor feedback loop influences the decoding performance of a BCI. In this paper, we address this question by studying six healthy subjects and two stroke patients. We present empirical evidence that haptic feedback, provided by a seven-degrees-of-freedom robotic arm, facilitates online decoding of arm movement intention. The results support the feasibility of future rehabilitative treatments based on the combination of robot-assisted physical therapy with BCIs.
Sorgini, Francesca; Massari, Luca; D’Abbraccio, Jessica; Petrovic, Petar B.; Carrozza, Maria Chiara; Newell, Fiona N.
2018-01-01
We present a tactile telepresence system for real-time transmission of information about object stiffness to the human fingertips. Experimental tests were performed across two laboratories (Italy and Ireland). In the Italian laboratory, a mechatronic sensing platform indented different rubber samples. Information about rubber stiffness was converted into on-off events using a neuronal spiking model and sent to a vibrotactile glove in the Irish laboratory. Participants discriminated variations in the stiffness of stimuli according to a two-alternative forced choice protocol. Stiffness discrimination was based on the variation of the temporal pattern of spikes generated during the indentation of the rubber samples. The results suggest that vibrotactile stimulation can effectively simulate surface stiffness when neuronal spiking models are used to trigger vibrations in the haptic interface. Specifically, fractional variations of stiffness down to 0.67 were significantly discriminated with the developed neuromorphic haptic interface. This performance is comparable to, though slightly worse than, the threshold obtained in a benchmark experiment in which the same set of stimuli was evaluated naturally with one's own hand. Our paper presents a bioinspired method for delivering sensory feedback about object properties to human skin based on contingency-mimetic neuronal models, and can be useful for the design of high performance haptic devices. PMID:29342076
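The neuronal spiking model above converts the indentation force profile into on-off events that trigger the vibrotactile glove. A leaky integrate-and-fire encoder is one simple way to sketch such a conversion; the model form and parameters below are assumptions and may differ from the authors' implementation.

import numpy as np

# Sketch: encode an indentation-force signal into spike events with a leaky
# integrate-and-fire neuron; each spike would trigger a vibrotactile pulse.
TAU = 0.02       # s, membrane time constant (hypothetical)
THRESHOLD = 1.0  # firing threshold (arbitrary units)
GAIN = 50.0      # input gain from force [N] to membrane drive (hypothetical)
DT = 0.001       # s, sampling period

def force_to_spikes(force):
    """Return a boolean spike train for a sampled force signal."""
    v = 0.0
    spikes = np.zeros(len(force), dtype=bool)
    for k, f in enumerate(force):
        v += DT * (-v / TAU + GAIN * f)
        if v >= THRESHOLD:
            spikes[k] = True
            v = 0.0             # reset after each spike
    return spikes

# A stiffer sample reaches a higher force for the same indentation, so it yields
# a denser spike pattern, which is what the receiver discriminates.
t = np.arange(0, 1, DT)
soft = force_to_spikes(0.5 * np.sin(np.pi * t) ** 2)
stiff = force_to_spikes(1.5 * np.sin(np.pi * t) ** 2)
print(soft.sum(), stiff.sum())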
Absence of modulatory action on haptic height perception with musical pitch
Geronazzo, Michele; Avanzini, Federico; Grassi, Massimo
2015-01-01
Although acoustic frequency is not a spatial property of physical objects, in common language pitch, i.e., the psychological correlate of frequency, is often labeled spatially (i.e., "high in pitch" or "low in pitch"). Pitch height is known to modulate (and interact with) the response of participants when they are asked to judge spatial properties of non-auditory stimuli (e.g., visual) in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch height extends to the haptic estimation of the height of a virtual step. We implemented a hardware/software setup which is able to render virtual 3D objects (stair-steps) haptically through a PHANTOM device, and to provide real-time continuous auditory feedback depending on the user's interaction with the object. The haptic exploration was associated with a sinusoidal tone whose pitch varied as a function of the interaction point's height within (i) a narrower or (ii) a wider pitch range, or (iii) with a random pitch variation acting as a control audio condition. Explorations were also performed with no sound (haptic only). Participants were instructed to explore the virtual step freely and to communicate their height estimate by opening their thumb and index finger to mimic the step riser height, or verbally by reporting the height of the step riser in centimeters. We analyzed the role of musical expertise by dividing participants into non-musicians and musicians. Results showed no effect of musical pitch on the highly realistic haptic feedback. Overall there was no difference between the two groups in the proposed multimodal conditions. Additionally, we observed a different distribution of haptic responses between musicians and non-musicians when estimates in the auditory conditions were matched with estimates in the no-sound condition. PMID:26441745
Teodorescu, Kinneret; Bouchigny, Sylvain; Korman, Maria
2013-08-01
In this study, we explored the time course of haptic stiffness discrimination learning and how it was affected by two experimental factors, the addition of visual information and/or knowledge of results (KR) during training. Stiffness perception may integrate both haptic and visual modalities. However, in many tasks, the visual field is typically occluded, forcing stiffness perception to depend exclusively on haptic information. No studies to date have addressed the time course of haptic stiffness perceptual learning. Using a virtual environment (VE) haptic interface and a two-alternative forced-choice discrimination task, the haptic stiffness discrimination ability of 48 participants was tested across 2 days. Each day included two haptic test blocks separated by a training block. Additional visual information and/or KR were manipulated between participants during the training blocks. Practice repetitions alone induced significant improvement in haptic stiffness discrimination. Between days, accuracy improved slightly, but decision time performance deteriorated. The addition of visual information and/or KR had only temporary effects on decision time, without affecting the time course of haptic discrimination learning. Learning in haptic stiffness discrimination appears to evolve through at least two distinct phases: a single training session resulted in both immediate and latent learning. This learning was not affected by the training manipulations inspected. Training skills in a VE in spaced sessions can be beneficial for tasks in which haptic perception is critical, such as surgical procedures performed when the visual field is occluded. However, training protocols for such tasks should account for the low impact of multisensory information and KR.
Okrainec, A; Farcas, M; Henao, O; Choy, I; Green, J; Fotoohi, M; Leslie, R; Wight, D; Karam, P; Gonzalez, N; Apkarian, J
2009-01-01
The Veress needle is the most commonly used technique for creating the pneumoperitoneum at the start of a laparoscopic surgical procedure. Inserting the Veress needle correctly is crucial since errors can cause significant harm to patients. Unfortunately, this technique can be difficult to teach since surgeons rely heavily on tactile feedback while advancing the needle through the various layers of the abdominal wall. This critical step in laparoscopy, therefore, can be challenging for novice trainees to learn without adequate opportunities to practice in a safe environment with no risk of injury to patients. To address this issue, we have successfully developed a prototype of a virtual reality haptic needle insertion simulator using the tactile feedback of 22 surgeons to set realistic haptic parameters. A survey of these surgeons concluded that our device appeared and felt realistic, and could potentially be a useful tool for teaching the proper technique of Veress needle insertion.
fMRI-Compatible Electromagnetic Haptic Interface.
Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S
2005-01-01
A new haptic interface device is suggested, which can be used for functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque without affecting image quality through the currents applied in the device. The haptic device was tested in a 3 T MR scanner. With a current of up to 1 A and a distance of 1 m to the focal point of the MR scanner, it was possible to generate torques of up to 4 N m. Within these boundaries image quality was not affected.
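The torque of such a Lorentz-type actuator scales with the coil current, the number of turns, the wire length inside the field, the field strength, and the lever arm. In the back-of-the-envelope sketch below, only the 3 T field and the roughly 4 N m at 1 A figure come from the abstract; the coil geometry is a hypothetical example chosen to reproduce that order of magnitude.

# Back-of-the-envelope Lorentz-force torque of a coil in the scanner's static field.
B = 3.0        # T, static field of the scanner
N_TURNS = 200  # number of turns (hypothetical)
L_WIRE = 0.10  # m, effective wire length per turn inside the field (hypothetical)
R_ARM = 0.07   # m, lever arm from the pivot (hypothetical)

def lorentz_torque(current_a):
    """Torque [N m] ~ N * I * L * B * r for wire segments orthogonal to B."""
    force = N_TURNS * current_a * L_WIRE * B   # F = N * I * L x B
    return force * R_ARM

print(f"torque at 1 A ~ {lorentz_torque(1.0):.1f} N m")   # about 4.2 N m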
Menon, Samir; Zhu, Jack; Goyal, Deeksha; Khatib, Oussama
2017-07-01
Haptic interfaces compatible with functional magnetic resonance imaging (Haptic fMRI) promise to enable rich motor neuroscience experiments that study how humans perform complex manipulation tasks. Here, we present a large-scale study (176 scan runs, 33 scan sessions) that characterizes the reliability and performance of one such electromagnetically actuated device, Haptic fMRI Interface 3 (HFI-3). We outline engineering advances that ensured HFI-3 did not interfere with fMRI measurements. Observed fMRI temporal noise levels with HFI-3 operating were at the fMRI baseline (0.8% noise-to-signal). We also present results from HFI-3 experiments demonstrating that high-resolution fMRI can be used to study spatio-temporal patterns of fMRI blood oxygenation level dependent (BOLD) activation. These experiments include motor planning, goal-directed reaching, and visually guided force control. The observed fMRI responses are consistent with the existing literature, which supports Haptic fMRI's effectiveness at studying the brain's motor regions.
Upper limb assessment using a Virtual Peg Insertion Test.
Fluet, Marie-Christine; Lambercy, Olivier; Gassert, Roger
2011-01-01
This paper presents the initial evaluation of a Virtual Peg Insertion Test developed to assess sensorimotor functions of arm and hand using an instrumented tool, virtual reality and haptic feedback. Nine performance parameters derived from kinematic and kinetic data were selected and compared between two groups of healthy subjects performing the task with the dominant and non-dominant hand, as well as with a group of chronic stroke subjects suffering from different levels of upper limb impairment. Results showed significantly smaller grasping forces applied by the stroke subjects compared to the healthy subjects. The grasping force profiles suggest a poor coordination between position and grasping for the stroke subjects, and the collision forces with the virtual board were found to be indicative of sensory deficits. These preliminary results suggest that the analyzed parameters could be valid indicators of impairment. © 2011 IEEE
Robotic guidance benefits the learning of dynamic, but not of spatial movement characteristics.
Lüttgen, Jenna; Heuer, Herbert
2012-10-01
Robotic guidance is an engineered form of haptic-guidance training and intended to enhance motor learning in rehabilitation, surgery, and sports. However, its benefits (and pitfalls) are still debated. Here, we investigate the effects of different presentation modes on the reproduction of a spatiotemporal movement pattern. In three different groups of participants, the movement was demonstrated in three different modalities, namely visual, haptic, and visuo-haptic. After demonstration, participants had to reproduce the movement in two alternating recall conditions: haptic and visuo-haptic. Performance of the three groups during recall was compared with regard to spatial and dynamic movement characteristics. After haptic presentation, participants showed superior dynamic accuracy, whereas after visual presentation, participants performed better with regard to spatial accuracy. Added visual feedback during recall always led to enhanced performance, independent of the movement characteristic and the presentation modality. These findings substantiate the different benefits of different presentation modes for different movement characteristics. In particular, robotic guidance is beneficial for the learning of dynamic, but not of spatial movement characteristics.
Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.
Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid
2015-12-01
Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.
NASA Astrophysics Data System (ADS)
Jones, M. Gail; Minogue, James; Oppewal, Tom; Cook, Michelle P.; Broadwell, Bethany
2006-12-01
Science instruction is typically highly dependent on visual representations of scientific concepts that are communicated through textbooks, teacher presentations, and computer-based multimedia materials. Little is known about how students with visual impairments access and interpret these types of visually-dependent instructional materials. This study explored the efficacy of new haptic (simulated tactile feedback and kinesthetics) instructional technology for teaching cell morphology and function to middle and high school students with visual impairments. The study examined students' prior experiences learning about the cell and cell functions in classroom instruction, as well as how haptic feedback technology impacted students' awareness of the 3-D nature of an animal cell, the morphology and function of cell organelles, and students' interest in the haptic technology as an instructional tool. Twenty-one students with visual impairment participated in the study. Students explored a tactile model of the cell with a haptic point probe that allowed them to feel the cell and its organelles. Results showed that students made significant gains in their ability to identify cell organelles and found the technology to be highly interesting as an instructional tool. The need for additional adaptive technology for students with visual impairments is discussed.
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated into virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at 30 fps or more. Our current laparoscopic simulator detects collisions between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and to speed up the organ deformation response. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organ) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
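Extending the collision test from a single tool-tip point to several sample points along the rigid rod, as proposed above, amounts to testing each sample point against the organ's triangles. The sketch below uses sphere-proxy points and an exhaustive point-to-triangle distance check; a real-time simulator would add a bounding-volume hierarchy or spatial hashing for speed, and the function names are illustrative only.

import numpy as np

def closest_point_on_triangle(p, a, b, c):
    """Closest point to p on triangle (a, b, c); inputs are numpy 3-vectors."""
    ab, ac, ap = b - a, c - a, p - a
    d1, d2 = ab @ ap, ac @ ap
    if d1 <= 0 and d2 <= 0:
        return a                                   # vertex region a
    bp = p - b
    d3, d4 = ab @ bp, ac @ bp
    if d3 >= 0 and d4 <= d3:
        return b                                   # vertex region b
    vc = d1 * d4 - d3 * d2
    if vc <= 0 <= d1 and d3 <= 0:
        return a + (d1 / (d1 - d3)) * ab           # edge region ab
    cp = p - c
    d5, d6 = ab @ cp, ac @ cp
    if d6 >= 0 and d5 <= d6:
        return c                                   # vertex region c
    vb = d5 * d2 - d1 * d6
    if vb <= 0 <= d2 and d6 <= 0:
        return a + (d2 / (d2 - d6)) * ac           # edge region ac
    va = d3 * d6 - d5 * d4
    if va <= 0 and d4 - d3 >= 0 and d5 - d6 >= 0:
        return b + ((d4 - d3) / ((d4 - d3) + (d5 - d6))) * (c - b)  # edge bc
    denom = 1.0 / (va + vb + vc)
    return a + ab * (vb * denom) + ac * (vc * denom)  # interior of the triangle

def rod_mesh_contacts(rod_points, radius, vertices, triangles):
    """Return (point index, triangle index, penetration depth) for every rod
    sample point whose proxy sphere of given radius touches a mesh triangle."""
    contacts = []
    for i, p in enumerate(rod_points):
        for j, tri in enumerate(triangles):
            a, b, c = vertices[tri[0]], vertices[tri[1]], vertices[tri[2]]
            q = closest_point_on_triangle(p, a, b, c)
            depth = radius - np.linalg.norm(p - q)
            if depth > 0:
                contacts.append((i, j, depth))
    return contacts

The returned contact list (point, triangle, depth) is the kind of input a penalty-based collision response or haptic force computation would consume.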
Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device.
Katzschmann, Robert K; Araki, Brandon; Rus, Daniela
2018-03-01
This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user's waist, and the pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user's upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device's capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases.
Chiang, Vico Chung-Lim; Lo, King-Hung; Choi, Kup-Sze
2017-10-01
To investigate the feasibility of using a virtual rehabilitation system with an intuitive user interface and force feedback to improve skills in activities of daily living (ADL). A virtual training system equipped with haptic devices was developed for the rehabilitation of three ADL tasks: door unlocking, water pouring and meat cutting. Twenty subjects with upper limb disabilities, supervised by two occupational therapists, received four training sessions using the system. The task completion time and the amount of water poured into a virtual glass were recorded. The performance of the three tasks in reality was assessed before and after the virtual training. Feedback from the participants was collected with questionnaires after the study. The completion time of the virtual tasks decreased during the training (p < 0.01), while the percentage of water successfully poured increased (p = 0.051). The score on the Borg scale of perceived exertion was 1.05 (SD = 1.85; 95% CI = 0.18-1.92) and that of the task-specific feedback questionnaire was 31 (SD = 4.85; 95% CI = 28.66-33.34). The feedback of the therapists suggested a positive rehabilitation effect. The participants had a positive perception of the system. The system can potentially be used as a tool to complement conventional rehabilitation approaches for ADL. Implications for rehabilitation: Rehabilitation of activities of daily living can be facilitated using computer-assisted approaches. Existing approaches focus on cognitive training rather than manual skills. A virtual training system with an intuitive user interface and force feedback was designed to improve the learning of manual skills. The study shows that the system could be used as a training tool to complement conventional rehabilitation approaches.
Human's Capability to Discriminate Spatial Forces at the Big Toe.
Hagengruber, Annette; Höppner, Hannes; Vogel, Jörn
2018-01-01
A key factor for reliable object manipulation is the tactile information provided by the skin of our hands. As this sensory information is so essential in our daily life, it should also be provided during teleoperation of robotic devices or in the control of myoelectric prostheses. It is well known that feeding back the tactile information to the user can lead to a more natural and intuitive control of robotic devices. However, in some applications it is difficult to use the hands as natural feedback channels since they may already be overloaded with other tasks or, e.g., in the case of hand prostheses, not accessible at all. Many alternatives for tactile feedback to the human hand have already been investigated. In particular, one approach shows that humans can integrate uni-directional (normal) force feedback at the toe into their sensorimotor control loop. Extending this work, we investigate the human's capability to discriminate spatial forces at the bare front side of the toe. A state-of-the-art haptic feedback device was used to apply forces with three different amplitudes (2 N, 5 N, and 8 N) to subjects' right big toes. During the experiments, different force stimuli were presented, i.e., the direction of the applied force was changed such that tangential components occurred. In total, the four directions up (distal), down (proximal), left (medial), and right (lateral) were tested. The proportion of the tangential force was varied corresponding to a directional change of 5° to 25° with respect to the normal force. Given these force stimuli, the subjects' task was to identify the direction of the force change. We found the amplitude of the force as well as the proportion of tangential forces to have a significant influence on the success rate. Furthermore, the direction right showed a significantly different success rate from all other directions. The stimuli with a force amplitude of 8 N achieved success rates over 89% in all directions. The results of the user study provide evidence that the subjects were able to discriminate spatial forces at their toe within the defined force amplitudes and tangential proportions.
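The geometry behind such stimuli (a force of fixed amplitude tilted 5° to 25° away from the surface normal toward one of four directions) can be sketched as below; the coordinate convention and function name are illustrative assumptions, not the authors' implementation.

```python
import math

def force_components(amplitude_n: float, tilt_deg: float, direction: str):
    """Split a force of given amplitude into normal and tangential parts for a
    stimulus tilted by `tilt_deg` away from the surface normal, in one of the
    four tested directions (illustrative geometry, not the authors' setup)."""
    tilt = math.radians(tilt_deg)
    normal = amplitude_n * math.cos(tilt)
    tangential = amplitude_n * math.sin(tilt)
    sign = {"up": (0, +1), "down": (0, -1), "left": (-1, 0), "right": (+1, 0)}[direction]
    return normal, (sign[0] * tangential, sign[1] * tangential)

for amp in (2.0, 5.0, 8.0):
    n, t = force_components(amp, 25.0, "right")
    print(f"{amp:.0f} N at 25 deg right -> normal {n:.2f} N, tangential {t}")
```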
Improved haptic interface for colonoscopy simulation.
Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young
2007-01-01
This paper presents an improved haptic interface for the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing enough workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computation to render accurate graphic images corresponding to the angle-knob rotation. Tact switches are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction and the corresponding deformation of the colon.
Zhang, Linshuai; Guo, Shuxiang; Yu, Huadong; Song, Yu; Tamiya, Takashi; Hirata, Hideyuki; Ishihara, Hidenori
2018-02-23
A robot-assisted catheter system can increase the operating distance, thus preventing the surgeon's exposure to X-ray radiation during endovascular catheterization. However, few designs have considered collision protection between the catheter tip and the vessel wall. This paper presents a novel catheter operating system based on tissue protection to prevent vessel puncture caused by collision. The integrated haptic interface not only allows the operator to feel real force feedback, but also combines with the newly proposed collision protection mechanism (CPM) to mitigate collision trauma. The CPM can release the catheter quickly when the measured force exceeds a certain threshold, so as to avoid vessel puncture. A significant advantage is that the proposed mechanism can adjust the protection threshold in real time through the current, according to the actual characteristics of the blood vessel. To verify the effectiveness of the tissue protection provided by the system, in vitro evaluation experiments were carried out. The results show that further collision damage can be effectively prevented by the CPM, which implies relatively safe catheterization. This research provides some insights into the functional improvements of safe and reliable robot-assisted catheter systems.
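A minimal sketch of the threshold logic behind such a collision protection mechanism is given below: release the catheter when the measured tip force exceeds a threshold set in real time from an actuation current. The linear current-to-threshold map, the gain, and the function names are assumptions for illustration only.

```python
# Hypothetical CPM threshold logic: constants and the linear map are assumptions.

def protection_threshold(current_a: float, gain_n_per_a: float = 2.5) -> float:
    """Map the CPM actuation current to a force threshold (hypothetical model)."""
    return gain_n_per_a * current_a

def should_release(measured_force_n: float, current_a: float) -> bool:
    """Release the catheter when the measured collision force exceeds the threshold."""
    return measured_force_n > protection_threshold(current_a)

# Example: with 0.4 A the threshold is 1.0 N; a 1.2 N collision triggers release.
print(should_release(1.2, 0.4))   # True
print(should_release(0.8, 0.4))   # False
```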
Let the Force Be with Us: Dyads Exploit Haptic Coupling for Coordination
ERIC Educational Resources Information Center
van der Wel, Robrecht P. R. D.; Knoblich, Guenther; Sebanz, Natalie
2011-01-01
People often perform actions that involve a direct physical coupling with another person, such as when moving furniture together. Here, we examined how people successfully coordinate such actions with others. We tested the hypothesis that dyads amplify their forces to create haptic information to coordinate. Participants moved a pole (resembling a…
Human-arm-and-hand-dynamic model with variability analyses for a stylus-based haptic interface.
Fu, Michael J; Cavuşoğlu, M Cenk
2012-12-01
Haptic interface research benefits from accurate human arm models for control and system design. The literature contains many human arm dynamic models but lacks detailed variability analyses. Without accurate measurements, variability is modeled in a very conservative manner, leading to less than optimal controller and system designs. This paper not only presents models for human arm dynamics but also develops inter- and intrasubject variability models for a stylus-based haptic device. Data from 15 human subjects (nine male, six female, ages 20-32) were collected using a Phantom Premium 1.5a haptic device for system identification. In this paper, grip-force-dependent models were identified for 1-3-N grip forces in the three spatial axes. Also, variability due to human subjects and grip-force variation were modeled as both structured and unstructured uncertainties. For both forms of variability, the maximum variation, 95 %, and 67 % confidence interval limits were examined. All models were in the frequency domain with force as input and position as output. The identified models enable precise controllers targeted to a subset of possible human operator dynamics.
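As a toy illustration of a force-input, position-output frequency-domain arm model whose parameters depend on grip force, the sketch below uses a second-order admittance with stiffness and damping that grow with grip force; the structure, parameter values, and trends are assumptions, not the models identified in the paper.

```python
import numpy as np

def arm_admittance(freq_hz, grip_force_n):
    """Toy force-to-position frequency response X(jw)/F(jw) = 1/(m s^2 + b s + k),
    with stiffness and damping that grow with grip force (illustrative trend only)."""
    m = 1.2                                  # kg, assumed effective mass
    b = 5.0 + 3.0 * grip_force_n             # N*s/m, assumed damping trend
    k = 150.0 + 80.0 * grip_force_n          # N/m, assumed stiffness trend
    s = 1j * 2 * np.pi * np.asarray(freq_hz)
    return 1.0 / (m * s**2 + b * s + k)

freqs = np.logspace(-1, 1.5, 5)              # 0.1 Hz to ~30 Hz
for grip in (1.0, 2.0, 3.0):
    mag = 20 * np.log10(np.abs(arm_admittance(freqs, grip)))
    print(f"grip {grip:.0f} N: |X/F| (dB) at {freqs.round(2).tolist()} Hz -> {mag.round(1).tolist()}")
```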
Zhao, Yan; Guo, Shuxiang; Xiao, Nan; Wang, Yuxin; Li, Youxiang; Jiang, Yuhua
2018-04-02
Vascular interventional surgery has advantages over traditional surgery. Master-slave robotic technology can further improve the accuracy, efficiency, and safety of this complicated and high-risk surgery. However, on-line acquisition of the operating force information of the catheter and guidewire remains a significant obstacle on the path to enhancing robotic surgery safety. Thus, a novel slave manipulator is proposed in this paper to realize on-line sensing of the guidewire's torsional operating torque and axial operating force during robot-assisted operations. A strain sensor is specially designed to detect the small-scale torsional operating torque at low rotational frequency. Additionally, the axial operating force is detected via a load cell, which is incorporated into a sliding mechanism to eliminate the influence of friction. For validation, calibration and performance evaluation experiments are conducted. The results indicate that the proposed operating torque and force detection device is effective. Thus, it can provide the foundation for accurate haptic feedback to the surgeon to improve surgical safety.
[Experimental study of angiography using vascular interventional robot-2(VIR-2)].
Tian, Zeng-min; Lu, Wang-sheng; Liu, Da; Wang, Da-ming; Guo, Shu-xiang; Xu, Wu-yi; Jia, Bo; Zhao, De-peng; Liu, Bo; Gao, Bao-feng
2012-06-01
To verify the feasibility and safety of a new vascular interventional robot system used in vascular interventional procedures. The vascular interventional robot type-2 (VIR-2) includes master-slave body propulsion, image navigation, and force feedback systems; catheter movement can be achieved under automatic control and navigation, and force feedback is integrated in real time. An in vitro pre-test in a vascular model and cerebral angiography in a dog followed. The surgeon controlled the vascular interventional robot remotely, the catheter was inserted into the intended target, and the catheter positioning error and the operation time were evaluated. The in vitro pre-test and animal experiment went well; the catheter could enter any vascular branch. The catheter positioning error was less than 1 mm. The angiography operation in the animal was carried out smoothly without complication; the success rate of the operation was 100%, the entire experiment took 26 and 30 minutes, efficiency was slightly improved compared with the VIR-1, and the time that staff were exposed to the DSA machine was 0 minutes. The resistance measured by the force sensor can be displayed to the operator to provide a safety guarantee for the operation. There were no surgical complications. VIR-2 is safe and feasible, and can achieve remote catheter operation and angiography; the master-slave system matches the characteristics of the traditional procedure. The three-dimensional image can guide the operation more smoothly, and the force feedback device provides remote real-time haptic information to ensure the safety of the operation.
NASA Technical Reports Server (NTRS)
DiZio, P.; Lackner, J. R.
2000-01-01
Reaching movements made to visual targets in a rotating room are initially deviated in path and endpoint in the direction of transient Coriolis forces generated by the motion of the arm relative to the rotating environment. With additional reaches, movements become progressively straighter and more accurate. Such adaptation can occur even in the absence of visual feedback about movement progression or terminus. Here we examined whether congenitally blind and sighted subjects without visual feedback would demonstrate adaptation to Coriolis forces when they pointed to a haptically specified target location. Subjects were tested pre-, per-, and postrotation at 10 rpm counterclockwise. Reaching to straight ahead targets prerotation, both groups exhibited slightly curved paths. Per-rotation, both groups showed large initial deviations of movement path and curvature but within 12 reaches on average had returned to prerotation curvature levels and endpoints. Postrotation, both groups showed mirror image patterns of curvature and endpoint to the per-rotation pattern. The groups did not differ significantly on any of the performance measures. These results provide compelling evidence that motor adaptation to Coriolis perturbations can be achieved on the basis of proprioceptive, somatosensory, and motor information in the complete absence of visual experience.
Su, Hao; Shang, Weijian; Li, Gang; Patel, Niravkumar; Fischer, Gregory S
2017-08-01
This paper presents a surgical master-slave teleoperation system for percutaneous interventional procedures under continuous magnetic resonance imaging (MRI) guidance. The slave robot consists of a piezoelectrically actuated 6-degree-of-freedom (DOF) robot for needle placement with an integrated fiber optic force sensor (1-DOF axial force measurement) using the Fabry-Perot interferometry (FPI) sensing principle; it is configured to operate inside the bore of the MRI scanner during imaging. By leveraging the advantages of pneumatic and piezoelectric actuation in force and position control respectively, we have designed a pneumatically actuated master robot (haptic device) with strain gauge based force sensing that is configured to operate the slave from within the scanner room during imaging. The slave robot follows the insertion motion of the haptic device while the haptic device displays the needle insertion force as measured by the FPI sensor. Image interference evaluation demonstrates that the telesurgery system presents a signal to noise ratio reduction of less than 17% and less than 1% geometric distortion during simultaneous robot motion and imaging. Teleoperated needle insertion and rotation experiments were performed to reach 10 targets in a soft tissue-mimicking phantom with 0.70 ± 0.35 mm Cartesian space error.
Seung, Sungmin; Choi, Hongseok; Jang, Jongseong; Kim, Young Soo; Park, Jong-Oh; Park, Sukho; Ko, Seong Young
2017-01-01
This article presents haptic-guided teleoperation for a tumor removal surgical robotic system, the so-called SIROMAN system. The system was developed in our previous work to make it possible to access tumor tissue, even tissue seated deep inside the brain, and to remove it with full maneuverability. For a safe and accurate operation that removes only tumor tissue completely while minimizing damage to normal tissue, a virtual wall-based haptic guidance combined with medical image-guided control is proposed and developed. The virtual wall is extracted from preoperative medical images, and the robot is controlled to restrict its motion within the virtual wall using haptic feedback. Coordinate transformation between sub-systems, a collision detection algorithm, and haptic-guided teleoperation using a virtual wall are described in the context of using SIROMAN. A series of experiments using a simplified virtual wall is performed to evaluate the performance of virtual wall-based haptic-guided teleoperation. With haptic guidance, the accuracy of the robotic manipulator's trajectory is improved by 57% compared to teleoperation without it. The tissue removal performance is also improved by 21% (p < 0.05). The experiments show that virtual wall-based haptic guidance provides safer and more accurate tissue removal for single-port brain surgery.
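A minimal sketch of a virtual-wall restoring force is shown below: when the tool tip crosses a planar boundary, a force proportional to the penetration depth (plus damping) pushes it back. The plane, stiffness, and damping values are illustrative stand-ins, not the SIROMAN parameters.

```python
import numpy as np

# Virtual-wall haptic guidance sketch; all constants are assumptions.
WALL_POINT = np.array([0.0, 0.0, 0.0])   # a point on the virtual wall
WALL_NORMAL = np.array([0.0, 0.0, 1.0])  # unit normal pointing into the allowed region
K_WALL = 800.0                           # N/m, assumed wall stiffness
B_WALL = 5.0                             # N*s/m, assumed wall damping

def wall_force(tip_pos, tip_vel):
    """Restoring force on the tool tip when it penetrates the virtual wall."""
    depth = np.dot(WALL_POINT - tip_pos, WALL_NORMAL)   # > 0 means penetration
    if depth <= 0.0:
        return np.zeros(3)
    normal_vel = np.dot(tip_vel, WALL_NORMAL)
    return (K_WALL * depth - B_WALL * normal_vel) * WALL_NORMAL

# Example: tip 3 mm past the wall, still moving inward at 10 mm/s.
print(wall_force(np.array([0.0, 0.0, -0.003]), np.array([0.0, 0.0, -0.01])))
```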
New Exoskeleton Arm Concept Design And Actuation For Haptic Interaction With Virtual Objects
NASA Astrophysics Data System (ADS)
Chakarov, D.; Veneva, I.; Tsveov, M.; Tiankov, T.
2014-12-01
In the work presented in this paper, the conceptual design and actuation of a new upper-limb exoskeleton is presented. The device is designed for applications where both motion tracking and force feedback are required, such as human interaction with a virtual environment or rehabilitation tasks. A mechanical structure kinematically equivalent to the structure of the human arm is chosen. An actuation system is selected based on braided pneumatic muscle actuators. An antagonistic drive system for each joint is shown, using pulley and cable transmissions. Force/displacement diagrams are presented for two antagonistically acting muscles. Kinematic and dynamic estimations are performed for the combined exoskeleton and upper-limb system. The selected parameters ensure joint torque regulation in the antagonistic scheme and accommodate the human arm's range of motion.
Soft tissue modelling through autowaves for surgery simulation.
Zhong, Yongmin; Shirinzadeh, Bijan; Alici, Gursel; Smith, Julian
2006-09-01
Modelling of soft tissue deformation is of great importance to virtual reality-based surgery simulation. This paper presents a new methodology for simulation of soft tissue deformation by drawing an analogy between autowaves and soft tissue deformation. The potential energy stored in a soft tissue as a result of a deformation caused by an external force is propagated among the mass points of the soft tissue by non-linear autowaves. The novelty of the methodology is that (i) autowave techniques are established to describe the potential energy distribution of a deformation for extrapolating internal forces, and (ii) material non-linearity is modelled with non-linear autowaves rather than through geometric non-linearity alone. Integration with a haptic device has been achieved to simulate soft tissue deformation with force feedback. The proposed methodology not only deals with large-range deformations, but also accommodates isotropic, anisotropic and inhomogeneous materials by simply changing the diffusion coefficients.
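Only as a generic illustration of the propagation idea, and not the authors' formulation, a diffusion-plus-nonlinear-reaction update on a grid of mass points might look like the sketch below; the reaction term, coefficients, grid size, and seeding are all assumptions.

```python
import numpy as np

# Toy autowave-style propagation of locally injected "deformation energy".
def autowave_step(u, d_coeff=0.2, dt=0.1):
    """One explicit step of du/dt = D * laplacian(u) + u(1-u)(u-0.1) on a 2D grid."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    reaction = u * (1.0 - u) * (u - 0.1)
    return u + dt * (d_coeff * lap + reaction)

u = np.zeros((32, 32))
u[16, 16] = 1.0                     # energy injected where the external force acts
for _ in range(100):
    u = autowave_step(u)
print(f"active nodes after propagation: {(u > 0.5).sum()}")
```

Anisotropic or inhomogeneous behaviour could, in the same spirit, be sketched by making `d_coeff` direction- or position-dependent.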
AR Feels "Softer" than VR: Haptic Perception of Stiffness in Augmented versus Virtual Reality.
Gaffary, Yoren; Le Gouis, Benoit; Marchal, Maud; Argelaguet, Ferran; Arnaldi, Bruno; Lecuyer, Anatole
2017-11-01
Does it feel the same when you touch an object in Augmented Reality (AR) or in Virtual Reality (VR)? In this paper we study and compare the haptic perception of the stiffness of a virtual object in two situations: (1) a purely virtual environment versus (2) a real and augmented environment. We have designed an experimental setup based on a Microsoft HoloLens and a haptic force-feedback device, enabling the user to press a virtual piston and compare its stiffness successively in either Augmented Reality (the virtual piston is surrounded by several real objects, all located inside a cardboard box) or Virtual Reality (the same virtual piston is displayed in a fully virtual scene composed of the same other objects). We conducted a psychophysical experiment with 12 participants. Our results show a surprising bias in perception between the two conditions. The virtual piston is on average perceived as stiffer in the VR condition than in the AR condition. For instance, when the piston had the same stiffness in AR and VR, participants selected the VR piston as the stiffer one in 60% of cases. This suggests a psychological effect, as if objects in AR feel "softer" than in pure VR. Taken together, our results open new perspectives on perception in AR versus VR and pave the way for future studies aiming at characterizing potential perceptual biases.
Kim, K; Lee, S
2015-05-01
Diagnosis of skin conditions depends on the assessment of skin surface properties, which are conveyed more by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examination by dynamic touch. Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. Conversion from single 2D images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties reported in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces by using a haptic device (Falcon) only. No visual cue was provided during the experiment. The results indicate that our system renders discernible tactile differences between skin surfaces. Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for the purpose of skin diagnosis, simulation, or training. Our system can also be used for other applications such as virtual reality and cosmetic applications. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
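The two-stage idea described above can be sketched roughly as follows: (1) turn a single grayscale skin image into a height map (here simply by scaling intensity, a placeholder for the perception-based mapping), and (2) render a contact force from the probe's penetration into that surface. All gains, sizes, and function names are illustrative assumptions.

```python
import numpy as np

def image_to_height_map(gray_image, max_height_mm=1.0):
    """Placeholder mapping from pixel intensity to surface height (mm)."""
    gray = np.asarray(gray_image, dtype=float)
    return (gray - gray.min()) / (gray.max() - gray.min() + 1e-9) * max_height_mm

def contact_force(height_map_mm, probe_xy, probe_depth_mm, k_n_per_mm=0.8):
    """Spring-like normal force if the probe tip is below the local surface height."""
    h = height_map_mm[probe_xy]
    penetration = h - probe_depth_mm
    return k_n_per_mm * penetration if penetration > 0 else 0.0

skin = np.random.default_rng(0).integers(0, 256, size=(64, 64))  # stand-in skin image
heights = image_to_height_map(skin)
print(f"force at (10, 20): {contact_force(heights, (10, 20), 0.2):.3f} N")
```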
Learning to perceive haptic distance-to-break in the presence of friction.
Altenhoff, Bliss M; Pagano, Christopher C; Kil, Irfan; Burg, Timothy C
2017-02-01
Two experiments employed attunement and calibration training to investigate whether observers are able to identify material break points in compliant materials through haptic force application. The task required participants to attune to a recently identified haptic invariant, distance-to-break (DTB), rather than haptic stimulation not related to the invariant, including friction. In the first experiment participants probed simulated force-displacement relationships (materials) under 3 levels of friction with the aim of pushing as far as possible into the materials without breaking them. In a second experiment a different set of participants pulled on the materials. Results revealed that participants are sensitive to DTB for both pushing and pulling, even in the presence of varying levels of friction, and this sensitivity can be improved through training. The results suggest that the simultaneous presence of friction may assist participants in perceiving DTB. Potential applications include the development of haptic training programs for minimally invasive (laparoscopic) surgery to reduce accidental tissue damage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Bernard, F.; Casset, F.; Danel, J. S.; Chappaz, C.; Basrour, S.
2016-08-01
This paper presents, for the first time, the characterization of a smartphone-size haptic rendering system based on the friction modulation effect. According to previous work and finite element modeling, homogeneous flexural modes are needed to obtain the haptic feedback effect. The device studied consists of thin-film AlN transducers deposited on a 110 × 65 mm² glass substrate. The transducers' localization on the glass plate leaves a transparent central area of 90 × 49 mm². Electrical and mechanical parameters of the system are extracted from measurement. From this extraction, electrical impedance matching reduced the applied voltage to 17.5 V AC and the power consumption to 1.53 W at the resonance frequency of the vibrating system, meeting the haptic rendering specification. Transient characterization of the actuation highlights a delay below the dynamic tactile detection threshold. The characterization of the AlN transducers used as sensors, including noise rejection, delay, and output charge amplitude, allows highly accurate detection of any variation due to external influences. These specifications are the first step toward a low-power-consumption closed-loop feedback system.
Multi-fingered haptic palpation utilizing granular jamming stiffness feedback actuators
NASA Astrophysics Data System (ADS)
Li, Min; Ranzani, Tommaso; Sareh, Sina; Seneviratne, Lakmal D.; Dasgupta, Prokar; Wurdemann, Helge A.; Althoefer, Kaspar
2014-09-01
This paper describes a multi-fingered haptic palpation method using stiffness feedback actuators for simulating tissue palpation procedures in traditional and in robot-assisted minimally invasive surgery. Soft tissue stiffness is simulated by changing the stiffness property of the actuator during palpation. For the first time, granular jamming and pneumatic air actuation are combined to realize stiffness modulation. The stiffness feedback actuator is validated by stiffness measurements in indentation tests and through stiffness discrimination in a user study. According to the indentation test results, the introduction of a pneumatic chamber to granular jamming can amplify the stiffness variation range and reduce hysteresis of the actuator. The advantage of multi-fingered palpation using the proposed actuators is demonstrated by comparing the stiffness discrimination performance of two-fingered (sensitivity: 82.2%, specificity: 88.9%, positive predictive value: 80.0%, accuracy: 85.4%, time: 4.84 s) and single-fingered (sensitivity: 76.4%, specificity: 85.7%, positive predictive value: 75.3%, accuracy: 81.8%, time: 7.48 s) stiffness feedback.
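The reported sensitivity, specificity, positive predictive value, and accuracy all follow from a confusion matrix; the sketch below shows the standard computation, using hypothetical counts rather than the study's raw data.

```python
def discrimination_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, positive predictive value, and accuracy
    from confusion-matrix counts of a stiffness discrimination task."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, ppv, accuracy

# Hypothetical counts for a two-fingered palpation block (not the study's data).
print([round(m, 3) for m in discrimination_metrics(tp=37, fn=8, tn=40, fp=5)])
```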
Shared virtual environments for telerehabilitation.
Popescu, George V; Burdea, Grigore; Boian, Rares
2002-01-01
Current VR telerehabilitation systems use offline remote monitoring from the clinic and patient-therapist videoconferencing. Such "store and forward" and video-based systems cannot implement medical services involving direct patient-therapist interaction. Real-time telerehabilitation applications (including remote therapy) can be developed using a shared Virtual Environment (VE) architecture. We developed a two-user shared VE for hand telerehabilitation. Each site has a telerehabilitation workstation with a video camera and a Rutgers Master II (RMII) force feedback glove. Each user can control a virtual hand and interact haptically with virtual objects. Simulated physical interactions between therapist and patient are implemented using hand force feedback. The therapist's graphic interface contains several virtual panels, which allow control over the rehabilitation process. These controls start a videoconferencing session, collect patient data, or apply therapy. Several experimental telerehabilitation scenarios were successfully tested on a LAN. A Web-based approach to "real-time" patient telemonitoring, the monitoring portal for hand telerehabilitation, was also developed. The therapist interface is implemented as a Java3D applet that monitors patient hand movement. The monitoring portal gives real-time performance on off-the-shelf desktop workstations.
NASA Astrophysics Data System (ADS)
Ledermann, Christoph; Pauer, Hendrikje; Woern, Heinz
2014-05-01
In minimally invasive surgery, flexible mechatronic instruments promise to improve the overall performance of surgical interventions. However, those instruments require highly developed sensors in order to provide haptic feedback to the surgeon or to enable (semi-)autonomous tasks. Precisely, haptic sensors and a shape sensor are required. In this paper, we present our fiber optical sensor system of Fiber Bragg Gratings, which consists of a shape sensor, a kinesthetic sensor and a tactile sensor. The status quo of each of the three sensors is described, as well as the concept to integrate them into one fiber optical sensor system.
Sensorimotor enhancement with a mixed reality system for balance and mobility rehabilitation.
Fung, Joyce; Perez, Claire F
2011-01-01
We have developed a mixed reality system incorporating virtual reality (VR), surface perturbations and light touch for gait rehabilitation. Haptic touch has emerged as a novel and efficient technique to improve postural control and dynamic stability. Our system combines visual display with the manipulation of physical environments and the addition of haptic feedback to enhance balance and mobility post stroke. A research study involving 9 participants with stroke and 9 age-matched healthy individuals shows that the haptic cue provided while walking is an effective means of improving gait stability in people post stroke, especially during challenging environmental conditions such as downslope walking.
Modeling the behavior of human body tissues on penetration
NASA Astrophysics Data System (ADS)
Conci, A.; Brazil, A. L.; Popovici, D.; Jiga, G.; Lebon, F.
2018-02-01
Several procedures in medicine (such as anesthesia, injections, biopsies and percutaneous treatments) involve needle insertion. Such procedures operate without vision of the internal areas involved. Physicians and anesthetists rely on manual (force and tactile) feedback to guide their movements, so much of medical practice is strongly based on manual skill. In order to become expert in the execution of such procedures, medical students must practice many times, but before practicing on a real patient they must be trained elsewhere, and a virtual environment using Virtual Reality (VR) or Augmented Reality (AR) is the best possible solution for such training. In a virtual environment, the effectiveness of user practice is improved by adding force output through a haptic device, which improves the manual sensations in the interaction between user and computer. Haptic devices make it possible to simulate the physical restrictions of the various tissues and the force reactions to the movements of the operator's hands. Trainees can effectively "feel" the reactions to their movements and receive immediate feedback from the actions they execute in the implemented environment. However, in order to implement such systems, the tissue reaction to penetration and cutting must be modeled. A proper model must emulate the physical sensations of the needle acting on skin, fat, muscle, and so on, as if it were really performed on a patient, that is, as if the trainees were holding a real needle and feeling each tissue's resistance when inserting it through the body. For example, an average force value for human skin puncture is 6.0 N, whereas it is 2.0 N for subcutaneous fat tissue and 4.4 N for muscle; this difference in the sensation of penetrating each layer traversed by the needle makes it possible to infer the correct position inside the body. This work presents a model of tissues before and after cutting that, with proper assumptions about their properties, can model any part of the human body. It is based on experiments and has been used in an embryonic system for epidural anesthesia, with good evaluation results as presented in the final section, "Preliminary Results".
A haptic-inspired audio approach for structural health monitoring decision-making
NASA Astrophysics Data System (ADS)
Mao, Zhu; Todd, Michael; Mascareñas, David
2015-03-01
Haptics is the field at the interface of human touch (tactile sensation) and classification, whereby tactile feedback is used to train and inform a decision-making process. In structural health monitoring (SHM) applications, haptic devices have been introduced and applied in a simplified laboratory-scale scenario, in which nonlinearity, representing the presence of damage, was encoded into a vibratory manual interface. In this paper, the "spirit" of haptics is adopted, but here ultrasonic guided wave scattering information is transformed into audio (rather than tactile) range signals. After sufficient training, the structural damage condition, including occurrence and location, can be identified through the encoded audio waveforms. Different algorithms are employed in this paper to generate the transformed audio signals, and the performance of each encoding algorithm is compared, as well as compared against standard machine learning classifiers. In the long run, this haptic-inspired decision-making approach aims to detect and classify structural damage in more rigorous environments, approaching a baseline-free fashion with embedded temperature compensation.
Ponce Wong, Ruben D; Hellman, Randall B; Santos, Veronica J
2014-01-01
Upper-limb amputees rely primarily on visual feedback when using their prostheses to interact with others or objects in their environment. A constant reliance upon visual feedback can be mentally exhausting and does not suffice for many activities when line-of-sight is unavailable. Upper-limb amputees could greatly benefit from the ability to perceive edges, one of the most salient features of 3D shape, through touch alone. We present an approach for estimating edge orientation with respect to an artificial fingertip through haptic exploration using a multimodal tactile sensor on a robot hand. Key parameters from the tactile signals for each of four exploratory procedures were used as inputs to a support vector regression model. Edge orientation angles ranging from -90 to 90 degrees were estimated with an 85-input model having an R² of 0.99 and RMS error of 5.08 degrees. Electrode impedance signals provided the most useful inputs by encoding spatially asymmetric skin deformation across the entire fingertip. Interestingly, sensor regions that were not in direct contact with the stimulus provided particularly useful information. Methods described here could pave the way for semi-autonomous capabilities in prosthetic or robotic hands during haptic exploration, especially when visual feedback is unavailable.
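A brief sketch of regressing edge orientation from tactile features with support vector regression is shown below; the synthetic two-feature inputs stand in for the 85 tactile inputs used in the paper, and all parameter values are assumptions.

```python
import numpy as np
from sklearn.svm import SVR

# Toy SVR regression of edge orientation (degrees) from synthetic tactile features.
rng = np.random.default_rng(1)
angles = rng.uniform(-90, 90, size=300)                     # ground-truth orientations
features = np.column_stack([np.sin(np.radians(angles)),     # toy "asymmetry" features
                            np.cos(np.radians(angles))])
features += 0.05 * rng.standard_normal(features.shape)      # sensor noise

model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(features[:250], angles[:250])
pred = model.predict(features[250:])
rmse = np.sqrt(np.mean((pred - angles[250:]) ** 2))
print(f"RMS orientation error on held-out samples: {rmse:.1f} deg")
```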
NASA Astrophysics Data System (ADS)
Han, Young-Min; Choi, Seung-Bok
2008-12-01
This paper presents the control performance of an electrorheological (ER) fluid-based haptic master device connected to a virtual slave environment that can be used for minimally invasive surgery (MIS). An already developed haptic joint featuring controllable ER fluid and a spherical joint mechanism is adopted for the master system. Medical forceps and an angular position measuring device are devised and integrated with the joint to establish the MIS master system. In order to embody a human organ in virtual space, a volumetric deformable object is used. The virtual object is then mathematically formulated by a shape-retaining chain-linked (S-chain) model. After evaluating the reflection force, computation time and compatibility with real-time control, the haptic architecture for MIS is established by incorporating the virtual slave with the master device so that the reflection force for the object of the virtual slave and the desired position for the master operator are transferred to each other. In order to achieve the desired force trajectories, a sliding mode controller is formulated and then experimentally realized. Tracking control performances for various force trajectories are evaluated and presented in the time domain.
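A minimal sketch of a sliding-mode force-tracking loop of the general kind described above is given below; the first-order actuator model, gains, boundary-layer width, and reference trajectory are assumptions, not the paper's design.

```python
import math

# Sliding-mode force tracking against an assumed first-order actuator model.
TAU, GAIN = 0.05, 2.0        # assumed actuator time constant (s) and gain
LAMBDA, ETA, PHI = 20.0, 5.0, 0.2
DT = 0.001

def sat(x):                  # boundary-layer saturation to limit chattering
    return max(-1.0, min(1.0, x))

f, f_err_int = 0.0, 0.0
for k in range(2000):
    t = k * DT
    f_ref = 1.5 * math.sin(2 * math.pi * 1.0 * t)       # desired force (N)
    err = f_ref - f
    f_err_int += err * DT
    s = err + LAMBDA * f_err_int                         # sliding surface
    u = (ETA * sat(s / PHI) + err) / GAIN                # switching + proportional term
    f += DT * (-f + GAIN * u) / TAU                      # first-order actuator response
print(f"final tracking error: {abs(f_ref - f):.3f} N")
```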
Mixed reality temporal bone surgical dissector: mechanical design.
Hochman, Jordan Brent; Sepehri, Nariman; Rampersad, Vivek; Kraut, Jay; Khazraee, Milad; Pisa, Justyn; Unger, Bertram
2014-08-08
The Development of a Novel Mixed Reality (MR) Simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces and while 3D printed models convincingly represent vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model, where the effective elements of both simulations are combined; haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill. This further necessitates gravity cancelation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill and the virtual contact forces need to be repositioned to the drill tip from the mid wand. Previous publications detail generation of both the requisite printed and haptic simulations. Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting, to hold the otic drill, was developed and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free-space. Within the free-space, a linear virtual force model is applied to simulate drill contact with soft tissue. Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill's passage between real and virtual components of the model. No issues with registration at model boundaries were encountered. These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony L. Crawford
Nonlinear Force Profile Used to Increase the Performance of a Haptic User Interface for Teleoperating a Robotic Hand. Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space. The research associated with this paper hypothesizes that a user interface and complementary radiation-compatible robotic hand that integrates the human hand's anthropometric properties, speed capability, nonlinear strength profile, reduction of active degrees of freedom during the transition from manipulation to grasping, and just-noticeable-difference force sensation characteristics will enhance a user's teleoperation performance. The main contribution of this research is that a system concisely integrating all these factors has yet to be developed, and furthermore has yet to be applied to hazardous environments such as those referenced above. In fact, the most prominent slave manipulator teleoperation technology in use today is based on a design patented in 1945 (Patent 2632574) [1]. Robotic hand/user interface systems of similar function to the one being developed in this research limit their design input requirements, in the best case, to complementing only the hand's anthropometric properties, speed capability, and a linearly scaled force application relationship (e.g., the robotic force is a constant 4 times that of the user). In this paper, a nonlinear relationship between the forces experienced at the user interface and at the robotic hand was devised based on the property differences of manipulation and grasping activities as they pertain to the human hand. The results show that such a relationship, when applied to a manipulation task and a grasping task, produces increased performance compared to the traditional linear scaling techniques used by other systems. Key Words: Teleoperation, Robotic Hand, Robotic Force Scaling
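To contrast constant linear scaling with a nonlinear profile of the general kind argued for above, the sketch below blends a low gain in the manipulation range into a higher gain in the grasping range; the logistic blend, transition force, and gain values are illustrative assumptions, not the profile developed in the paper.

```python
import math

def linear_scaling(user_force_n, scale=4.0):
    """Constant linear scale factor (e.g., robotic force = 4x user force)."""
    return scale * user_force_n

def nonlinear_scaling(user_force_n, f_transition=5.0, low_gain=1.5, high_gain=6.0):
    """Smoothly blend from low_gain below the manipulation/grasp transition force
    to high_gain above it, using a logistic blend (hypothetical curve shape)."""
    blend = 1.0 / (1.0 + math.exp(-(user_force_n - f_transition)))
    gain = low_gain + (high_gain - low_gain) * blend
    return gain * user_force_n

for f in (1.0, 5.0, 10.0):
    print(f"user {f:4.1f} N -> linear {linear_scaling(f):5.1f} N, "
          f"nonlinear {nonlinear_scaling(f):5.1f} N")
```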
Improving manual skills in persons with disabilities (PWD) through a multimodal assistance system.
Covarrubias, Mario; Gatti, Elia; Bordegoni, Monica; Cugini, Umberto; Mansutti, Alessandro
2014-07-01
In this research work, we present a Multimodal Guidance System (MGS) whose aim is to provide dynamic assistance to persons with disabilities (PWD) while performing manual activities such as drawing, coloring-in and foam-cutting tasks. The MGS provides robotic assistance in the execution of 2D tasks through haptic and sound interactions. Haptic technology provides the virtual path of 2D shapes through a point-based approach, while sound technology provides audio feedback related to the hand's velocity during sketching, filling, or cutting operations. By combining this multimodal system with haptic assistance, we have created a new approach with possible applications to fields as diverse as physical rehabilitation, scientific investigation of sensorimotor learning, and assessment of hand movements in PWD. The MGS has been tested by people with specific disorders affecting coordination, such as Down syndrome and developmental disabilities, under the supervision of their teachers and care assistants inside their learning environment. A graphic user interface has been designed for teachers and care assistants in order to provide training during the test sessions. Our results provide conclusive evidence that using the MGS increases accuracy in the task operations. The Multimodal Guidance System (MGS) is an interface that offers haptic and sound feedback while performing manual tasks. Several studies have demonstrated that haptic guidance systems can help people recover cognitive function at different levels of complexity and impairment. The applications supported by our device could also play an important role in supporting physical therapists and cognitive psychologists in helping patients to recover motor and visuo-spatial abilities.
Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L
2017-02-01
To assess the construct and face validity of ArthroS, a passive haptic VR simulator. A secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Regarding its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.
Semi-Immersive Virtual Turbine Engine Simulation System
NASA Astrophysics Data System (ADS)
Abidi, Mustufa H.; Al-Ahmari, Abdulrahman M.; Ahmad, Ali; Darmoul, Saber; Ameen, Wadea
2018-05-01
The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress, and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. In this paper, we present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sounds, and ample and intuitive interaction with developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototype and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, tactile, as well as force feedback. The system is shown to be effective and efficient for validating the design of assembly, part design, and operations planning.
Detection thresholds for small haptic effects
NASA Astrophysics Data System (ADS)
Dosher, Jesse A.; Hannaford, Blake
2002-02-01
We are interested in finding out whether or not haptic interfaces will be useful in portable and handheld devices. Such systems will have severe constraints on force output. Our first step is to investigate the lower limits at which haptic effects can be perceived. In this paper we report on experiments studying the effects of varying the amplitude, size, shape, and pulse duration of a haptic feature. Using a specific haptic device, we measure the smallest detectable haptic effects, with active exploration of saw-tooth-shaped icons sized 3, 4 and 5 mm, a sine-shaped icon 5 mm wide, and static pulses 50, 100, and 150 ms in width. Smooth-shaped icons resulted in a detection threshold of approximately 55 mN, almost twice that of saw-tooth-shaped icons, which had a threshold of 31 mN.
Sigrist, Roland; Rauter, Georg; Marchal-Crespo, Laura; Riener, Robert; Wolf, Peter
2015-03-01
Concurrent augmented feedback has been shown to be less effective for learning simple motor tasks than for complex tasks. However, as mostly artificial tasks have been investigated, transfer of results to tasks in sports and rehabilitation remains unknown. Therefore, in this study, the effect of different concurrent feedback was evaluated in trunk-arm rowing. It was then investigated whether multimodal audiovisual and visuohaptic feedback are more effective for learning than visual feedback only. Naïve subjects (N = 24) trained in three groups on a highly realistic virtual reality-based rowing simulator. In the visual feedback group, the subject's oar was superimposed to the target oar, which continuously became more transparent when the deviation between the oars decreased. Moreover, a trace of the subject's trajectory emerged if deviations exceeded a threshold. The audiovisual feedback group trained with oar movement sonification in addition to visual feedback to facilitate learning of the velocity profile. In the visuohaptic group, the oar movement was inhibited by path deviation-dependent braking forces to enhance learning of spatial aspects. All groups significantly decreased the spatial error (tendency in visual group) and velocity error from baseline to the retention tests. Audiovisual feedback fostered learning of the velocity profile significantly more than visuohaptic feedback. The study revealed that well-designed concurrent feedback fosters complex task learning, especially if the advantages of different modalities are exploited. Further studies should analyze the impact of within-feedback design parameters and the transferability of the results to other tasks in sports and rehabilitation.
Students' Development of Representational Competence Through the Sense of Touch
NASA Astrophysics Data System (ADS)
Magana, Alejandra J.; Balachandran, Sadhana
2017-06-01
Electromagnetism is an umbrella encapsulating several different concepts like electric current, electric fields and forces, and magnetic fields and forces, among other topics. However, a number of studies in the past have highlighted students' poor conceptual understanding of electromagnetism concepts even after instruction. This study aims to identify novel forms of "hands-on" instruction that can result in representational competence and conceptual gains. Specifically, this study aimed to identify whether the use of visuohaptic simulations can have an effect on students' representations of electromagnetism-related concepts. The guiding question is: How do visuohaptic simulations influence undergraduate students' representations of electric forces? Participants included nine undergraduate students from science, technology, or engineering backgrounds who participated in a think-aloud procedure while interacting with a visuohaptic simulation. The think-aloud procedure was divided into three stages: a prediction stage, a minimally visual haptic stage, and a visually enhanced haptic stage. The results of this study suggest that students accurately characterized and represented the forces felt around particle, line, and ring charges either in the prediction stage, the minimally visual haptic stage or the visually enhanced haptic stage. Also, some students accurately depicted the three-dimensional nature of the field for each configuration in the two stages that included a tactile mode, where the point charge was the most challenging one.
Active Manual Movement Improves Directional Perception of Illusory Force.
Amemiya, Tomohiro; Gomi, Hiroaki
2016-01-01
Active touch sensing is known to facilitate the discrimination or recognition of the spatial properties of an object from the movement of tactile sensors on the skin and by integrating proprioceptive feedback about hand positions or motor commands related to ongoing hand movements. On the other hand, several studies have reported that tactile processing is suppressed by hand movement. Thus, it is unclear whether or not the active exploration of force direction by using hand or arm movement improves the perception of the force direction. Here, we show that active manual movement in both the rotational and translational directions enhances the precise perception of the force direction. To make it possible to move a hand in space without any physical constraints, we have adopted a method of inducing the sensation of illusory force by asymmetric vibration. We found that the precision of the perceived force direction was significantly better when the shoulder is rotated medially and laterally. We also found that directional errors supplied by the motor response of the perceived force were smaller than those resulting from perceptual judgments between visual and haptic directional stimuli. These results demonstrate that active manual movement boosts the precision of the perceived direction of an illusory force.
Haptic Cues for Balance: Use of a Cane Provides Immediate Body Stabilization
Sozzi, Stefania; Crisafulli, Oscar; Schieppati, Marco
2017-01-01
Haptic cues are important for balance. Knowledge of the temporal features of their effect may be crucial for the design of neural prostheses. Touching a stable surface with a fingertip reduces body sway in standing subjects eyes closed (EC), and removal of haptic cue reinstates a large sway pattern. Changes in sway occur rapidly on changing haptic conditions. Here, we describe the effects and time-course of stabilization produced by a haptic cue derived from a walking cane. We intended to confirm that cane use reduces body sway, to evaluate the effect of vision on stabilization by a cane, and to estimate the delay of the changes in body sway after addition and withdrawal of haptic input. Seventeen healthy young subjects stood in tandem position on a force platform, with eyes closed or open (EO). They gently lowered the cane onto and lifted it from a second force platform. Sixty trials per direction of haptic shift (Touch → NoTouch, T-NT; NoTouch → Touch, NT-T) and visual condition (EC-EO) were acquired. Traces of Center of foot Pressure (CoP) and the force exerted by cane were filtered, rectified, and averaged. The position in space of a reflective marker positioned on the cane tip was also acquired by an optoelectronic device. Cross-correlation (CC) analysis was performed between traces of cane tip and CoP displacement. Latencies of changes in CoP oscillation in the frontal plane EC following the T-NT and NT-T haptic shift were statistically estimated. The CoP oscillations were larger in EC than EO under both T and NT (p < 0.001) and larger during NT than T conditions (p < 0.001). Haptic-induced effect under EC (Romberg quotient NT/T ~ 1.2) was less effective than that of vision under NT condition (EC/EO ~ 1.5) (p < 0.001). With EO cane had little effect. Cane displacement lagged CoP displacement under both EC and EO. Latencies to changes in CoP oscillations were longer after addition (NT-T, about 1.6 s) than withdrawal (T-NT, about 0.9 s) of haptic input (p < 0.001). These latencies were similar to those occurring on fingertip touch, as previously shown. Overall, data speak in favor of substantial equivalence of the haptic information derived from both “direct” fingertip contact and “indirect” contact with the floor mediated by the cane. Cane, finger and visual inputs would be similarly integrated in the same neural centers for balance control. Haptic input from a walking aid and its processing time should be considered when designing prostheses for locomotion. PMID:29311785
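The cross-correlation step used to relate the cane-tip and CoP traces and read off a lag can be sketched as follows; the synthetic signals, the 100 Hz sampling rate, and the imposed lag are assumptions for illustration only.

```python
import numpy as np

# Estimate the lag between two sway-related traces via cross-correlation.
FS = 100.0                                     # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / FS)
cop = np.sin(2 * np.pi * 0.4 * t) + 0.1 * np.random.default_rng(2).standard_normal(t.size)
lag_s_true = 0.35
cane = np.interp(t - lag_s_true, t, cop)       # cane trace lags the CoP by 0.35 s

cop_z = cop - cop.mean()
cane_z = cane - cane.mean()
xcorr = np.correlate(cane_z, cop_z, mode="full")
lags = np.arange(-t.size + 1, t.size) / FS
print(f"estimated lag of cane behind CoP: {lags[np.argmax(xcorr)]:.2f} s")
```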
Rodriguez-Guerrero, Carlos; Knaepen, Kristel; Fraile-Marinero, Juan C.; Perez-Turiel, Javier; Gonzalez-de-Garibay, Valentin; Lefeber, Dirk
2017-01-01
In order to harmonize robotic devices with human beings, the robots should be able to perceive important psychosomatic impact triggered by emotional states such as frustration or boredom. This paper presents a new type of biocooperative control architecture, which acts toward improving the challenge/skill relation perceived by the user when interacting with a robotic multimodal interface in a cooperative scenario. In the first part of the paper, open-loop experiments revealed which physiological signals were optimal for inclusion in the feedback loop. These were heart rate, skin conductance level, and skin conductance response frequency. In the second part of the paper, the proposed controller, consisting of a biocooperative architecture with two degrees of freedom, simultaneously modulating game difficulty and haptic assistance through performance and psychophysiological feedback, is presented. With this setup, the perceived challenge can be modulated by means of the game difficulty and the perceived skill by means of the haptic assistance. A new metric (FlowIndex) is proposed to numerically quantify and visualize the challenge/skill relation. The results are contrasted with comparable previously published work and show that the new method afforded a higher FlowIndex (i.e., a superior challenge/skill relation) and an improved balance between augmented performance and user satisfaction (higher level of valence, i.e., a more enjoyable and satisfactory experience). PMID:28507503
Design of a 4-DOF MR haptic master for application to robot surgery: virtual environment work
NASA Astrophysics Data System (ADS)
Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok
2014-09-01
This paper presents the design and control performance of a novel type of 4-degrees-of-freedom (4-DOF) haptic master in cyberspace for a robot-assisted minimally invasive surgery (RMIS) application. By using a controllable magnetorheological (MR) fluid, the proposed haptic master can provide a haptic feedback function for a surgical robot. Due to the difficulty in utilizing real human organs in the experiment, a cyberspace that features the virtual object is constructed to evaluate the performance of the haptic master. In order to realize the cyberspace, a volumetric deformable object is represented by a shape-retaining chain-linked (S-chain) model, which is a fast volumetric model and is suitable for real-time applications. In the haptic architecture for an RMIS application, the desired torque and position induced from the virtual object of the cyberspace and the haptic master of real space are transferred to each other. In order to validate the superiority of the proposed master and volumetric model, a tracking control experiment is implemented with a nonhomogeneous volumetric cubic object to demonstrate that the proposed model can be utilized in a real-time haptic rendering architecture. A proportional-integral-derivative (PID) controller is then designed and empirically implemented to accomplish the desired torque trajectories. It has been verified from the experiment that tracking control of the torque trajectories from the virtual slave can be successfully achieved.
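The torque-trajectory tracking loop described above can be illustrated with a minimal discrete PID sketch; the gains, sampling period, and interface names are placeholders, not the authors' implementation or values.

```python
# Minimal discrete PID sketch of a torque-trajectory tracking loop of the kind
# described above. Gains, the sampling period, and the command interpretation
# (e.g., MR coil current) are illustrative assumptions.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired_torque, measured_torque):
        error = desired_torque - measured_torque
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Command sent to the haptic master actuator for this axis.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = PID(kp=2.0, ki=0.5, kd=0.01, dt=0.001)   # placeholder gains, 1 kHz loop
command = controller.update(desired_torque=0.2, measured_torque=0.15)
```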
A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.
Kim, Mingyu; Jeon, Changyu; Kim, Jinmo
2017-05-17
This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile sensations at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then delivers tactile stimuli (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in virtual reality.
NASA Technical Reports Server (NTRS)
Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph
2000-01-01
For many years, the robotic community sought to develop robots that can eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that there are some tasks that humans can perform significantly better but that, due to associated hazards, distance, physical limitations, and other causes, only robots can be employed to perform. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome, and they usually allow only a limited operator workspace. In this paper a novel haptic interface is presented to enable human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators with an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using constrained ElectroRheological Fluids (ERF), which change their viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment will be reflected to the user using this ERF device, in which a change in the system viscosity will occur proportionally to the force to be transmitted. In this paper, we will present the results of our modeling, simulation, and initial testing of such an electrorheological fluid (ERF)-based haptic device.
Development of a force-reflecting robotic platform for cardiac catheter navigation.
Park, Jun Woo; Choi, Jaesoon; Pak, Hui-Nam; Song, Seung Joon; Lee, Jung Chan; Park, Yongdoo; Shin, Seung Min; Sun, Kyung
2010-11-01
Electrophysiological catheters are used for both diagnostics and clinical intervention. To facilitate more accurate and precise catheter navigation, robotic cardiac catheter navigation systems have been developed and commercialized. The authors have developed a novel force-reflecting robotic catheter navigation system. The system is a network-based master-slave configuration having a 3-degree-of-freedom robotic manipulator for operation with a conventional cardiac ablation catheter. The master manipulator implements a haptic user interface device with force feedback using a force or torque signal either measured with a sensor or estimated from the motor current signal in the slave manipulator. The slave manipulator is a robotic motion control platform on which the cardiac ablation catheter is mounted. The catheter motions (forward and backward movement, rolling, and catheter tip bending) are controlled by electromechanical actuators located in the slave manipulator. The control software runs on a real-time operating system-based workstation and implements the master/slave motion synchronization control of the robot system. The master/slave motion synchronization response was assessed with step, sinusoidal, and arbitrarily varying motion commands, and showed satisfactory performance with insignificant steady-state motion error. The current system successfully implemented the motion control function and will undergo safety and performance evaluation by means of animal experiments. Further studies on the force feedback control algorithm and on an active motion catheter with an embedded actuation mechanism are underway. © 2010, Copyright the Authors. Artificial Organs © 2010, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
Fu, Qiushi; Santello, Marco
2018-01-01
The concept of postural synergies of the human hand has been shown to potentially reduce complexity in the neuromuscular control of grasping. By merging this concept with soft robotics approaches, a multi degrees of freedom soft-synergy prosthetic hand [SoftHand-Pro (SHP)] was created. The mechanical innovation of the SHP enables adaptive and robust functional grasps with simple and intuitive myoelectric control from only two surface electromyogram (sEMG) channels. However, the current myoelectric controller has very limited capability for fine control of grasp forces. We addressed this challenge by designing a hybrid-gain myoelectric controller that switches control gains based on the sensorimotor state of the SHP. This controller was tested against a conventional single-gain (SG) controller, as well as against native hand in able-bodied subjects. We used the following tasks to evaluate the performance of grasp force control: (1) pick and place objects with different size, weight, and fragility levels using power or precision grasp and (2) squeezing objects with different stiffness. Sensory feedback of the grasp forces was provided to the user through a non-invasive, mechanotactile haptic feedback device mounted on the upper arm. We demonstrated that the novel hybrid controller enabled superior task completion speed and fine force control over SG controller in object pick-and-place tasks. We also found that the performance of the hybrid controller qualitatively agrees with the performance of native human hands. PMID:29375360
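The hybrid-gain idea described above can be illustrated with a short sketch in which the EMG-to-closure gain is switched when contact is detected; the contact test, thresholds, and gain values are assumptions for illustration and not the SHP controller itself.

```python
# Hypothetical sketch of a hybrid-gain mapping from an sEMG amplitude to a hand
# closure command: a high gain during free motion for speed, and a low gain once
# contact is detected for fine force control. Thresholds and gains are assumed.
FREE_MOTION_GAIN = 1.0   # fast closing while not touching the object
IN_CONTACT_GAIN = 0.2    # fine-grained force modulation after contact

def closure_command(emg_amplitude, grasp_force, contact_threshold=0.5):
    """Map a normalized sEMG amplitude (0-1) to a closure velocity command."""
    in_contact = grasp_force > contact_threshold   # N, assumed contact test
    gain = IN_CONTACT_GAIN if in_contact else FREE_MOTION_GAIN
    return gain * emg_amplitude

print(closure_command(0.8, grasp_force=0.0))   # free motion: fast closure
print(closure_command(0.8, grasp_force=2.0))   # in contact: gentle squeeze
```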
Mixed reality temporal bone surgical dissector: mechanical design
2014-01-01
Objective: The development of a novel mixed reality (MR) simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces, and while 3D printed models convincingly represent the vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model in which the effective elements of both simulations are combined: haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill. This further necessitates gravity cancellation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill, and the virtual contact forces need to be repositioned to the drill tip from the mid wand. Previous publications detail the generation of both the requisite printed and haptic simulations. Method: Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting to hold the otic drill was developed, and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free space. Within the free space, a linear virtual force model is applied to simulate drill contact with soft tissue. Results: Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill’s passage between real and virtual components of the model. No issues with registration at model boundaries were encountered. Conclusion: These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator. PMID:25927300
Performance improvement of planar dielectric elastomer actuators by magnetic modulating mechanism
NASA Astrophysics Data System (ADS)
Zhao, Yun-Hua; Li, Wen-Bo; Zhang, Wen-Ming; Yan, Han; Peng, Zhi-Ke; Meng, Guang
2018-06-01
In this paper, a novel planar dielectric elastomer actuator (DEA) with a magnetic modulating mechanism is proposed. This design provides a wider actuation range and a larger output force, which are significant indicators for evaluating the performance of DEAs. The DEA has a compact and simple design, and an analytical model is developed to characterize its mechanical behavior. The result shows that the output force induced by the DEA can be improved by 76.90% under a given applied voltage and initial magnet distance. Moreover, experiments are carried out to reveal the performance of the proposed DEA and validate the theoretical model. They demonstrate that the DEA using the magnetic modulating mechanism can enlarge the actuation range, with a more pronounced effect as the initial magnet distance decreases within the stable range. This can be useful in promoting the application of DEAs to soft robots and haptic feedback.
Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka
2016-01-01
Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.
Geometric modeling of the temporal bone for cochlea implant simulation
NASA Astrophysics Data System (ADS)
Todd, Catherine A.; Naghdy, Fazel; O'Leary, Stephen
2004-05-01
The first stage in the development of a clinically valid surgical simulator for training otologic surgeons in performing cochlea implantation is presented. For this purpose, a geometric model of the temporal bone has been derived from a cadaver specimen using the biomedical image processing software package Analyze (AnalyzeDirect, Inc) and its three-dimensional reconstruction is examined. Simulator construction begins with registration and processing of a Computer Tomography (CT) medical image sequence. Important anatomical structures of the middle and inner ear are identified and segmented from each scan in a semi-automated threshold-based approach. Linear interpolation between image slices produces a three-dimensional volume dataset: the geometrical model. Artefacts are effectively eliminated using a semi-automatic seeded region-growing algorithm and unnecessary bony structures are removed. Once validated by an Ear, Nose and Throat (ENT) specialist, the model may be imported into the Reachin Application Programming Interface (API) (Reachin Technologies AB) for visual and haptic rendering associated with a virtual mastoidectomy. Interaction with the model is realized with haptics interfacing, providing the user with accurate torque and force feedback. Electrode array insertion into the cochlea will be introduced in the final stage of design.
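The semi-automatic seeded region-growing step mentioned above can be illustrated with a generic sketch; the 6-connectivity, the intensity window, and the array layout are assumptions, not the Analyze implementation.

```python
# Illustrative seeded region growing on a CT volume, of the kind used here to
# clean up the segmentation: starting from a seed voxel, 6-connected neighbours
# are added while their intensity stays inside an assumed intensity window.
import numpy as np
from collections import deque

def region_grow(volume, seed, lo, hi):
    """Return a boolean mask of voxels connected to `seed` with lo <= value <= hi."""
    mask = np.zeros(volume.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        z, y, x = queue.popleft()
        if mask[z, y, x] or not (lo <= volume[z, y, x] <= hi):
            continue
        mask[z, y, x] = True
        for dz, dy, dx in [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]):
                queue.append((nz, ny, nx))
    return mask
```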
A robotic wheelchair trainer: design overview and a feasibility study
2010-01-01
Background: Experiencing independent mobility is important for children with a severe movement disability, but learning to drive a powered wheelchair can be labor intensive, requiring hand-over-hand assistance from a skilled therapist. Methods: To improve accessibility to training, we developed a robotic wheelchair trainer that steers itself along a course marked by a line on the floor using computer vision, haptically guiding the driver's hand in appropriate steering motions using a force feedback joystick, as the driver tries to catch a mobile robot in a game of "robot tag". This paper provides a detailed design description of the computer vision and control system. In addition, we present data from a pilot study in which we used the chair to teach children without motor impairment aged 4-9 (n = 22) to drive the wheelchair in a single training session, in order to verify that the wheelchair could enable learning by the non-impaired motor system, and to establish normative values of learning rates. Results and Discussion: Training with haptic guidance from the robotic wheelchair trainer improved the steering ability of children without motor impairment significantly more than training without guidance. We also report the results of a case study with one 8-year-old child with a severe motor impairment due to cerebral palsy, who replicated the single-session training protocol that the non-disabled children participated in. This child also improved steering ability after training with guidance from the joystick by an amount even greater than the children without motor impairment. Conclusions: The system not only provided a safe, fun context for automating driver's training, but also enhanced motor learning by the non-impaired motor system, presumably by demonstrating through intuitive movement and force of the joystick itself exemplary control to follow the course. The case study indicates that a child with a motor system impaired by CP can also gain a short-term benefit from driver's training with haptic guidance. PMID:20707886
A robotic wheelchair trainer: design overview and a feasibility study.
Marchal-Crespo, Laura; Furumasu, Jan; Reinkensmeyer, David J
2010-08-13
Experiencing independent mobility is important for children with a severe movement disability, but learning to drive a powered wheelchair can be labor intensive, requiring hand-over-hand assistance from a skilled therapist. To improve accessibility to training, we developed a robotic wheelchair trainer that steers itself along a course marked by a line on the floor using computer vision, haptically guiding the driver's hand in appropriate steering motions using a force feedback joystick, as the driver tries to catch a mobile robot in a game of "robot tag". This paper provides a detailed design description of the computer vision and control system. In addition, we present data from a pilot study in which we used the chair to teach children without motor impairment aged 4-9 (n = 22) to drive the wheelchair in a single training session, in order to verify that the wheelchair could enable learning by the non-impaired motor system, and to establish normative values of learning rates. Training with haptic guidance from the robotic wheelchair trainer improved the steering ability of children without motor impairment significantly more than training without guidance. We also report the results of a case study with one 8-year-old child with a severe motor impairment due to cerebral palsy, who replicated the single-session training protocol that the non-disabled children participated in. This child also improved steering ability after training with guidance from the joystick by an amount even greater than the children without motor impairment. The system not only provided a safe, fun context for automating driver's training, but also enhanced motor learning by the non-impaired motor system, presumably by demonstrating through intuitive movement and force of the joystick itself exemplary control to follow the course. The case study indicates that a child with a motor system impaired by CP can also gain a short-term benefit from driver's training with haptic guidance.
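The haptic-guidance principle described in both records above can be illustrated with a minimal sketch in which the force-feedback joystick is pulled toward the steering command of the line-following controller; the gain, saturation limit, and variable names are illustrative assumptions.

```python
# Hedged sketch of the haptic-guidance idea: the force-feedback joystick is
# pushed toward the steering command the line-following controller would issue,
# so the driver's hand feels exemplary steering motions. Gains are assumed.
def guidance_force(driver_steer, autopilot_steer, k=4.0, f_max=3.0):
    """Force (N) applied to the joystick along the steering axis."""
    force = k * (autopilot_steer - driver_steer)   # pull toward the line-follower's command
    return max(-f_max, min(f_max, force))          # saturate for safety

print(guidance_force(driver_steer=0.1, autopilot_steer=0.4))   # gentle corrective pull
```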
Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms
1998-01-01
[Fragmentary record: only front-matter and chapter excerpts are available. The report covers the MAGIC Toolkit and visual-haptic interaction paradigms, noting that a user can touch one side of a thin virtual object and be propelled out the opposite side (cf. Colgate, 1994, Northwestern University), and that interaction works best when there is a high correlation in motion and force between the visual and haptic realms; a concluding chapter evaluates the application.]
Haptic device for a ventricular shunt insertion simulator.
Panchaphongsaphak, Bundit; Stutzer, Diego; Schwyter, Etienne; Bernays, René-Ludwig; Riener, Robert
2006-01-01
In this paper, we propose a new one-degree-of-freedom haptic device that can be used to simulate ventricular shunt insertion procedures. The device is used together with the BRAINTRAIN training simulator developed for neuroscience education, neurological data visualization and surgical planning. The design of the haptic device is based on a push-pull cable concept. The rendered forces produced by a linear motor connected at one end of the cable are transferred to the user via a sliding mechanism at the end-effector located at the other end of the cable. The end-effector provides a range of movement of up to 12 cm. The force is controlled by an open-loop impedance algorithm and can reach up to 15 N.
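A 1-DOF open-loop impedance law of the kind described above might look like the following sketch; the stiffness value and the notion of penetration depth are assumptions, with only the 15 N force ceiling taken from the abstract.

```python
# Sketch of a 1-DOF open-loop impedance law: force grows with penetration into
# the virtual tissue layer and is capped at the device's stated 15 N maximum.
# The stiffness value and "penetration depth" model are assumptions.
MAX_FORCE_N = 15.0

def impedance_force(penetration_m, stiffness_n_per_m=800.0):
    """Return the commanded cable force (N) for a given penetration depth (m)."""
    if penetration_m <= 0.0:
        return 0.0                       # no contact with the virtual layer
    return min(stiffness_n_per_m * penetration_m, MAX_FORCE_N)

print(impedance_force(0.005))   # 4 N at 5 mm of penetration (illustrative)
```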
Intra-Personal and Inter-Personal Kinetic Synergies During Jumping.
Slomka, Kajetan; Juras, Grzegorz; Sobota, Grzegorz; Furmanek, Mariusz; Rzepko, Marian; Latash, Mark L
2015-12-22
We explored synergies between two legs and two subjects during preparation for a long jump into a target. Synergies were expected during one-person jumping. No such synergies were expected between two persons jumping in parallel without additional contact, while synergies were expected to emerge with haptic contact and become stronger with strong mechanical contact. Subjects performed jumps either alone (each foot standing on a separate force platform) or in dyads (parallel to each other, each person standing on a separate force platform) without any contact, with haptic contact, and with strong coupling. Strong negative correlations between pairs of force variables (strong synergies) were seen in the vertical force in one-person jumps and weaker synergies in two-person jumps with the strong contact. For other force variables, only weak synergies were present in one-person jumps and no negative correlations between pairs of force variables were found for two-person jumps. Pairs of moment variables from the two force platforms at steady state showed positive correlations, which were strong in one-person jumps and weaker, but still significant, in two-person jumps with the haptic and strong contact. Anticipatory synergy adjustments prior to action initiation were observed in one-person trials only. We interpret the different results for the force and moment variables at steady state as reflections of postural sway.
Intra-Personal and Inter-Personal Kinetic Synergies During Jumping
Slomka, Kajetan; Juras, Grzegorz; Sobota, Grzegorz; Furmanek, Mariusz; Rzepko, Marian; Latash, Mark L.
2015-01-01
We explored synergies between two legs and two subjects during preparation for a long jump into a target. Synergies were expected during one-person jumping. No such synergies were expected between two persons jumping in parallel without additional contact, while synergies were expected to emerge with haptic contact and become stronger with strong mechanical contact. Subjects performed jumps either alone (each foot standing on a separate force platform) or in dyads (parallel to each other, each person standing on a separate force platform) without any contact, with haptic contact, and with strong coupling. Strong negative correlations between pairs of force variables (strong synergies) were seen in the vertical force in one-person jumps and weaker synergies in two-person jumps with the strong contact. For other force variables, only weak synergies were present in one-person jumps and no negative correlations between pairs of force variables were found for two-person jumps. Pairs of moment variables from the two force platforms at steady state showed positive correlations, which were strong in one-person jumps and weaker, but still significant, in two-person jumps with the haptic and strong contact. Anticipatory synergy adjustments prior to action initiation were observed in one-person trials only. We interpret the different results for the force and moment variables at steady state as reflections of postural sway. PMID:26839608
Menon, Samir; Brantner, Gerald; Aholt, Chris; Kay, Kendrick; Khatib, Oussama
2013-01-01
A challenging problem in motor control neuroimaging studies is the inability to perform complex human motor tasks given the Magnetic Resonance Imaging (MRI) scanner's disruptive magnetic fields and confined workspace. In this paper, we propose a novel experimental platform that combines Functional MRI (fMRI) neuroimaging, haptic virtual simulation environments, and an fMRI-compatible haptic device for real-time haptic interaction across the scanner workspace (above torso, ∼0.65 × 0.40 × 0.20 m³). We implement this Haptic fMRI platform with a novel haptic device, the Haptic fMRI Interface (HFI), and demonstrate its suitability for motor neuroimaging studies. HFI has three degrees-of-freedom (DOF), uses electromagnetic motors to enable high-fidelity haptic rendering (>350 Hz), integrates radio frequency (RF) shields to prevent electromagnetic interference with fMRI (temporal SNR >100), and is kinematically designed to minimize currents induced by the MRI scanner's magnetic field during motor displacement (<2 cm). HFI possesses uniform inertial and force transmission properties across the workspace, and has low friction (0.05-0.30 N). HFI's RF noise levels, in addition, are within a 3 Tesla fMRI scanner's baseline noise variation (∼0.85 ± 0.1%). Finally, HFI is haptically transparent and does not interfere with human motor tasks (tested for 0.4 m reaches). By allowing fMRI experiments involving complex three-dimensional manipulation with haptic interaction, Haptic fMRI enables, for the first time, non-invasive neuroscience experiments involving interactive motor tasks, object manipulation, tactile perception, and visuo-motor integration.
Contact Force Compensated Thermal Stimulators for Holistic Haptic Interfaces.
Sim, Jai Kyoung; Cho, Young-Ho
2016-05-01
We present a contact force compensated thermal stimulator that can provide a consistent temperature sensation on the human skin independent of the contact force between the thermal stimulator and the skin. Previous passive thermal stimulators were not capable of providing a consistent temperature on the human skin even when using identical heat source voltage due to an inconsistency of the heat conduction, which changes due to the force-dependent thermal contact resistance. We propose a force-based feedback method that monitors the contact force and controls the heat source voltage according to this contact force, thus providing a consistent temperature on the skin. We composed a heat circuit model equivalent to the skin heat-transfer rate as it is changed by the contact forces; we obtained the optimal voltage condition for the constant skin heat-transfer rate independent of the contact force using a numerical estimation simulation tool. Then, in the experiment, we heated real human skin at the obtained heat source voltage condition, and investigated the skin heat-transfer rate by measuring the skin temperature at various times at different levels of contact force. In the numerical estimation results, the skin heat-transfer rate for the contact forces showed a linear profile in the contact force range of 1-3 N; from this profile we obtained the voltage equation for heat source control. In the experimental study, we adjusted the heat source voltage according to the contact force based on the obtained equation. As a result, without the heat source voltage control for the contact forces, the coefficient of variation (CV) of the skin heat-transfer rate in the contact force range of 1-3 N was found to be 11.9%. On the other hand, with the heat source voltage control for the contact forces, the CV of the skin heat-transfer rate in the contact force range of 1-3 N was found to be barely 2.0%, which indicates an 83.2% improvement in consistency compared to the skin heat-transfer rate without the heat source voltage control. The present technique provides a consistent temperature sensation on the human skin independent of the body movement environment; therefore, it has high potential for use in holistic haptic interfaces that have thermal displays.
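The compensation idea can be illustrated with a hedged sketch: the heat-source voltage is adjusted from the measured contact force using an assumed linear control law (the coefficients, and even their sign, are placeholders, not the paper's fitted values), and the coefficient of variation used to report consistency is the usual std/mean ratio.

```python
# Illustrative sketch only: an assumed linear voltage-vs-force control law over
# the 1-3 N range (a, b are placeholders whose values and sign would come from
# the fitted heat-transfer profile), plus the CV metric reported above.
import numpy as np

def compensated_voltage(contact_force_n, a=1.5, b=0.4):
    """Assumed linear control law V = a + b*F over the 1-3 N range (placeholders)."""
    f = np.clip(contact_force_n, 1.0, 3.0)
    return a + b * f

def coefficient_of_variation(samples):
    samples = np.asarray(samples, dtype=float)
    return 100.0 * samples.std(ddof=1) / samples.mean()   # percent, as reported

print(coefficient_of_variation([4.0, 4.1, 3.9, 4.05]))    # ~2%, cf. the reported CVs
```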
A Haptic-Enhanced System for Molecular Sensing
NASA Astrophysics Data System (ADS)
Comai, Sara; Mazza, Davide
The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically-enhanced system for manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted either from theoretical or empirical data using file formats widely adopted in chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a huge amount of numerical data in a more comprehensible way. The developed tool can be used either for teaching or research purposes due to its high reliance on both theoretical and experimental data.
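One plausible reading of the force computation described above is a Coulomb interaction between a probe charge held by the haptic device and the molecule's atomic partial charges; the following sketch illustrates that reading under a simple point-charge assumption and is not the authors' implementation.

```python
# A minimal sketch of the kind of force computation implied above: the net
# Coulomb force on a probe charge, summed over atomic partial charges. Units,
# charge values, and the pure point-charge model are simplifying assumptions.
import numpy as np

K_E = 8.9875517873681764e9  # Coulomb constant, N*m^2/C^2

def probe_force(probe_pos, probe_charge, atom_positions, atom_charges):
    """Return the 3D force (N) on the probe from all atomic partial charges."""
    r_vec = probe_pos - atom_positions            # (N, 3) vectors from atoms to probe
    r = np.linalg.norm(r_vec, axis=1)             # distances to each atom
    f = K_E * probe_charge * atom_charges / r**3  # magnitude divided by r
    return (f[:, None] * r_vec).sum(axis=0)

# Example: a +1e-19 C probe near two opposite point charges.
atoms = np.array([[0.0, 0.0, 0.0], [1e-10, 0.0, 0.0]])
charges = np.array([1.6e-19, -1.6e-19])
print(probe_force(np.array([0.0, 1e-10, 0.0]), 1e-19, atoms, charges))
```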
Haptic Technologies for MEMS Design
NASA Astrophysics Data System (ADS)
Calis, Mustafa; Desmulliez, Marc P. Y.
2006-04-01
This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototype. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include productivity gains and the capability to include manufacturing costs within the design cycle.
A brain-computer interface with vibrotactile biofeedback for haptic information.
Chatterjee, Aniruddha; Aggarwal, Vikram; Ramos, Ander; Acharya, Soumyadipta; Thakor, Nitish V
2007-10-17
It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Subjects are able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subject had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality to operate a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in the BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
Haptic cues for orientation and postural control in sighted and blind individuals
NASA Technical Reports Server (NTRS)
Jeka, J. J.; Easton, R. D.; Bentzen, B. L.; Lackner, J. R.
1996-01-01
Haptic cues from fingertip contact with a stable surface attenuate body sway in subjects even when the contact forces are too small to provide physical support of the body. We investigated how haptic cues derived from contact of a cane with a stationary surface at low force levels aids postural control in sighted and congenitally blind individuals. Five sighted (eyes closed) and five congenitally blind subjects maintained a tandem Romberg stance in five conditions: (1) no cane; (2,3) touch contact (< 2 N of applied force) while holding the cane in a vertical or slanted orientation; and (4,5) force contact (as much force as desired) in the vertical and slanted orientations. Touch contact of a cane at force levels below those necessary to provide significant physical stabilization was as effective as force contact in reducing postural sway in all subjects, compared to the no-cane condition. A slanted cane was far more effective in reducing postural sway than was a perpendicular cane. Cane use also decreased head displacement of sighted subjects far more than that of blind subjects. These results suggest that head movement control is linked to postural control through gaze stabilization reflexes in sighted subjects; such reflexes are absent in congenitally blind individuals and may account for their higher levels of head displacement.
Modeling the forces of cutting with scissors.
Mahvash, Mohsen; Voo, Liming M; Kim, Diana; Jeung, Kristin; Wainer, Joshua; Okamura, Allison M
2008-03-01
Modeling forces applied to scissors during cutting of biological materials is useful for surgical simulation. Previous approaches to haptic display of scissor cutting are based on recording and replaying measured data. This paper presents an analytical model based on the concepts of contact mechanics and fracture mechanics to calculate forces applied to scissors during cutting of a slab of material. The model considers the process of cutting as a sequence of deformation and fracture phases. During deformation phases, forces applied to the scissors are calculated from a torque-angle response model synthesized from measurement data multiplied by a ratio that depends on the position of the cutting crack edge and the curve of the blades. Using the principle of conservation of energy, the forces of fracture are related to the fracture toughness of the material and the geometry of the blades of the scissors. The forces applied to scissors generally include high-frequency fluctuations. We show that the analytical model accurately predicts the average applied force. The cutting model is computationally efficient, so it can be used for real-time computations such as haptic rendering. Experimental results from cutting samples of paper, plastic, cloth, and chicken skin confirm the model, and the model is rendered in a haptic virtual environment.
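The energy-balance step relating fracture force to toughness and geometry can be illustrated with a hedged sketch; the symbols, the simplified geometry, and the example numbers are assumptions for illustration only.

```python
# Hedged sketch of the energy-balance step described above: during a fracture
# phase, the work done at the handles equals the fracture toughness times the
# newly created cut area, so the handle force follows from the crack advance
# per unit of handle motion. Symbols and example values are assumptions.
def fracture_force(toughness_j_per_m2, thickness_m, crack_advance_m, handle_travel_m):
    """Force (N) at the scissor handles needed to advance the cut.

    Energy balance: F * handle_travel = toughness * thickness * crack_advance.
    """
    cut_area = thickness_m * crack_advance_m          # new fracture surface (m^2)
    return toughness_j_per_m2 * cut_area / handle_travel_m

# Example: a ~2 kJ/m^2 toughness, 1 mm thick sample, 2 mm of crack advance
# produced by 10 mm of handle motion (illustrative numbers only).
print(fracture_force(2000.0, 1e-3, 2e-3, 10e-3))  # ~0.4 N
```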
Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.
Kim, Yeongmi; Harders, Matthias; Gassert, Roger
2015-01-01
Delivering distance information of nearby obstacles from sensors embedded in a white cane, in addition to the intrinsic mechanical feedback from the cane, can aid the visually impaired in ambulating independently. Haptics is a common modality for conveying such information to cane users, typically in the form of vibrotactile signals. In this context, we investigated the effect of tactile rendering methods, tactile feedback configurations and directions of tactile flow on the identification of obstacle distance. Three tactile rendering methods with temporal variation only, spatio-temporal variation and spatial/temporal/intensity variation were investigated for two vibration feedback configurations. Results showed a significant interaction between tactile rendering method and feedback configuration. Spatio-temporal variation generally resulted in high correct identification rates for both feedback configurations. In the case of the four-finger vibration, tactile rendering with spatial/temporal/intensity variation also resulted in a high distance identification rate. Further, participants expressed their preference for the four-finger vibration over the single-finger vibration in a survey. Both preferred rendering methods with spatio-temporal variation and spatial/temporal/intensity variation for the four-finger vibration could convey obstacle distance information with low workload. Overall, the presented findings provide valuable insights and guidance for the design of haptic displays for electronic travel aids for the visually impaired.
Shokur, Solaiman; Gallo, Simone; Moioli, Renan C; Donati, Ana Rita C; Morya, Edgard; Bleuler, Hannes; Nicolelis, Miguel A L
2016-09-19
Spinal cord injuries disrupt bidirectional communication between the patient's brain and body. Here, we demonstrate a new approach for reproducing lower limb somatosensory feedback in paraplegics by remapping missing leg/foot tactile sensations onto the skin of patients' forearms. A portable haptic display was tested in eight patients in a setup where the lower limbs were simulated using immersive virtual reality (VR). For six out of eight patients, the haptic display induced the realistic illusion of walking on three different types of floor surfaces: beach sand, a paved street or grass. Additionally, patients experienced the movements of the virtual legs during the swing phase or the sensation of the foot rolling on the floor while walking. Relying solely on this tactile feedback, patients reported the position of the avatar leg during virtual walking. Crossmodal interference between vision of the virtual legs and tactile feedback revealed that patients assimilated the virtual lower limbs as if they were their own legs. We propose that the addition of tactile feedback to neuroprosthetic devices is essential to restore a full lower limb perceptual experience in spinal cord injury (SCI) patients, and will ultimately, lead to a higher rate of prosthetic acceptance/use and a better level of motor proficiency.
Shokur, Solaiman; Gallo, Simone; Moioli, Renan C.; Donati, Ana Rita C.; Morya, Edgard; Bleuler, Hannes; Nicolelis, Miguel A.L.
2016-01-01
Spinal cord injuries disrupt bidirectional communication between the patient’s brain and body. Here, we demonstrate a new approach for reproducing lower limb somatosensory feedback in paraplegics by remapping missing leg/foot tactile sensations onto the skin of patients’ forearms. A portable haptic display was tested in eight patients in a setup where the lower limbs were simulated using immersive virtual reality (VR). For six out of eight patients, the haptic display induced the realistic illusion of walking on three different types of floor surfaces: beach sand, a paved street or grass. Additionally, patients experienced the movements of the virtual legs during the swing phase or the sensation of the foot rolling on the floor while walking. Relying solely on this tactile feedback, patients reported the position of the avatar leg during virtual walking. Crossmodal interference between vision of the virtual legs and tactile feedback revealed that patients assimilated the virtual lower limbs as if they were their own legs. We propose that the addition of tactile feedback to neuroprosthetic devices is essential to restore a full lower limb perceptual experience in spinal cord injury (SCI) patients, and will ultimately, lead to a higher rate of prosthetic acceptance/use and a better level of motor proficiency. PMID:27640345
NASA Astrophysics Data System (ADS)
Hanyu, Ryosuke; Tsuji, Toshiaki
This paper proposes a whole-body haptic sensing system that has multiple supporting points between the body frame and the end-effector. The system consists of an end-effector and multiple force sensors. Using this mechanism, the position of a contact force on the surface can be calculated without any sensor array. A haptic sensing system with a single supporting point structure has previously been developed by the present authors. However, that system has drawbacks such as low stiffness and low strength. Therefore, in this study, a mechanism with multiple supporting points was proposed and its performance was verified. In this paper, the basic concept of the mechanism is first introduced. Next, an experimental evaluation of the proposed method is presented.
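The idea of localizing a contact force from several supporting-point sensors can be illustrated with a force-plate-style sketch; a single, purely normal contact force and a rigid end-effector are simplifying assumptions.

```python
# Illustrative sketch of localizing a single normal contact force from several
# supporting-point force sensors via force and moment balance (as on a force
# plate). A purely vertical contact force and rigid end-effector are assumed.
import numpy as np

def contact_from_supports(sensor_xy, sensor_forces):
    """Return (total normal force, contact position) from support-point readings.

    sensor_xy:     (N, 2) positions of the supporting points in the plate frame
    sensor_forces: (N,)   normal force measured at each supporting point
    """
    sensor_xy = np.asarray(sensor_xy, dtype=float)
    f = np.asarray(sensor_forces, dtype=float)
    total = f.sum()
    if total == 0.0:
        return 0.0, None                                   # no contact detected
    position = (f[:, None] * sensor_xy).sum(axis=0) / total
    return total, position

# Example: three supports, contact closest to the first one.
print(contact_from_supports([[0, 0], [0.1, 0], [0, 0.1]], [6.0, 2.0, 2.0]))
```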
Estimating Tool–Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool
Zhao, Baoliang; Nelson, Carl A.
2016-01-01
Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool–tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool–tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool–tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool–tissue interaction forces in real time, thereby increasing surgical efficiency and safety. PMID:27303591
Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.
Zhao, Baoliang; Nelson, Carl A
2016-10-01
Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool-tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
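The sensorless estimation described in the two records above can be illustrated with a hedged sketch that maps motor current to jaw force through the torque constant and transmission; the constants, the simple friction term, and the linear transmission model are placeholders, not the authors' calibration.

```python
# Sketch of the sensorless estimation idea: grasp force inferred from the
# driving motor's current through the torque constant and transmission. All
# constants and the Coulomb-friction term are illustrative assumptions.
def grasp_force_from_current(current_a, kt_nm_per_a=0.05, gear_ratio=20.0,
                             radius_m=0.01, friction_torque_nm=0.002,
                             efficiency=0.8):
    """Estimate the tool-tissue force (N) from the motor current (A)."""
    motor_torque = kt_nm_per_a * current_a
    net_torque = max(motor_torque - friction_torque_nm, 0.0)  # subtract friction losses
    joint_torque = efficiency * gear_ratio * net_torque
    return joint_torque / radius_m                            # torque -> force at the jaw

print(grasp_force_from_current(0.3))  # ~20.8 N with these placeholder constants
```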
Azarnoush, Hamed; Siar, Samaneh; Sawaya, Robin; Zhrani, Gmaan Al; Winkler-Schwartz, Alexander; Alotaibi, Fahad Eid; Bugdadi, Abdulgadir; Bajunaid, Khalid; Marwa, Ibrahim; Sabbagh, Abdulrahman Jafar; Del Maestro, Rolando F
2017-07-01
OBJECTIVE Virtual reality simulators allow development of novel methods to analyze neurosurgical performance. The concept of a force pyramid is introduced as a Tier 3 metric with the ability to provide visual and spatial analysis of 3D force application by any instrument used during simulated tumor resection. This study was designed to answer 3 questions: 1) Do study groups have distinct force pyramids? 2) Do handedness and ergonomics influence force pyramid structure? 3) Are force pyramids dependent on the visual and haptic characteristics of simulated tumors? METHODS Using a virtual reality simulator, NeuroVR (formerly NeuroTouch), ultrasonic aspirator force application was continually assessed during resection of simulated brain tumors by neurosurgeons, residents, and medical students. The participants performed simulated resections of 18 simulated brain tumors with different visual and haptic characteristics. The raw data, namely, coordinates of the instrument tip as well as contact force values, were collected by the simulator. To provide a visual and qualitative spatial analysis of forces, the authors created a graph, called a force pyramid, representing force sum along the z-coordinate for different xy coordinates of the tool tip. RESULTS Sixteen neurosurgeons, 15 residents, and 84 medical students participated in the study. Neurosurgeon, resident and medical student groups displayed easily distinguishable 3D "force pyramid fingerprints." Neurosurgeons had the lowest force pyramids, indicating application of the lowest forces, followed by resident and medical student groups. Handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct well-defined 3D force pyramid patterns. CONCLUSIONS Force pyramid fingerprints provide 3D spatial assessment displays of instrument force application during simulated tumor resection. Neurosurgeon force utilization and ergonomic data form a basis for understanding and modulating resident force application and improving patient safety during tumor resection.
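The force-pyramid construction can be illustrated with a short sketch that bins the tool-tip (x, y) samples and sums the contact force in each bin; the bin size, workspace limits, and synthetic data are assumptions.

```python
# Minimal sketch of a "force pyramid": for each (x, y) bin of the tool-tip
# position, the applied force samples are summed, giving a 2D surface that can
# be rendered as a pyramid-like 3D plot. Bin size and workspace are assumed.
import numpy as np

def force_pyramid(x, y, force, workspace=((-0.05, 0.05), (-0.05, 0.05)), bins=40):
    """Return a (bins x bins) array of summed force per (x, y) bin."""
    hist, xedges, yedges = np.histogram2d(
        x, y, bins=bins, range=workspace, weights=force)
    return hist, xedges, yedges

# Example with random samples standing in for simulator logs.
rng = np.random.default_rng(0)
x, y = rng.normal(0, 0.01, 5000), rng.normal(0, 0.01, 5000)
force = rng.uniform(0.0, 0.6, 5000)           # N, contact force at each sample
pyramid, _, _ = force_pyramid(x, y, force)
print(pyramid.max())                          # peak of the force pyramid
```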
Design of high-fidelity haptic display for one-dimensional force reflection applications
NASA Astrophysics Data System (ADS)
Gillespie, Brent; Rosenberg, Louis B.
1995-12-01
This paper discusses the development of a virtual reality platform for the simulation of medical procedures which involve needle insertion into human tissue. The paper's focus is the hardware and software requirements for haptic display of a particular medical procedure known as epidural analgesia. To perform this delicate manual procedure, an anesthesiologist must carefully guide a needle through various layers of tissue using only haptic cues for guidance. As a simplifying aspect for the simulator design, all motions and forces involved in the task occur along a fixed line once insertion begins. To create a haptic representation of this procedure, we have explored both physical modeling and perceptual modeling techniques. A preliminary physical model was built based on CT-scan data of the operative site. A preliminary perceptual model was built based on current training techniques for the procedure provided by a skilled instructor. We compare and contrast these two modeling methods and discuss the implications of each. We select and defend the perceptual model as a superior approach for the epidural analgesia simulator.
NASA Astrophysics Data System (ADS)
Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.
2016-06-01
This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3-25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension.
A flexible tactile-feedback touch screen using transparent ferroelectric polymer film vibrators
NASA Astrophysics Data System (ADS)
Ju, Woo-Eon; Moon, Yong-Ju; Park, Cheon-Ho; Choi, Seung Tae
2014-07-01
To provide tactile feedback on flexible touch screens, transparent relaxor ferroelectric polymer film vibrators were designed and fabricated in this study. The film vibrator can be integrated underneath a transparent cover film or glass, and can also produce acoustic waves that cause a tactile sensation on human fingertips. Poly(vinylidene fluoride-trifluoroethylene-chlorotrifluoroethylene) [P(VDF-TrFE-CTFE)] polymer was used as the relaxor ferroelectric polymer because it produces a large strain under applied electric fields, shows a fast response, and has excellent optical transparency. The natural frequency of this tactile-feedback touch screen was designed to be around 200-240 Hz, at which the haptic perception of human fingertips is the most sensitive; therefore, the resonance of the touch screen at its natural frequency provides maximum haptic sensation. A multilayered relaxor ferroelectric polymer film vibrator was also demonstrated to provide the same vibration power at reduced voltage. The flexible P(VDF-TrFE-CTFE) film vibrators developed in this study are expected to provide tactile sensation not only in large-area flat panel displays, but also in flexible displays and touch screens.
Using haptic feedback to increase seat belt use : traffic tech.
DOT National Transportation Integrated Search
2011-07-01
The legacy of research on increasing seat belt use has focused on enactment of seat belt legislation, public education, high-visibility police enforcement, and seat belt reminder systems. Several behavioral programs have produced large, susta...
Emotion Telepresence: Emotion Augmentation through Affective Haptics and Visual Stimuli
NASA Astrophysics Data System (ADS)
Tsetserukou, D.; Neviarouskaya, A.
2012-03-01
The paper focuses on a novel concept of emotional telepresence. The iFeel_IM! system, which is in the vanguard of this technology, integrates the 3D virtual world Second Life, an intelligent component for automatic emotion recognition from text messages, and innovative affective haptic interfaces providing additional nonverbal communication channels through simulation of emotional feedback and social touch (physical co-presence). Users can not only exchange messages but also emotionally and physically feel the presence of the communication partner (e.g., a family member, friend, or beloved person). The next prototype of the system will include a tablet computer. The user will be able to interact haptically with the avatar, and thus influence its mood and the emotion of the partner. A finger gesture language will be designed for communication with the avatar. This will bring a new level of immersion to online communication.
Samosky, Joseph T; Allen, Pete; Boronyak, Steve; Branstetter, Barton; Hein, Steven; Juhas, Mark; Nelson, Douglas A; Orebaugh, Steven; Pinto, Rohan; Smelko, Adam; Thompson, Mitch; Weaver, Robert A
2011-01-01
We are developing a simulator of peripheral nerve block utilizing a mixed-reality approach: the combination of a physical model, an MRI-derived virtual model, mechatronics and spatial tracking. Our design uses tangible (physical) interfaces to simulate surface anatomy, haptic feedback during needle insertion, mechatronic display of muscle twitch corresponding to the specific nerve stimulated, and visual and haptic feedback for the injection syringe. The twitch response is calculated incorporating the sensed output of a real neurostimulator. The virtual model is isomorphic with the physical model and is derived from segmented MRI data. This model provides the subsurface anatomy and, combined with electromagnetic tracking of a sham ultrasound probe and a standard nerve block needle, supports simulated ultrasound display and measurement of needle location and proximity to nerves and vessels. The needle tracking and virtual model also support objective performance metrics of needle targeting technique.
Miyashita, Kiyoteru; Oude Vrielink, Timo; Mylonas, George
2018-05-01
Endomicroscopy (EM) provides high resolution, non-invasive histological tissue information and can be used for scanning of large areas of tissue to assess cancerous and pre-cancerous lesions and their margins. However, current robotic solutions do not provide the accuracy and force sensitivity required to perform safe and accurate tissue scanning. A new surgical instrument has been developed that uses a cable-driven parallel mechanism (CPDM) to manipulate an EM probe. End-effector forces are determined by measuring the tensions in each cable. As a result, the instrument allows to accurately apply a contact force on a tissue, while at the same time offering high resolution and highly repeatable probe movement. 0.2 and 0.6 N force sensitivities were found for 1 and 2 DoF image acquisition methods, respectively. A back-stepping technique can be used when a higher force sensitivity is required for the acquisition of high quality tissue images. This method was successful in acquiring images on ex vivo liver tissue. The proposed approach offers high force sensitivity and precise control, which is essential for robotic EM. The technical benefits of the current system can also be used for other surgical robotic applications, including safe autonomous control, haptic feedback and palpation.
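Recovering the end-effector force from the measured cable tensions of a cable-driven parallel mechanism can be sketched as follows, under the simplifying assumptions of quasi-static equilibrium, negligible gravity, and known cable geometry; this is an illustration, not the instrument's actual estimator.

```python
# Hedged sketch: each cable pulls the end-effector toward its exit point, and at
# quasi-static equilibrium (gravity neglected) the probe pushes on the tissue
# with the cable resultant. Geometry and tension values are assumptions.
import numpy as np

def tissue_contact_force(end_effector_pos, cable_exit_points, tensions):
    """Estimate the force (N) the probe applies to the tissue from cable tensions."""
    p = np.asarray(end_effector_pos, dtype=float)
    exits = np.asarray(cable_exit_points, dtype=float)   # (N, 3) cable exit points
    t = np.asarray(tensions, dtype=float)                # (N,)  measured cable tensions
    directions = exits - p                               # cables pull toward their exits
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    return (t[:, None] * directions).sum(axis=0)

# Example: three cables, net pull (and thus tissue contact force) along +y.
print(tissue_contact_force([0, 0, 0], [[1, 0, 0], [-1, 0, 0], [0, 1, 0]], [2.0, 2.0, 1.0]))
```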
A Multi-Finger Interface with MR Actuators for Haptic Applications.
Qin, Huanhuan; Song, Aiguo; Gao, Zhan; Liu, Yuqing; Jiang, Guohua
2018-01-01
Haptic devices with multi-finger input are highly desirable in providing realistic and natural feelings when interacting with the remote or virtual environment. Compared with conventional actuators, MR (magneto-rheological) actuators are preferable options in haptics because of their larger passive torque and torque-volume ratios. Most of the existing haptic MR actuators, however, are still bulky and heavy. If they were smaller and lighter, they would become more suitable for haptics. In this paper, a small-scale yet powerful MR actuator was designed to build a multi-finger interface for a 6-DOF haptic device. The compact structure was achieved by adopting a multi-disc configuration. Based on this configuration, the MR actuator can generate a maximum torque of 480 N·mm with dimensions of only 36 mm diameter and 18 mm height. Performance evaluation showed that it can exhibit a relatively high dynamic range and good response characteristics when compared with some other haptic MR actuators. The multi-finger interface is equipped with three MR actuators and can provide up to 8 N of passive force to the thumb, index and middle fingers, respectively. An application example was used to demonstrate the effectiveness and potential of this new MR actuator-based interface.
Functional Contour-following via Haptic Perception and Reinforcement Learning.
Hellman, Randall B; Tekin, Cem; van der Schaar, Mihaela; Santos, Veronica J
2018-01-01
Many tasks involve the fine manipulation of objects despite limited visual feedback. In such scenarios, tactile and proprioceptive feedback can be leveraged for task completion. We present an approach for real-time haptic perception and decision-making for a haptics-driven, functional contour-following task: the closure of a ziplock bag. This task is challenging for robots because the bag is deformable, transparent, and visually occluded by artificial fingertip sensors that are also compliant. A deep neural net classifier was trained to estimate the state of a zipper within a robot's pinch grasp. A Contextual Multi-Armed Bandit (C-MAB) reinforcement learning algorithm was implemented to maximize cumulative rewards by balancing exploration versus exploitation of the state-action space. The C-MAB learner outperformed a benchmark Q-learner by more efficiently exploring the state-action space while learning a hard-to-code task. The learned C-MAB policy was tested with novel ziplock bag scenarios and contours (wire, rope). Importantly, this work contributes to the development of reinforcement learning approaches that account for limited resources such as hardware life and researcher time. As robots are used to perform complex, physically interactive tasks in unstructured or unmodeled environments, it becomes important to develop methods that enable efficient and effective learning with physical testbeds.
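The contextual multi-armed bandit idea can be illustrated with a generic epsilon-greedy sketch; the discretized contexts, action set, epsilon value, and reward signal are assumptions for illustration and do not reproduce the authors' C-MAB learner.

```python
# Generic epsilon-greedy contextual bandit sketch, illustrating the balance of
# exploration and exploitation per perceived state. This is not the authors'
# algorithm; contexts, actions, and rewards are placeholders.
import random
from collections import defaultdict

class ContextualBandit:
    def __init__(self, actions, epsilon=0.1):
        self.actions = actions
        self.epsilon = epsilon
        self.value = defaultdict(float)   # (context, action) -> running mean reward
        self.count = defaultdict(int)

    def select(self, context):
        if random.random() < self.epsilon:                                   # explore
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.value[(context, a)])     # exploit

    def update(self, context, action, reward):
        key = (context, action)
        self.count[key] += 1
        self.value[key] += (reward - self.value[key]) / self.count[key]

# Example: contexts are discretized zipper states, actions are gripper motions.
bandit = ContextualBandit(actions=["pinch_tighter", "slide_along", "reorient"])
a = bandit.select(context="zipper_centered")
bandit.update("zipper_centered", a, reward=1.0)
```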
A systematic review: the influence of real time feedback on wheelchair propulsion biomechanics.
Symonds, Andrew; Barbareschi, Giulia; Taylor, Stephen; Holloway, Catherine
2018-01-01
Clinical guidelines recommend that, in order to minimize upper limb injury risk, wheelchair users adopt a semi-circular pattern with a slow cadence and a large push arc. To examine whether real time feedback can be used to influence manual wheelchair propulsion biomechanics. Clinical trials and case series comparing the use of real time feedback against no feedback were included. A general review was performed and methodological quality assessed by two independent practitioners using the Downs and Black checklist. The review was completed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Six papers met the inclusion criteria. Selected studies involved 123 participants and analysed the effect of visual and, in one case, haptic feedback. Across the studies it was shown that participants were able to achieve significant changes in propulsion biomechanics when provided with real time feedback. However, the effect of targeting a single propulsion variable might lead to unwanted alterations in other parameters. Methodological assessment identified weaknesses in external validity. Visual feedback could be used to consistently increase push arc and decrease push rate, and may be the best focus for feedback training. Further investigation is required to assess such interventions during outdoor propulsion. Implications for Rehabilitation Upper limb pain and injuries are common secondary disorders that negatively affect wheelchair users' physical activity and quality of life. Clinical guidelines suggest that manual wheelchair users should aim to propel with a semi-circular pattern with a low push rate and a large push arc in order to minimise upper limb loading. Real time visual and haptic feedback are effective tools for improving propulsion biomechanics in both complete novices and experienced manual wheelchair users.
Casellato, Claudia; Pedrocchi, Alessandra; Zorzi, Giovanna; Vernisse, Lea; Ferrigno, Giancarlo; Nardocci, Nardo
2013-05-01
New insights suggest that dystonic motor impairments could also involve a deficit of sensory processing. In this framework, biofeedback, which makes covert physiological processes more overt, could be useful. The present work proposes an innovative integrated setup which provides the user with electromyogram (EMG)-based visual-haptic biofeedback during upper limb movements (spiral tracking tasks), to test whether augmented sensory feedback can induce motor control improvement in patients with primary dystonia. The real-time control algorithm, developed ad hoc, synchronizes the haptic loop with the EMG reading; the brachioradialis EMG values were used to modify visual and haptic features of the interface: the higher the EMG level, the higher the virtual table friction and the further the background color shifted from green to red. From recordings on dystonic and healthy subjects, statistical results showed that biofeedback has a significant impact, correlated with the local impairment, on dystonic muscular control. These tests point to the effectiveness of biofeedback paradigms in gaining better specific-muscle voluntary motor control. The flexible tool developed here shows promising prospects for clinical application and sensorimotor rehabilitation.
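A minimal sketch of the EMG-to-feedback mapping described above (higher EMG raises the virtual table friction and shifts the background from green to red); the ranges and gains are illustrative, not those of the original setup.

```python
def emg_to_feedback(emg_level, emg_max=1.0, friction_min=0.1, friction_max=2.0):
    """Map a normalized EMG envelope to table friction and an RGB background colour.

    emg_level is assumed to be a rectified, low-pass-filtered EMG value; the
    ranges and gains here are illustrative, not those of the original setup.
    """
    x = max(0.0, min(emg_level / emg_max, 1.0))           # normalize to [0, 1]
    friction = friction_min + x * (friction_max - friction_min)
    colour = (int(255 * x), int(255 * (1.0 - x)), 0)      # green -> red
    return friction, colour

print(emg_to_feedback(0.25))   # low activation: low friction, mostly green
print(emg_to_feedback(0.90))   # high activation: high friction, mostly red
```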
Introduction to haptics for neurosurgeons.
L'Orsa, Rachael; Macnab, Chris J B; Tavakoli, Mahdi
2013-01-01
Robots are becoming increasingly relevant to neurosurgeons, extending a neurosurgeon's physical capabilities, improving navigation within the surgical landscape when combined with advanced imaging, and propelling the movement toward minimally invasive surgery. Most surgical robots, however, isolate surgeons from the full range of human senses during a procedure. This forces surgeons to rely on vision alone for guidance through the surgical corridor, which limits the capabilities of the system, requires significant operator training, and increases the surgeon's workload. Incorporating haptics into these systems, ie, enabling the surgeon to "feel" forces experienced by the tool tip of the robot, could render these limitations obsolete by making the robot feel more like an extension of the surgeon's own body. Although the use of haptics in neurosurgical robots is still mostly the domain of research, neurosurgeons who keep abreast of this emerging field will be more prepared to take advantage of it as it becomes more prevalent in operating theaters. Thus, this article serves as an introduction to the field of haptics for neurosurgeons. We not only outline the current and future benefits of haptics but also introduce concepts in the fields of robotic technology and computer control. This knowledge will allow readers to be better aware of limitations in the technology that can affect performance and surgical outcomes, and "knowing the right questions to ask" will be invaluable for surgeons who have purchasing power within their departments.
Modal-Power-Based Haptic Motion Recognition
NASA Astrophysics Data System (ADS)
Kasahara, Yusuke; Shimono, Tomoyuki; Kuwahara, Hiroaki; Sato, Masataka; Ohnishi, Kouhei
Motion recognition based on sensory information is important for providing assistance to humans using robots. Several studies have been carried out on motion recognition based on image information. However, human motions involving contact with an object cannot be evaluated precisely by image-based recognition, because force information is essential for describing contact motion. In this paper, modal-power-based haptic motion recognition is proposed; modal power is considered to reveal information on both position and force, and is regarded as one of the defining features of human motion. A motion recognition algorithm based on linear discriminant analysis is proposed to distinguish between similar motions. Haptic information is extracted using a bilateral master-slave system, and the observed motion is then decomposed in terms of primitive functions in a modal space. The experimental results show the effectiveness of the proposed method.
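A minimal sketch of the general idea of modal-power features fed to a linear discriminant classifier, assuming a simple cosine modal basis; the basis, feature layout, and training data are illustrative, not the paper's formulation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def modal_power(signal, n_modes=5):
    """Project a 1-D trajectory onto cosine basis functions and return per-mode power."""
    n = len(signal)
    t = np.linspace(0.0, np.pi, n)
    basis = np.array([np.cos(k * t) for k in range(1, n_modes + 1)])  # (n_modes, n)
    coeffs = basis @ np.asarray(signal, dtype=float) / n
    return coeffs ** 2

def features(position, force, n_modes=5):
    # Concatenate modal power of the position and force channels into one feature vector.
    return np.concatenate([modal_power(position, n_modes), modal_power(force, n_modes)])

# Hypothetical training data: each row is one recorded motion from a bilateral system.
rng = np.random.default_rng(0)
X = np.array([features(rng.standard_normal(200), rng.standard_normal(200)) for _ in range(40)])
y = np.array([0, 1] * 20)   # two similar motion classes

clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict(X[:3]))
```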
NASA Technical Reports Server (NTRS)
Wincheski, Buzz; Smits, Jan; Namkung, Min; Ingram, JoAnne; Watkins, Neal; Jordan, Jeffrey D.; Louie, Richard
2002-01-01
Carbon nanotubes (CNTs) offer great potential for advanced sensor development due to the unique electronic transport properties of the material. However, a significant obstacle to the realization of practical CNT devices is the formation of reliable and reproducible CNT-to-metal contacts. In this work, scanning probe techniques are explored both for the fabrication of metallic junctions and for the positioning of single-walled CNTs across these junctions. A haptic force-feedback interface to a scanning probe microscope enables movement of nanotubes over micron length scales with nanometer precision. In this case, imaging of the surface is performed with light or intermittent contact with the surface. Increased tip-to-sample interaction forces are then applied to either create junctions or position CNTs. The effect of functionalization of substrate surfaces on the movement and tribology of the materials is also studied. The application of these techniques to the fabrication of CNT-based sensors for nondestructive evaluation applications is discussed.
Liu, H; Puangmali, P; Zbyszewski, D; Elhage, O; Dasgupta, P; Dai, J S; Seneviratne, L; Althoefer, K
2010-01-01
This paper presents a novel wheeled probe for aiding a surgeon in soft-tissue abnormality identification during minimally invasive surgery (MIS), compensating for the loss of haptic feedback commonly associated with MIS. Initially, a prototype for validating the concept was developed. The wheeled probe consists of an indentation depth sensor employing an optic fibre sensing scheme and a force/torque sensor. The two sensors work in unison, allowing the wheeled probe to measure the tool-tissue interaction force and the rolling indentation depth concurrently. The indentation depth sensor was developed and initially tested on a homogeneous silicone phantom representing a good model of a soft-tissue organ; the results show that the sensor can accurately measure the indentation depths occurring during rolling indentation, with good repeatability. To validate the ability of the wheeled probe to identify abnormalities located in the tissue, the device was tested on a silicone phantom containing embedded hard nodules. The experimental data demonstrate that, by recording the tissue reaction force together with the rolling indentation depth, the wheeled probe can rapidly identify the distribution of tissue stiffness and accurately locate the embedded hard nodules.
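A minimal sketch of how synchronized force and rolling-indentation-depth samples could be turned into a local stiffness profile in which hard nodules stand out; the function, thresholds, and data are illustrative.

```python
import numpy as np

def stiffness_profile(forces, depths, depth_floor=1e-4):
    """Estimate local tissue stiffness (N/m) along a rolling indentation path.

    forces and depths are synchronized samples of reaction force (N) and
    indentation depth (m); depths below depth_floor are ignored to avoid
    dividing by near-zero values.
    """
    forces = np.asarray(forces, dtype=float)
    depths = np.asarray(depths, dtype=float)
    stiffness = np.full_like(forces, np.nan)
    valid = depths > depth_floor
    stiffness[valid] = forces[valid] / depths[valid]
    return stiffness

# A hard nodule shows up as a local rise in the stiffness profile.
f = [0.10, 0.11, 0.25, 0.26, 0.12]
d = [0.002, 0.002, 0.002, 0.002, 0.002]
print(stiffness_profile(f, d))   # the middle samples stand out
```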
A force transmission system based on a tulip-shaped electrostatic clutch for haptic display devices
NASA Astrophysics Data System (ADS)
Sasaki, Hikaru; Shikida, Mitsuhiro; Sato, Kazuo
2006-12-01
This paper describes a novel type of force transmission system for haptic display devices. The system consists of an array of end-effector elements, a force/displacement transmitter, and a single actuator producing a large force/displacement. It uses tulip-shaped electrostatic clutch devices to distribute the force/displacement from the actuator among the individual end-effectors. The specifications of the three components were determined so as to stimulate the human fingers touching the display. The components were fabricated using micro-electromechanical systems and conventional machining technologies, and were finally assembled by hand. The performance of the assembled transmission system was examined experimentally, and it was confirmed that each projection in the arrayed end-effectors could be moved individually. The actuator in a system whose total size was only 3.0 cm × 3.0 cm × 4.0 cm produced a 600 mN force and displaced individual array elements by 18 µm.
In vivo soft tissue differentiation by diffuse reflectance spectroscopy: preliminary results
NASA Astrophysics Data System (ADS)
Zam, Azhar; Stelzle, Florian; Tangermann-Gerk, Katja; Adler, Werner; Nkenke, Emeka; Neukam, Friedrich Wilhelm; Schmidt, Michael; Douplik, Alexandre
Remote laser surgery does not provide the haptic feedback needed to operate layer by layer and preserve vulnerable anatomical structures such as nerve tissue or blood vessels. The aim of this study is the identification of soft tissue in vivo by diffuse reflectance spectroscopy, to provide the basis for a feedback control system that enhances nerve preservation in oral and maxillofacial laser surgery. Various soft tissues can be identified by diffuse reflectance spectroscopy in vivo. The results may provide the basis for a feedback system to prevent nerve damage during oral and maxillofacial laser surgery.
Sornkarn, Nantachai; Nanayakkara, Thrishantha
2017-01-01
When humans are asked to palpate a soft tissue to locate a hard nodule, they regulate the stiffness, speed, and force of the finger during examination. If we understand the relationship between these behavioral variables and haptic information gain (transfer entropy) during manual probing, we can improve the efficacy of soft robotic probes for soft tissue palpation, such as in tumor localization in minimally invasive surgery. Here, we recorded the muscle co-contraction activity of the finger using EMG sensors to address the question as to whether joint stiffness control during manual palpation plays an important role in the haptic information gain. To address this question, we used a soft robotic probe with a controllable stiffness joint and a force sensor mounted at the base to represent the function of the tendon in a biological finger. Then, we trained a Markov chain using muscle co-contraction patterns of human subjects, and used it to control the stiffness of the soft robotic probe in the same soft tissue palpation task. The soft robotic experiments showed that haptic information gain about the depth of the hard nodule can be maximized by varying the internal stiffness of the soft probe.
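A minimal sketch of driving a variable-stiffness probe joint from a Markov chain over discretized co-contraction levels, as described above; the stiffness levels and transition matrix are illustrative, not the values trained from human data.

```python
import numpy as np

# Discretized stiffness levels for the probe joint (illustrative values, N·m/rad).
STIFFNESS_LEVELS = [0.2, 0.5, 1.0]

# Row-stochastic transition matrix; in practice this would be estimated from
# human EMG co-contraction sequences recorded during manual palpation.
P = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

def stiffness_schedule(n_steps, start_state=0, rng=None):
    """Sample a sequence of joint-stiffness commands from the Markov chain."""
    rng = rng or np.random.default_rng()
    state, schedule = start_state, []
    for _ in range(n_steps):
        schedule.append(STIFFNESS_LEVELS[state])
        state = rng.choice(len(STIFFNESS_LEVELS), p=P[state])
    return schedule

print(stiffness_schedule(10, rng=np.random.default_rng(1)))
```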
Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.
2016-01-01
This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3–25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension. PMID:27357966
Haptic interfaces: Hardware, software and human performance
NASA Technical Reports Server (NTRS)
Srinivasan, Mandayam A.
1995-01-01
Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.
Perception and Haptic Rendering of Friction Moments.
Kawasaki, H; Ohtuka, Y; Koide, S; Mouri, T
2011-01-01
This paper considers moments due to friction forces on the human fingertip. A computational technique called the friction moment arc method is presented. The method computes the static and/or dynamic friction moment independent of a friction force calculation. In addition, a new finger holder to display friction moment is presented. This device incorporates a small brushless motor and disk, and connects the human's finger to an interface finger of the five-fingered haptic interface robot HIRO II. Subjects' perception of friction moment while wearing the finger holder, as well as perceptions during object manipulation in a virtual reality environment, were evaluated experimentally.
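The friction moment arc method itself is not detailed in the abstract; below is a generic Coulomb-type stick/slip spin-friction model for a fingertip contact, offered only as a baseline illustration of computing a friction moment, not as the paper's method. All parameter values are illustrative.

```python
import math

def coulomb_friction_moment(applied_moment, normal_force, omega,
                            mu_spin=0.3, contact_radius=0.005, eps=1e-6):
    """Generic Coulomb-type spin-friction moment at a fingertip contact (N·m).

    Standard stick/slip behaviour: below the static limit the contact resists
    the applied moment; otherwise dynamic friction opposes the relative spin
    omega (rad/s). This is not the friction moment arc method of the paper.
    """
    limit = mu_spin * normal_force * contact_radius
    if abs(omega) < eps:                      # sticking: resist up to the static limit
        return -max(-limit, min(applied_moment, limit))
    return -math.copysign(limit, omega)       # slipping: friction opposes spin

print(coulomb_friction_moment(applied_moment=0.0008, normal_force=1.0, omega=0.0))
print(coulomb_friction_moment(applied_moment=0.0, normal_force=1.0, omega=2.0))
```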
Whitwell, Robert L.; Ganel, Tzvi; Byrne, Caitlin M.; Goodale, Melvyn A.
2015-01-01
Investigators study the kinematics of grasping movements (prehension) under a variety of conditions to probe visuomotor function in normal and brain-damaged individuals. “Natural” prehensile acts are directed at the goal object and are executed using real-time vision. Typically, they also entail the use of tactile, proprioceptive, and kinesthetic sources of haptic feedback about the object (“haptics-based object information”) once contact with the object has been made. Natural and simulated (pantomimed) forms of prehension are thought to recruit different cortical structures: patient DF, who has visual form agnosia following bilateral damage to her temporal-occipital cortex, loses her ability to scale her grasp aperture to the size of targets (“grip scaling”) when her prehensile movements are based on a memory of a target previewed 2 s before the cue to respond or when her grasps are directed towards a visible virtual target but she is denied haptics-based information about the target. In the first of two experiments, we show that when DF performs real-time pantomimed grasps towards a 7.5 cm displaced imagined copy of a visible object such that her fingers make contact with the surface of the table, her grip scaling is in fact quite normal. This finding suggests that real-time vision and terminal tactile feedback are sufficient to preserve DF’s grip scaling slopes. In the second experiment, we examined an “unnatural” grasping task variant in which a tangible target (along with any proxy such as the surface of the table) is denied (i.e., no terminal tactile feedback). To do this, we used a mirror-apparatus to present virtual targets with and without a spatially coincident copy for the participants to grasp. We compared the grasp kinematics from trials with and without terminal tactile feedback to a real-time-pantomimed grasping task (one without tactile feedback) in which participants visualized a copy of the visible target as instructed in our laboratory in the past. Compared to natural grasps, removing tactile feedback increased RT, slowed the velocity of the reach, reduced in-flight grip aperture, increased the slopes relating grip aperture to target width, and reduced the final grip aperture (FGA). All of these effects were also observed in the real time-pantomime grasping task. These effects seem to be independent of those that arise from using the mirror in general as we also compared grasps directed towards virtual targets to those directed at real ones viewed directly through a pane of glass. These comparisons showed that the grasps directed at virtual targets increased grip aperture, slowed the velocity of the reach, and reduced the slopes relating grip aperture to the widths of the target. Thus, using the mirror has real consequences on grasp kinematics, reflecting the importance of task-relevant sources of online visual information for the programming and updating of natural prehensile movements. Taken together, these results provide compelling support for the view that removing terminal tactile feedback, even when the grasps are target-directed, induces a switch from real-time visual control towards one that depends more on visual perception and cognitive supervision. Providing terminal tactile feedback and real-time visual information can evidently keep the dorsal visuomotor system operating normally for prehensile acts. PMID:25999834
Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?
Botden, Sanne M.B.I.; Buzink, Sonja N.; Schijven, Marlies P.
2007-01-01
Background Virtual reality (VR) is an emerging new modality for laparoscopic skills training; however, most simulators lack realistic haptic feedback. Augmented reality (AR) is a new laparoscopic simulation system offering a combination of physical objects and VR simulation. Laparoscopic instruments are used within a hybrid mannequin on tissue or objects while using video tracking. This study was designed to assess the difference in realism, haptic feedback, and didactic value between AR and VR laparoscopic simulation. Methods The ProMIS AR and LapSim VR simulators were used in this study. The participants performed a basic skills task and a suturing task on both simulators, after which they filled out a questionnaire about their demographics and their opinion of both simulators, scored on a 5-point Likert scale. The participants were allocated to 3 groups depending on their experience: experts, intermediates, and novices. Significant differences were calculated with the paired t-test. Results There was general consensus in all groups that the ProMIS AR laparoscopic simulator is more realistic than the LapSim VR laparoscopic simulator in both the basic skills task (mean 4.22 resp. 2.18, P < 0.001) and the suturing task (mean 4.15 resp. 1.85, P < 0.001). The ProMIS is regarded as having better haptic feedback (mean 3.92 resp. 1.92, P < 0.001) and as being more useful for training surgical residents (mean 4.51 resp. 2.94, P < 0.001). Conclusions In comparison with the VR simulator, the AR laparoscopic simulator was regarded by all participants as a better simulator for laparoscopic skills training on all tested features. PMID:17361356
Teaching bovine abdominal anatomy: use of a haptic simulator.
Kinnison, Tierney; Forrest, Neil David; Frean, Stephen Philip; Baillie, Sarah
2009-01-01
Traditional methods of teaching anatomy to undergraduate medical and veterinary students are being challenged and need to adapt to modern concerns and requirements. There is a move away from the use of cadavers to new technologies as a way of complementing the traditional approaches and addressing resource and ethical problems. Haptic (touch) technology, which allows the student to feel a 3D computer-generated virtual environment, provides a novel way to address some of these challenges. To evaluate the practicalities and usefulness of a haptic simulator, first year veterinary students at the Royal Veterinary College, University of London, were taught basic bovine abdominal anatomy using a rectal palpation simulator: "The Haptic Cow." Over two days, 186 students were taught in small groups and 184 provided feedback via a questionnaire. The results were positive; the majority of students considered that the simulator had been useful for appreciating both the feel and location of key internal anatomical structures, had helped with their understanding of bovine abdominal anatomy and 3D visualization, and the tutorial had been enjoyable. The students were mostly in favor of the small group tutorial format, but some requested more time on the simulator. The findings indicate that the haptic simulator is an engaging way of teaching bovine abdominal anatomy to a large number of students in an efficient manner without using cadavers, thereby addressing some of the current challenges in anatomy teaching.
The role of visuohaptic experience in visually perceived depth.
Ho, Yun-Xian; Serwe, Sascha; Trommershäuser, Julia; Maloney, Laurence T; Landy, Michael S
2009-06-01
Berkeley suggested that "touch educates vision," that is, haptic input may be used to calibrate visual cues to improve visual estimation of properties of the world. Here, we test whether haptic input may be used to "miseducate" vision, causing observers to rely more heavily on misleading visual cues. Human subjects compared the depth of two cylindrical bumps illuminated by light sources located at different positions relative to the surface. As in previous work using judgments of surface roughness, we find that observers judge bumps to have greater depth when the light source is located eccentric to the surface normal (i.e., when shadows are more salient). Following several sessions of visual judgments of depth, subjects then underwent visuohaptic training in which haptic feedback was artificially correlated with the "pseudocue" of shadow size and artificially decorrelated with disparity and texture. Although there were large individual differences, almost all observers demonstrated integration of haptic cues during visuohaptic training. For some observers, subsequent visual judgments of bump depth were unaffected by the training. However, for 5 of 12 observers, training significantly increased the weight given to pseudocues, causing subsequent visual estimates of shape to be less veridical. We conclude that haptic information can be used to reweight visual cues, putting more weight on misleading pseudocues, even when more trustworthy visual cues are available in the scene.
A mixed reality simulator for feline abdominal palpation training in veterinary medicine.
Parkes, Rebecca; Forrest, Neil; Baillie, Sarah
2009-01-01
The opportunities for veterinary students to practice feline abdominal palpation are limited as cats have a low tolerance to being examined. Therefore, a mixed reality simulator was developed to complement clinical training. Two PHANToM premium haptic devices were positioned either side of a modified toy cat. Virtual models of the chest and some abdominal contents were superimposed on the physical model. The haptic properties of the virtual models were set by seven veterinarians; values were adjusted while the simulation was being palpated until the representation was satisfactory. Feedback from the veterinarians was encouraging suggesting that the simulator has a potential role in student training.
Modeling of Explorative Procedures for Remote Object Identification
1991-09-01
Representation of human search models is achieved by using the proprioceptive component of the haptic sensory system and the simulated foveal component of the visual system. The tools employed are a force-reflecting telemanipulator and a computer-simulated visual foveal component. Eventually this will allow multiple applications in remote sensing and the superposition of sensory channels.
Post-procedural evaluation of catheter contact force characteristics
NASA Astrophysics Data System (ADS)
Koch, Martin; Brost, Alexander; Kiraly, Atilla; Strobel, Norbert; Hornegger, Joachim
2012-03-01
Minimally invasive catheter ablation of electric foci, performed in electrophysiology labs, is an attractive treatment option for atrial fibrillation (AF) - in particular if drug therapy is no longer effective or tolerated. There are different strategies to eliminate the electric foci inducing the arrhythmia. Independent of the particular strategy, it is essential to place transmural lesions. The impact of catheter contact force on the generated lesion quality has been investigated recently, and first results are promising. There are different approaches to measure catheter-tissue contact. Besides traditional haptic feedback, there are new technologies either relying on catheter tip-to-tissue contact force or on local impedance measurements at the tip of the catheter. In this paper, we present a novel tool for post-procedural ablation point evaluation and visualization of contact force characteristics. Our method is based on localizing ablation points set during AF ablation procedures. The 3-D point positions are stored together with lesion specific catheter contact force (CF) values recorded during the ablation. The force records are mapped to the spatial 3-D positions, where the energy has been applied. The tracked positions of the ablation points can be further used to generate a 3-D mesh model of the left atrium (LA). Since our approach facilitates visualization of different force characteristics for post-procedural evaluation and verification, it has the potential to improve outcome by highlighting areas where lesion quality may be less than desired.
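A minimal sketch of attaching recorded contact-force statistics to 3-D ablation point positions for post-procedural review; the data layout, units, and low-contact threshold are hypothetical.

```python
import numpy as np

def summarize_contact_force(ablation_points, force_logs, low_contact_threshold=10.0):
    """Attach simple contact-force statistics to each ablation point.

    ablation_points : (n, 3) array of 3-D positions where energy was applied
    force_logs      : list of 1-D arrays of force samples recorded at each point
                      (units as logged by the system; the threshold is illustrative)
    Returns a list of dicts that a viewer could colour-code on the LA mesh.
    """
    summary = []
    for pos, forces in zip(np.asarray(ablation_points, dtype=float), force_logs):
        forces = np.asarray(forces, dtype=float)
        summary.append({
            "position": pos,
            "mean_force": float(forces.mean()),
            "min_force": float(forces.min()),
            "low_contact_fraction": float((forces < low_contact_threshold).mean()),
        })
    return summary

points = [[10.2, -3.1, 45.0], [11.0, -2.7, 44.2]]
logs = [np.array([12.0, 15.0, 9.0]), np.array([25.0, 22.0, 27.0])]
for s in summarize_contact_force(points, logs):
    print(s["position"], round(s["mean_force"], 1), s["low_contact_fraction"])
```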
Design and Evaluation of Shape-Changing Haptic Interfaces for Pedestrian Navigation Assistance.
Spiers, Adam J; Dollar, Aaron M
2017-01-01
Shape-changing interfaces are a category of device capable of altering their form in order to facilitate communication of information. In this work, we present a shape-changing device that has been designed for navigation assistance. 'The Animotus' (previously 'The Haptic Sandwich') resembles a cube with an articulated upper half that is able to rotate and extend (translate) relative to the bottom half, which is fixed in the user's grasp. This rotation and extension, generally felt via the user's fingers, is used to represent heading and proximity to navigational targets. The device is intended to provide an alternative to screen- or audio-based interfaces for visually impaired, hearing impaired, deafblind, and sighted pedestrians. The motivation and design of the haptic device are presented, followed by the results of a navigation experiment that aimed to determine the role of each device DOF in facilitating guidance. An additional device, 'The Haptic Taco', which modulated its volume in response to target proximity (negating directional feedback), was also compared. Results indicate that while the heading (rotational) DOF benefited motion efficiency, the proximity (translational) DOF benefited velocity. Combining the two DOFs improved overall performance. The volumetric Taco performed comparably to the Animotus' extension DOF.
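A minimal sketch of the heading-to-rotation and proximity-to-extension mapping described above; the ranges and saturation limits are illustrative, not the Animotus' actual parameters.

```python
def animotus_command(heading_error_deg, distance_m,
                     max_rotation_deg=30.0, max_extension_mm=12.0, max_range_m=20.0):
    """Map navigation state to shape-change commands (illustrative ranges).

    heading_error_deg : signed angle between current heading and the target bearing
    distance_m        : remaining distance to the navigational target
    Rotation encodes heading, extension encodes proximity, as in the device concept.
    """
    rotation = max(-max_rotation_deg, min(heading_error_deg, max_rotation_deg))
    proximity = min(distance_m, max_range_m) / max_range_m      # 1.0 = far, 0.0 = arrived
    extension = proximity * max_extension_mm
    return rotation, extension

print(animotus_command(heading_error_deg=12.0, distance_m=5.0))    # slight turn, close
print(animotus_command(heading_error_deg=-50.0, distance_m=25.0))  # saturated turn, far
```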
Performance Evaluation of Passive Haptic Feedback for Tactile HMI Design in CAVEs.
Lassagne, Antoine; Kemeny, Andras; Posselt, Javier; Merienne, Frederic
2018-01-01
This article presents a comparison of different haptic systems, which are designed to simulate flat Human Machine Interfaces (HMIs) like touchscreens in virtual environments (VEs) such as CAVEs, and their respective performance. We compare a tangible passive transparent slate to a classic tablet and a sensory substitution system. These systems were tested during a controlled experiment. The performance and impressions from 20 subjects were collected to understand more about the modalities in the given context. The results show that the preferences of the subjects are strongly related to the use-cases and needs. In terms of performance, passive haptics proved to be significantly useful, acting as a space reference and a real-time continuous calibration system, allowing subjects to have lower execution durations and relative errors. Sensory substitution induced perception drifts during the experiment, causing significant performance disparities, demonstrating the low robustness of perception when spatial cues are insufficiently available. Our findings offer a better understanding on the nature of perception drifts and the need of strong multisensory spatial markers for such use-cases in CAVEs. The importance of a relevant haptic modality specifically designed to match a precise use-case is also emphasized.
Frisoli, Antonio; Solazzi, Massimiliano; Reiner, Miriam; Bergamasco, Massimo
2011-06-30
The aim of this study was to understand the integration of cutaneous and kinesthetic sensory modalities in haptic perception of shape orientation. A specific robotic apparatus was employed to simulate the exploration of virtual surfaces by active touch with two fingers, with kinesthetic-only, cutaneous-only, and combined sensory feedback. The cutaneous feedback was capable of displaying the local surface orientation at the contact point through a small plate indenting the fingerpad at contact. A psychophysical test was conducted with SDT methodology on 6 subjects to assess the discrimination threshold of angle perception between two parallel surfaces, with three sensory modalities and two shape sizes. Results show that the cutaneous sensory modality is not affected by the size of the shape, whereas kinesthetic performance decreases with smaller sizes. Cutaneous and kinesthetic sensory cues are integrated according to a Bayesian model, so that the combined sensory stimulation always performs better than either single modality alone. Copyright © 2010 Elsevier Inc. All rights reserved.
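A worked example of the standard minimum-variance (Bayesian, Gaussian) cue-combination rule referred to above, under which the combined estimate is never less reliable than either cue alone; the numbers are illustrative.

```python
def combine_cues(est_cutaneous, var_cutaneous, est_kinesthetic, var_kinesthetic):
    """Minimum-variance (Bayesian, Gaussian) combination of two angle estimates.

    Each cue contributes in proportion to its reliability (inverse variance);
    the combined variance is never larger than that of either cue alone.
    """
    w_c = (1.0 / var_cutaneous) / (1.0 / var_cutaneous + 1.0 / var_kinesthetic)
    w_k = 1.0 - w_c
    combined = w_c * est_cutaneous + w_k * est_kinesthetic
    combined_var = 1.0 / (1.0 / var_cutaneous + 1.0 / var_kinesthetic)
    return combined, combined_var

# Illustrative: cutaneous estimate 10 deg (variance 4), kinesthetic 14 deg (variance 9).
print(combine_cues(10.0, 4.0, 14.0, 9.0))   # -> (~11.2 deg, ~2.77 deg^2)
```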
Vibration Influences Haptic Perception of Surface Compliance During Walking
Visell, Yon; Giordano, Bruno L.; Millet, Guillaume; Cooperstock, Jeremy R.
2011-01-01
Background The haptic perception of ground compliance is used for stable regulation of dynamic posture and the control of locomotion in diverse natural environments. Although rarely investigated in relation to walking, vibrotactile sensory channels are known to be active in the discrimination of material properties of objects and surfaces through touch. This study investigated how the perception of ground surface compliance is altered by plantar vibration feedback. Methodology/Principal Findings Subjects walked in shoes over a rigid floor plate that provided plantar vibration feedback, and responded indicating how compliant it felt, either in subjective magnitude or via pairwise comparisons. In one experiment, the compliance of the floor plate was also varied. Results showed that perceived compliance of the plate increased monotonically with vibration feedback intensity, and depended to a lesser extent on the temporal or frequency distribution of the feedback. When both plate stiffness (inverse compliance) and vibration amplitude were manipulated, the effect persisted, with both factors contributing to compliance perception. A significant influence of vibration was observed even for amplitudes close to psychophysical detection thresholds. Conclusions/Significance These findings reveal that vibrotactile sensory channels are highly salient to the perception of surface compliance, and suggest that correlations between vibrotactile sensory information and motor activity may be of broader significance for the control of human locomotion than has been previously acknowledged. PMID:21464979
Robotics, motor learning, and neurologic recovery.
Reinkensmeyer, David J; Emken, Jeremy L; Cramer, Steven C
2004-01-01
Robotic devices are helping shed light on human motor control in health and injury. By using robots to apply novel force fields to the arm, investigators are gaining insight into how the nervous system models its external dynamic environment. The nervous system builds internal models gradually by experience and uses them in combination with impedance and feedback control strategies. Internal models are robust to environmental and neural noise, generalized across space, implemented in multiple brain regions, and developed in childhood. Robots are also being used to assist in repetitive movement practice following neurologic injury, providing insight into movement recovery. Robots can haptically assess sensorimotor performance, administer training, quantify amount of training, and improve motor recovery. In addition to providing insight into motor control, robotic paradigms may eventually enhance motor learning and rehabilitation beyond the levels possible with conventional training techniques.
Ferre, Manuel; Galiana, Ignacio; Aracil, Rafael
2011-01-01
This paper describes the design and calibration of a thimble that measures the forces applied by a user during manipulation of virtual and real objects. Haptic devices benefit from force measurement capabilities at their end-point. However, the heavy weight and cost of force sensors prevent their widespread incorporation in these applications. The design of a lightweight, user-adaptable, and cost-effective thimble with four contact force sensors is described in this paper. The sensors are calibrated before being placed in the thimble to provide normal and tangential forces. Normal forces are exerted directly by the fingertip and thus can be properly measured. Tangential forces are estimated by sensors strategically placed in the thimble sides. Two applications are provided in order to facilitate an evaluation of sensorized thimble performance. These applications focus on: (i) force signal edge detection, which determines task segmentation of virtual object manipulation, and (ii) the development of complex object manipulation models, wherein the mechanical features of a real object are obtained and these features are then reproduced for training by means of virtual object manipulation.
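A minimal sketch of force-signal edge detection for task segmentation, as in application (i) above; the threshold and debouncing window are illustrative.

```python
import numpy as np

def force_edges(force, threshold=0.5, min_gap=3):
    """Return sample indices where the fingertip force crosses a threshold.

    force     : 1-D array of (e.g. normal) force samples in newtons
    threshold : contact/no-contact decision level
    min_gap   : minimum number of samples between reported edges (debouncing)
    """
    force = np.asarray(force, dtype=float)
    above = force > threshold
    crossings = np.flatnonzero(np.diff(above.astype(int)) != 0) + 1
    edges, last = [], -min_gap
    for idx in crossings:
        if idx - last >= min_gap:
            edges.append(idx)
            last = idx
    return edges

signal = [0.0, 0.1, 0.8, 1.2, 1.1, 0.9, 0.2, 0.1, 0.0, 0.7, 1.0]
print(force_edges(signal))   # e.g. grasp onset, release, and re-grasp indices
```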
Calibrating Reach Distance to Visual Targets
ERIC Educational Resources Information Center
Mon-Williams, Mark; Bingham, Geoffrey P.
2007-01-01
The authors investigated the calibration of reach distance by gradually distorting the haptic feedback obtained when participants grasped visible target objects. The authors found that the modified relationship between visually specified distance and reach distance could be captured by a straight-line mapping function. Thus, the relation could be…
Magdalon, Eliane C; Michaelsen, Stella M; Quevedo, Antonio A; Levin, Mindy F
2011-09-01
Virtual reality (VR) technology is being used with increasing frequency as a training medium for motor rehabilitation. However, before addressing training effectiveness in virtual environments (VEs), it is necessary to identify if movements made in such environments are kinematically similar to those made in physical environments (PEs) and the effect of provision of haptic feedback on these movement patterns. These questions are important since reach-to-grasp movements may be inaccurate when visual or haptic feedback is altered or absent. Our goal was to compare kinematics of reaching and grasping movements to three objects performed in an immersive three-dimensional (3D) VE with haptic feedback (cyberglove/grasp system) viewed through a head-mounted display to those made in an equivalent physical environment (PE). We also compared movements in PE made with and without wearing the cyberglove/grasp haptic feedback system. Ten healthy subjects (8 women, 62.1 ± 8.8 years) reached and grasped objects requiring 3 different grasp types (can, diameter 65.6 mm, cylindrical grasp; screwdriver, diameter 31.6 mm, power grasp; pen, diameter 7.5 mm, precision grasp) in PE and visually similar virtual objects in VE. Temporal and spatial arm and trunk kinematics were analyzed. Movements were slower and grip apertures were wider when wearing the glove in both the PE and the VE compared to movements made in the PE without the glove. When wearing the glove, subjects used similar reaching trajectories in both environments, preserved the coordination between reaching and grasping and scaled grip aperture to object size for the larger object (cylindrical grasp). However, in VE compared to PE, movements were slower and had longer deceleration times, elbow extension was greater when reaching to the smallest object and apertures were wider for the power and precision grip tasks. Overall, the differences in spatial and temporal kinematics of movements between environments were greater than those due only to wearing the cyberglove/grasp system. Differences in movement kinematics due to the viewing environment were likely due to a lack of prior experience with the virtual environment, an uncertainty of object location and the restricted field-of-view when wearing the head-mounted display. The results can be used to inform the design and disposition of objects within 3D VEs for the study of the control of prehension and for upper limb rehabilitation. Copyright © 2011 Elsevier B.V. All rights reserved.
Haptic interfaces using dielectric electroactive polymers
NASA Astrophysics Data System (ADS)
Ozsecen, Muzaffer Y.; Sivak, Mark; Mavroidis, Constantinos
2010-04-01
Quality, amplitude, and frequency of the interaction forces between a human and an actuator are essential traits for haptic applications. A variety of Electro-Active Polymer (EAP) based actuators can provide these characteristics simultaneously with quiet operation, low weight, high power density, and fast response. This paper demonstrates a rolled Dielectric Elastomer Actuator (DEA) being used as a telepresence device in a heart beat measurement application. In this testing, heart signals were acquired from a remote location using a wireless heart rate sensor and sent through a network, and the DEA was used to haptically reproduce the heart beats at the medical expert's location. A series of preliminary human subject tests demonstrated that a) DEA-based haptic feedback can be used in heart beat measurement tests and b) through subjective testing, the stiffness and actuator properties of the EAP can be tuned for a variety of applications.
High-level virtual reality simulator for endourologic procedures of lower urinary tract.
Reich, Oliver; Noll, Margarita; Gratzke, Christian; Bachmann, Alexander; Waidelich, Raphaela; Seitz, Michael; Schlenker, Boris; Baumgartner, Reinhold; Hofstetter, Alfons; Stief, Christian G
2006-06-01
To analyze the limitations of existing simulators for urologic techniques, and then test and evaluate a novel virtual reality (VR) simulator for endourologic procedures of the lower urinary tract. Surgical simulation using VR has the potential to have a tremendous impact on surgical training, testing, and certification. Endourologic procedures seem to be an ideal target for VR systems. The URO-Trainer features genuine VR, obtained from digital video footage of more than 400 endourologic diagnostic and therapeutic procedures, as well as data from cross-sectional imaging. The software offers infinite random variations of the anatomy and pathologic features for diagnosis and surgical intervention. An advanced haptic force feedback is incorporated. Virtual cystoscopy and resection of bladder tumors were evaluated by 24 medical students and 12 residents at our department. The system was assessed by more than 150 international urologists with varying experience at different conventions and workshops from March 2003 to September 2004. Because of these evaluations and constant evolutions, the final version provides a genuine representation of endourologic procedures. Objective data are generated by a tutoring system that has documented evident teaching benefits for medical students and residents in cystoscopy and treatment of bladder tumors. The URO-Trainer represents the latest generation of endoscopy simulators. Authentic visual and haptic sensations, unlimited virtual cases, and an intelligent tutoring system make this modular system an important improvement in computer-based training and quality control in urology.
An electromechanical based deformable model for soft tissue simulation.
Zhong, Yongmin; Shirinzadeh, Bijan; Smith, Julian; Gu, Chengfan
2009-11-01
Soft tissue deformation is of great importance to surgery simulation. Although a significant amount of research effort has been dedicated to simulating the behaviours of soft tissues, modelling of soft tissue deformation is still a challenging problem. This paper presents a new deformable model for simulation of soft tissue deformation from the electromechanical viewpoint of soft tissues. Soft tissue deformation is formulated as a reaction-diffusion process coupled with a mechanical load. The mechanical load applied to a soft tissue to cause a deformation is incorporated into the reaction-diffusion system, and consequently distributed among the mass points of the soft tissue. Reaction-diffusion of mechanical load and non-rigid mechanics of motion are combined to govern the simulation dynamics of soft tissue deformation. An improved reaction-diffusion model is developed to describe the distribution of the mechanical load in soft tissues. A three-layer artificial cellular neural network is constructed to solve the reaction-diffusion model for real-time simulation of soft tissue deformation. A gradient-based method is established to derive internal forces from the distribution of the mechanical load. Integration with a haptic device has also been achieved to simulate soft tissue deformation with haptic feedback. The proposed methodology not only predicts the typical behaviours of living tissues, but also accepts both local and large-range deformations. It also accommodates isotropic, anisotropic and inhomogeneous deformations by simple modification of diffusion coefficients.
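A much-reduced 1-D sketch of the core idea: diffuse an applied load over mass points with an explicit reaction-diffusion update and derive internal forces from the load gradient. This is not the paper's cellular-neural-network solver, and all coefficients are illustrative.

```python
import numpy as np

def diffuse_load(load, n_steps=50, d=0.2, decay=0.01):
    """Explicit reaction-diffusion update of a mechanical load over a 1-D chain of
    mass points: du/dt = d * laplacian(u) - decay * u (coefficients illustrative)."""
    u = np.asarray(load, dtype=float).copy()
    for _ in range(n_steps):
        lap = np.roll(u, 1) + np.roll(u, -1) - 2.0 * u
        lap[0] = u[1] - u[0]          # simple one-sided handling at the chain ends
        lap[-1] = u[-2] - u[-1]
        u += d * lap - decay * u
    return u

def internal_forces(load_field, k=1.0):
    """Gradient-based internal forces: points are pushed down the load gradient."""
    return -k * np.gradient(load_field)

load = np.zeros(11)
load[5] = 1.0                          # point load applied at the contact node
field = diffuse_load(load)
print(np.round(field, 3))
print(np.round(internal_forces(field), 3))
```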
Advanced patient transfer assist device with intuitive interaction control.
Humphreys, Heather C; Choi, Young Mi; Book, Wayne J
2017-10-24
This research aims to improve patient transfers by developing a new type of advanced robotic assist device. It has multiple actuated degrees of freedom and a powered steerable base to maximize maneuverability around obstacles. An intuitive interface and control strategy allows the caregiver to simply push on the machine in the direction of desired patient motion. The control integrates measurements of both force and proximity to mitigate any potential large collision forces and provides operators information about obstacles with a form of haptic feedback. Electro-hydraulic pump controlled actuation provides high force density for the actuation. Nineteen participants performed tests to compare transfer operations (transferring a 250-lb mannequin between a wheelchair, chair, bed, and floor) and interaction control of a prototype device with a commercially available patient lift. The testing included a time study of the transfer operations and subjective rating of device performance. The results show that operators perform transfer tasks significantly faster and rate performance higher using the prototype patient transfer assist device than with a current market patient lift. With further development, features of the new patient lift can help facilitate patient transfers that are safer, easier, and more efficient for caregivers.
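A minimal sketch of an admittance-style law that turns the caregiver's push force into a velocity command attenuated by obstacle proximity, in the spirit of the control described above; gains, radii, and the 1-D simplification are hypothetical.

```python
def velocity_command(applied_force_n, obstacle_distance_m,
                     admittance=0.05, slow_down_radius=1.0, stop_radius=0.2):
    """Admittance-style mapping from caregiver push force to a base velocity command.

    applied_force_n     : force the caregiver applies to the handle (signed, 1-D here)
    obstacle_distance_m : distance to the nearest obstacle in the motion direction
    The proximity scaling both limits collision force and acts as a form of haptic
    feedback, since the machine 'stiffens' as it approaches an obstacle.
    """
    if obstacle_distance_m <= stop_radius:
        scale = 0.0
    elif obstacle_distance_m >= slow_down_radius:
        scale = 1.0
    else:
        scale = (obstacle_distance_m - stop_radius) / (slow_down_radius - stop_radius)
    return admittance * applied_force_n * scale

print(velocity_command(40.0, 3.0))   # open space: full response
print(velocity_command(40.0, 0.5))   # near an obstacle: attenuated
print(velocity_command(40.0, 0.1))   # inside the stop radius: no motion
```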
Different micromanipulation applications based on common modular control architecture
NASA Astrophysics Data System (ADS)
Sipola, Risto; Vallius, Tero; Pudas, Marko; Röning, Juha
2010-01-01
This paper validates a previously introduced scalable modular control architecture and shows how it can be used to implement research equipment. The validation is conducted by presenting different kinds of micromanipulation applications that use the architecture. Conditions of the micro-world are very different from those of the macro-world. Adhesive forces are significant compared to gravitational forces when micro-scale objects are manipulated. Manipulation is mainly conducted by automatic control relying on haptic feedback provided by force sensors. The validated architecture is a hierarchical layered hybrid architecture, including a reactive layer and a planner layer. The implementation of the architecture is modular, and the architecture has a lot in common with open architectures. Further, the architecture is extensible, scalable, portable and it enables reuse of modules. These are the qualities that we validate in this paper. To demonstrate the claimed features, we present different applications that require special control in micrometer, millimeter and centimeter scales. These applications include a device that measures cell adhesion, a device that examines properties of thin films, a device that measures adhesion of micro fibers and a device that examines properties of submerged gel produced by bacteria. Finally, we analyze how the architecture is used in these applications.
Visual Perception of Force: Comment on White (2012)
ERIC Educational Resources Information Center
Hubbard, Timothy L.
2012-01-01
White (2012) proposed that kinematic features in a visual percept are matched to stored representations containing information regarding forces (based on prior haptic experience) and that information in the matched, stored representations regarding forces is then incorporated into visual perception. Although some elements of White's (2012) account…
NASA Astrophysics Data System (ADS)
Erickson, David; Lacheray, Hervé; Lai, Gilbert; Haddadi, Amir
2014-06-01
This paper presents the latest advancements of the Haptics-based Immersive Tele-robotic System (HITS) project, a next generation Improvised Explosive Device (IED) disposal (IEDD) robotic interface containing an immersive telepresence environment for a remotely-controlled three-articulated-robotic-arm system. While the haptic feedback enhances the operator's perception of the remote environment, a third teleoperated dexterous arm, equipped with multiple vision sensors and cameras, provides stereo vision with proper visual cues, and a 3D photo-realistic model of the potential IED. This decentralized system combines various capabilities including stable and scaled motion, singularity avoidance, cross-coupled hybrid control, active collision detection and avoidance, compliance control and constrained motion to provide a safe and intuitive control environment for the operators. Experimental results and validation of the current system are presented through various essential IEDD tasks. This project demonstrates that a two-armed anthropomorphic Explosive Ordnance Disposal (EOD) robot interface can achieve complex neutralization techniques against realistic IEDs without the operator approaching at any time.
Mastoidectomy simulation with combined visual and haptic feedback.
Agus, Marco; Giachetti, Andrea; Gobbetti, Enrico; Zanetti, Gianluigi; Zorcolo, Antonio; John, Nigel W; Stone, Robert J
2002-01-01
Mastoidectomy is one of the most common surgical procedures relating to the petrous bone. In this paper we describe our preliminary results in the realization of a virtual reality mastoidectomy simulator. Our system is designed to work on patient-specific volumetric object models directly derived from 3D CT and MRI images. The paper summarizes the detailed task analysis performed in order to define the system requirements, introduces the architecture of the prototype simulator, and discusses the initial feedback received from selected end users.
Mathematical model of bone drilling for virtual surgery system
NASA Astrophysics Data System (ADS)
Alaytsev, Innokentiy K.; Danilova, Tatyana V.; Manturov, Alexey O.; Mareev, Gleb O.; Mareev, Oleg V.
2018-04-01
Bone drilling is an essential part of surgery in ENT and dentistry. Proper training of drill handling skills is impossible without proper modelling of the drilling process. The use of high-precision methods such as FEM is limited by the 1000 Hz update rate required for haptic feedback. The study presents a mathematical model of the drilling process that accounts for the material properties, the geometry, and the rotation rate of a burr to compute the removed material volume. The simplicity of the model allows it to be integrated into the high-frequency haptic thread. The precision of the model is sufficient for a virtual surgery system targeted at training basic surgical skills.
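A minimal sketch of a lightweight material-removal step that could run inside a 1000 Hz haptic loop; the rate law and coefficients are illustrative, not the cited model.

```python
def removed_volume_step(rotation_rate_rpm, engagement_depth_mm, burr_radius_mm,
                        material_coeff, dt=0.001):
    """Volume of bone removed during one 1 kHz haptic step (mm^3).

    A simplified rate model: removal is proportional to rotation rate, burr radius,
    and engagement depth, scaled by a material-dependent coefficient. This mirrors
    the kind of lightweight model that fits a 1000 Hz haptic loop; the exact form
    and coefficients are illustrative, not those of the cited model.
    """
    rotation_rate_rps = rotation_rate_rpm / 60.0
    rate_mm3_per_s = material_coeff * rotation_rate_rps * burr_radius_mm * engagement_depth_mm
    return rate_mm3_per_s * dt

# A harder material (e.g. cortical bone) would get a smaller coefficient.
print(removed_volume_step(30000, 0.3, 2.0, material_coeff=0.002))  # hard material
print(removed_volume_step(30000, 0.3, 2.0, material_coeff=0.010))  # softer material
```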
GPU-based real-time soft tissue deformation with cutting and haptic feedback.
Courtecuisse, Hadrien; Jung, Hoeryong; Allard, Jérémie; Duriez, Christian; Lee, Doo Yong; Cotin, Stéphane
2010-12-01
This article describes a series of contributions in the field of real-time simulation of soft tissue biomechanics. These contributions address various requirements for interactive simulation of complex surgical procedures. In particular, this article presents results in the areas of soft tissue deformation, contact modelling, simulation of cutting, and haptic rendering, which are all relevant to a variety of medical interventions. The contributions described in this article share a common underlying model of deformation and rely on GPU implementations to significantly improve computation times. This consistency in the modelling technique and computational approach ensures coherent results as well as efficient, robust and flexible solutions. Copyright © 2010 Elsevier Ltd. All rights reserved.
Klatzky, Roberta L; Giudice, Nicholas A; Bennett, Christopher R; Loomis, Jack M
2014-01-01
Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface.
Culmer, Peter; Barrie, Jenifer; Hewson, Rob; Levesley, Martin; Mon-Williams, Mark; Jayne, David; Neville, Anne
2012-06-01
Minimally invasive surgery (MIS) has heralded a revolution in surgical practice, with numerous advantages over open surgery. Nevertheless, it prevents the surgeon from directly touching and manipulating tissue and therefore severely restricts the use of valuable techniques such as palpation. Accordingly a key challenge in MIS is to restore haptic feedback to the surgeon. This paper reviews the state-of-the-art in laparoscopic palpation devices (LPDs) with particular focus on device mechanisms, sensors and data analysis. It concludes by examining the challenges that must be overcome to create effective LPD systems that measure and display haptic information to the surgeon for improved intraoperative assessment. Copyright © 2012 John Wiley & Sons, Ltd.
Melman, T; de Winter, J C F; Abbink, D A
2017-01-01
An important issue in road traffic safety is that drivers show adverse behavioral adaptation (BA) to driver assistance systems. Haptic steering guidance is an upcoming assistance system which facilitates lane-keeping performance while keeping drivers in the loop, and which may be particularly prone to BA. Thus far, experiments on haptic steering guidance have measured driver performance while the vehicle speed was kept constant. The aim of the present driving simulator study was to examine whether haptic steering guidance causes BA in the form of speeding, and to evaluate two types of haptic steering guidance designed not to suffer from BA. Twenty-four participants drove a 1.8 m wide car for 13.9 km on a curved road, with cones demarcating a single narrow 2.2 m lane. Participants completed four conditions in a counterbalanced design: no guidance (Manual), continuous haptic guidance (Cont), continuous guidance that linearly reduced feedback gains from full guidance at 125 km/h towards manual control at 130 km/h and above (ContRF), and haptic guidance provided only when the predicted lateral position was outside a lateral bandwidth (Band). Participants were familiarized with each condition prior to the experimental runs and were instructed to drive as they normally would while minimizing the number of cone hits. Compared to Manual, the Cont condition yielded a significantly higher driving speed (on average by 7 km/h), whereas ContRF and Band did not. All three guidance conditions yielded better lane-keeping performance than Manual, whereas Cont and ContRF yielded lower self-reported workload than Manual. In conclusion, continuous steering guidance entices drivers to increase their speed, thereby diminishing its potential safety benefits. It is possible to prevent BA while retaining safety benefits by making a design adjustment either in the lateral (Band) or in the longitudinal (ContRF) direction. Copyright © 2016. Published by Elsevier Ltd.
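As a concrete illustration of the ContRF condition described above, here is a minimal sketch of a speed-dependent gain schedule: full guidance up to 125 km/h, fading linearly to zero at 130 km/h and above. The function name, the nominal torque value, and the linear ramp are assumptions for illustration, not the study's controller.

```python
# Minimal sketch (assumption): ContRF-style gain schedule -- full haptic guidance
# up to 125 km/h, linearly fading to zero (manual control) at 130 km/h and above.
def contrf_gain(speed_kmh, full_until=125.0, zero_from=130.0):
    """Fraction of the nominal guidance feedback gain at a given speed."""
    if speed_kmh <= full_until:
        return 1.0
    if speed_kmh >= zero_from:
        return 0.0
    return (zero_from - speed_kmh) / (zero_from - full_until)

# Example: scale a hypothetical 1.2 N*m nominal guidance torque at 127 km/h
guidance_torque = contrf_gain(127.0) * 1.2
print(f"guidance torque: {guidance_torque:.2f} N*m")
```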
A pseudo-haptic knot diagram interface
NASA Astrophysics Data System (ADS)
Zhang, Hui; Weng, Jianguang; Hanson, Andrew J.
2011-01-01
To make progress in understanding knot theory, we need to interact with the projected representations of mathematical knots, which are of course continuous in 3D but significantly interrupted in the projective images. One way to achieve such a goal is to design an interactive system that allows us to sketch 2D knot diagrams by taking advantage of a collision-sensing controller and to explore their underlying smooth structures through continuous motion. Recent advances in interaction techniques allow progress to be made in this direction. Pseudo-haptics, which simulates haptic effects using pure visual feedback, can be used to develop such an interactive system. This paper outlines one such pseudo-haptic knot diagram interface. Our interface derives from the familiar pencil-and-paper process of drawing 2D knot diagrams and provides haptic-like sensations to facilitate the creation and exploration of knot diagrams. A centerpiece of the interaction model simulates a "physically" reactive mouse cursor, which is exploited to resolve the apparent conflict between the continuous structure of the actual smooth knot and the visual discontinuities in the knot diagram representation. Another value in exploiting pseudo-haptics is that an acceleration (or deceleration) of the mouse cursor (or surface locator) can be used to indicate the slope of the curve (or surface) whose projective image is being explored. By exploiting these additional visual cues, we proceed to a full-featured extension to a pseudo-haptic 4D visualization system that simulates continuous navigation on 4D objects and allows us to sense the bumps and holes in the fourth dimension. Preliminary tests of the software show that the main features of the interface overcome some expected perceptual limitations in our interaction with 2D knot diagrams of 3D knots and 3D projective images of 4D mathematical objects.
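The core pseudo-haptic trick, a displayed cursor that lags behind the physical mouse motion in "stiff" regions so that the user perceives resistance from visual feedback alone, can be sketched in a few lines. The function below is an assumption for illustration (names and the simple gain law are hypothetical), not the authors' interaction model.

```python
# Minimal sketch of the pseudo-haptic idea: scale the displayed cursor motion down
# where the diagram should feel "resistive", producing apparent drag without force.
def pseudo_haptic_cursor(display_pos, mouse_delta, resistance):
    """resistance in [0, 1): 0 = free motion, values near 1 = strong apparent drag."""
    gain = 1.0 - resistance
    return (display_pos[0] + gain * mouse_delta[0],
            display_pos[1] + gain * mouse_delta[1])

pos = (100.0, 100.0)
pos = pseudo_haptic_cursor(pos, mouse_delta=(5.0, 0.0), resistance=0.7)  # feels "heavy"
print(pos)
```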
Corticospinal signals recorded with MEAs can predict the volitional forearm forces in rats.
Guo, Yi; Mesut, Sahin; Foulds, Richard A; Adamovich, Sergei V
2013-01-01
We set out to investigate whether volitional components in the descending tracts of the spinal cord white matter can be accessed with the multi-electrode array (MEA) recording technique. Rats were trained to press a lever connected to a haptic device with force feedback to receive sugar pellets. A flexible-substrate multi-electrode array was chronically implanted into the dorsal column of the cervical spinal cord. Field potentials and multi-unit activities were recorded from the descending axons of the corticospinal tract while the rat performed a lever-pressing task. Forelimb forces, recorded with the sensor attached to the lever, were reconstructed using the hand position data and the neural signals through multiple trials over three weeks. The regression coefficients found from the trial set were cross-validated on the other trials recorded on the same day. Approximately 30 trials of at least 2 seconds were required for accurate model estimation. The maximum correlation coefficient between the actual and predicted force was 0.7 in the test set. Positional information and its interaction with neural signals improved the correlation coefficient by 0.1 to 0.15. These results suggest that the volitional information contained in the corticospinal tract can be extracted with multi-channel neural recordings made with parenchymal electrodes.
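The reconstruction described above (force regressed from neural signals plus hand position and their interaction, cross-validated across trials) can be sketched as follows. This is an assumption of the general approach, not the authors' pipeline; the data here are random placeholders and the ridge regularization is a choice made for the sketch.

```python
# Minimal sketch: reconstruct lever force from binned neural features plus hand
# position (and their interaction) with ridge regression, cross-validating on
# held-out trials so that whole trials stay together in each fold.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_bins, n_channels = 30, 200, 16            # ~30 trials of >= 2 s each
neural = rng.standard_normal((n_trials * n_bins, n_channels))
hand_pos = rng.standard_normal((n_trials * n_bins, 1))
X = np.hstack([neural, hand_pos, neural * hand_pos])   # position-neural interaction terms
y = rng.standard_normal(n_trials * n_bins)             # placeholder for measured force

groups = np.repeat(np.arange(n_trials), n_bins)        # one group per trial
scores = cross_val_score(Ridge(alpha=1.0), X, y, groups=groups,
                         cv=GroupKFold(n_splits=5), scoring="r2")
print("cross-validated R^2 per fold:", scores)
```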
Multi-source micro-friction identification for a class of cable-driven robots with passive backbone
NASA Astrophysics Data System (ADS)
Tjahjowidodo, Tegoeh; Zhu, Ke; Dailey, Wayne; Burdet, Etienne; Campolo, Domenico
2016-12-01
This paper analyses the dynamics of cable-driven robots with a passive backbone and develops techniques for their dynamic identification, which are tested on the H-Man, a planar cabled differential transmission robot for haptic interaction. The mechanism is optimized for human-robot interaction by accounting for the cost-benefit ratio of the system, specifically by eliminating the need for an external force sensor to reduce the overall cost. As a consequence, this requires an effective dynamic model, including the friction behavior of the system, for accurate force feedback applications. We first consider the significance of friction in both the actuator and backbone spaces. Subsequently, we study the required complexity of the stiction model for the application. Different models representing different levels of complexity are investigated, ranging from the conventional Coulomb approach to an advanced model that includes hysteresis. The results demonstrate each model's ability to capture the dynamic behavior of the system. In general, it is concluded that there is a trade-off between model accuracy and model cost.
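For reference, the simplest end of the model spectrum mentioned above (a Coulomb term plus viscous damping) looks like the sketch below. The parameter values and the tanh smoothing are assumptions for illustration, not the coefficients identified for the H-Man.

```python
# Minimal sketch (assumption): Coulomb + viscous friction, the simplest candidate
# model in the family discussed above. tanh() smooths the sign() discontinuity so
# the model can run inside a fixed-step control loop without chattering.
import math

def coulomb_viscous_friction(velocity, f_coulomb=0.15, b_viscous=0.02, v_eps=1e-3):
    """Friction force (N) opposing motion at the given velocity (m/s)."""
    return f_coulomb * math.tanh(velocity / v_eps) + b_viscous * velocity

print(coulomb_viscous_friction(0.05))   # sliding regime
print(coulomb_viscous_friction(0.0))    # at rest -> no modelled friction
```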
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
Poor shape perception is the reason reaches-to-grasp are visually guided online.
Lee, Young-Lim; Crabtree, Charles E; Norman, J Farley; Bingham, Geoffrey P
2008-08-01
Both judgment studies and studies of feedforward reaching have shown that the visual perception of object distance, size, and shape is inaccurate. However, feedback has been shown to calibrate feedforward reaches-to-grasp to make them accurate with respect to object distance and size. We now investigate whether shape perception (in particular, the aspect ratio of object depth to width) can be calibrated in the context of reaches-to-grasp. We used cylindrical objects with elliptical cross-sections of varying eccentricity. Our participants reached to grasp the width or the depth of these objects with the index finger and thumb. The maximum grasp aperture and the terminal grasp aperture were used to evaluate perception. Both occur before the hand has contacted an object. In Experiments 1 and 2, we investigated whether perceived shape is recalibrated by distorted haptic feedback. Although somewhat equivocal, the results suggest that it is not. In Experiment 3, we tested the accuracy of feedforward grasping with respect to shape with haptic feedback to allow calibration. Grasping was inaccurate in ways comparable to findings in shape perception judgment studies. In Experiment 4, we hypothesized that online guidance is needed for accurate grasping. Participants reached to grasp either with or without vision of the hand. The result was that the former was accurate, whereas the latter was not. We conclude that shape perception is not calibrated by feedback from reaches-to-grasp and that online visual guidance is required for accurate grasping because shape perception is poor.
Friction characteristics of trocars in laparoscopic surgery.
Alazmani, Ali; Roshan, Rupesh; Jayne, David G; Neville, Anne; Culmer, Peter
2015-04-01
This article investigates the friction characteristics of the instrument-trocar interface in laparoscopic surgery for varying linear instrument velocities, trocar seal design and material, and trocar tilt. Furthermore, the effect of applying lubrication at the instrument-trocar seal interface on friction was studied. A friction testing apparatus was designed and built to characterise the resistance force at the instrument-trocar interface as a function of the instrument's linear movement in the 12-mm trocar (at constant velocity) for different designs, seal materials, and angles of tilt. The resistance force depended on the trocar seal design and material properties, specifically surface roughness, elasticity, hardness, the direction of movement, and the instrument linear velocity, and varied between 0.25 and 8 N. Lubricating the shaft with silicone oil reduced the peak resistance force by 75% for all trocars and eliminated the stick-slip phenomenon evident in non-lubricated cases. The magnitude of fluctuation in resistance force depends on the trocar design, is attributed to stick-slip of the sealing mechanism, and is generally higher during retraction than during insertion. Trocars that have an inlet seal made of rubber/polyurethane showed higher resistance forces during retraction. Use of a lubricant significantly reduced frictional effects. Comparisons of the investigated trocars indicate that a low-friction port, providing the surgeon with improved haptic feedback, can be designed by improving the tribological properties of the trocar seal interface. © IMechE 2015.
Output control of da Vinci surgical system's surgical graspers.
Johnson, Paul J; Schmidt, David E; Duvvuri, Umamaheswar
2014-01-01
The number of robot-assisted surgeries performed with the da Vinci surgical system has increased significantly over the past decade. The articulating movements of the robotic surgical grasper are controlled by grip controls at the master console. The user interface has been implicated as one contributing factor in surgical grasping errors. The goal of our study was to characterize and evaluate the user interface of the da Vinci surgical system in controlling surgical graspers. An angular manipulator with force sensors was used to increment the grip control angle as grasper output angles were measured. Input force at the grip control was simultaneously measured throughout the range of motion. Pressure film was used to assess the maximum grasping force achievable with the endoscopic grasping tool. The da Vinci robot's grip control angular input has a nonproportional relationship with the grasper instrument output. The grip control mechanism presents an intrinsic resistant force to the surgeon's fingertips and provides no haptic feedback. The da Vinci Maryland graspers are capable of applying up to 5.1 MPa of local pressure. The angular and force input at the grip control of the da Vinci robot's surgical graspers is nonproportional to the grasper instrument's output. Understanding the true relationship of the grip control input to grasper instrument output may help surgeons understand how to better control the surgical graspers and promote fewer grasping errors. Copyright © 2014 Elsevier Inc. All rights reserved.
Virtual reality robotic telesurgery simulations using MEMICA haptic system
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph; Mavroidis, Constantinos; Bouzit, Mourad; Dolgin, Benjamin; Harm, Deborah L.; Kopchok, George E.; White, Rodney
2001-01-01
The authors conceived a haptic mechanism called MEMICA (Remote Mechanical Mirroring using Controlled stiffness and Actuators) that can enable the design of high-dexterity, rapid-response, large-workspace haptic systems. The development of novel MEMICA gloves and virtual reality models is being explored to allow simulation of telesurgery and other applications. The MEMICA gloves are being designed to provide intuitive mirroring of the conditions at a virtual site where a robot simulates the presence of a human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERFs). In this paper the design of the MEMICA system and initial experimental results are presented.
Development of Quasi-3DOF upper limb rehabilitation system using ER brake: PLEMO-P1
NASA Astrophysics Data System (ADS)
Kikuchi, T.; Fukushima, K.; Furusho, J.; Ozawa, T.
2009-02-01
In recent years, many researchers have studied the potential of using robotics technology to assist and quantify motor function in neuro-rehabilitation. Several kinds of haptic devices have been developed and their efficiency evaluated in clinical tests, for example, upper limb training for patients with spasticity after stroke. However, almost all of these devices are active-type (motor-driven) haptic devices, which basically require a high-cost safety system compared to passive-type (brake-based) devices. In this study, we developed a new practical haptic device, 'PLEMO-P1', which adopts ER brakes as its force generators. In this paper, the mechanism of PLEMO-P1 and its software for reaching rehabilitation are described.
Pseudo-Haptic Feedback for Promoting Narrative Comprehension
ERIC Educational Resources Information Center
Umetsu, Kazuaki; Kashihara, Akihiro
2017-01-01
Skill in reading comprehension requires reading sentences to understand an intention embedded between the lines. In the case of narrative, it is particularly necessary to read a narrative and find essential concepts such as emotions of the characters embedded between the lines for comprehending an intention of the narrative. In this work, we focus…
ERIC Educational Resources Information Center
Duncan, Mike R.; Birrell, Bob; Williams, Toni
2005-01-01
Virtual Reality (VR) is primarily a visual technology. Elements such as haptics (touch feedback) and sound can augment an experience, but the visual cues are the prime driver of what an audience will experience from a VR presentation. At its inception in 2001 the Centre for Advanced Visualization (CFAV) at Niagara College of Arts and Technology…
An REU Experience with Micro Assembly Workcell Research
ERIC Educational Resources Information Center
Stapleton, William; Asiabanpour, Bahram; Jimenez, Jesus; Um, Dugan
2010-01-01
Under an NSF REU center grant REU-0755355 entitled "Micro/Nano Assembly Workcell Via Micro Visual Sensing and Haptic Feedback", Texas A&M University-Corpus Christi and Texas State University-San Marcos collaboratively hosted two groups of 10 students from different backgrounds for 10 weeks each in Summer 2008 and 2009 respectively.…
PROJECT HEAVEN: Preoperative Training in Virtual Reality
Iamsakul, Kiratipath; Pavlovcik, Alexander V.; Calderon, Jesus I.; Sanderson, Lance M.
2017-01-01
A cephalosomatic anastomosis (CSA; also called HEAVEN: head anastomosis venture) has been proposed as an option for patients with neurological impairments, such as spinal cord injury (SCI), and terminal medical illnesses, for which medicine is currently powerless. Protocols to prepare a patient for life after CSA do not currently exist. However, methods used in conventional neurorehabilitation can be used as a reference for developing preparatory training. Studies on virtual reality (VR) technologies have documented VR's ability to enhance rehabilitation and improve the quality of recovery in patients with neurological disabilities. VR-augmented rehabilitation resulted in increased motivation towards performing functional training and improved the biopsychosocial state of patients. In addition, VR experiences coupled with haptic feedback promote neuroplasticity, resulting in the recovery of motor functions in neurologically-impaired individuals. To prepare the recipient psychologically for life after CSA, the development of VR experiences paired with haptic feedback is proposed. This proposal aims to innovate techniques in conventional neurorehabilitation to implement preoperative psychological training for the recipient of HEAVEN. Recipient's familiarity to body movements will prevent unexpected psychological reactions from occurring after the HEAVEN procedure. PMID:28540125
Force estimation from OCT volumes using 3D CNNs.
Gessert, Nils; Beringhoff, Jens; Otte, Christoph; Schlaefer, Alexander
2018-07-01
Estimating the interaction forces of instruments and tissue is of interest, particularly to provide haptic feedback during robot-assisted minimally invasive interventions. Different approaches based on external and integrated force sensors have been proposed. These are hampered by friction, sensor size, and sterilizability. We investigate a novel approach to estimate the force vector directly from optical coherence tomography image volumes. We introduce a novel Siamese 3D CNN architecture. The network takes an undeformed reference volume and a deformed sample volume as an input and outputs the three components of the force vector. We employ a deep residual architecture with bottlenecks for increased efficiency. We compare the Siamese approach to methods using difference volumes and two-dimensional projections. Data were generated using a robotic setup to obtain ground-truth force vectors for silicone tissue phantoms as well as porcine tissue. Our method achieves a mean average error of [Formula: see text] when estimating the force vector. Our novel Siamese 3D CNN architecture outperforms single-path methods that achieve a mean average error of [Formula: see text]. Moreover, the use of volume data leads to significantly higher performance compared to processing only surface information, which achieves a mean average error of [Formula: see text]. Based on the tissue dataset, our method shows good generalization between different subjects. We propose a novel image-based force estimation method using optical coherence tomography. We illustrate that capturing the deformation of subsurface structures substantially improves force estimation. Our approach can provide accurate force estimates in surgical setups when using intraoperative optical coherence tomography.
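The Siamese structure described above (one shared 3D encoder applied to the reference and deformed volumes, followed by a regression head that outputs the three force components) can be sketched as below. The layer counts, channel widths, and volume size are assumptions for illustration; the published network uses a deeper residual-bottleneck encoder.

```python
# Minimal sketch (assumption, not the published architecture): a Siamese 3D CNN
# that encodes an undeformed reference OCT volume and a deformed sample volume
# with shared weights, then regresses (fx, fy, fz) from the concatenated codes.
import torch
import torch.nn as nn

class SiameseForceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(          # shared 3D convolutional branch
            nn.Conv3d(1, 8, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(             # regression head -> 3 force components
            nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 3),
        )

    def forward(self, reference, deformed):
        z = torch.cat([self.encoder(reference), self.encoder(deformed)], dim=1)
        return self.head(z)

net = SiameseForceNet()
ref = torch.randn(2, 1, 32, 32, 32)            # batch of reference volumes
cur = torch.randn(2, 1, 32, 32, 32)            # batch of deformed volumes
print(net(ref, cur).shape)                     # torch.Size([2, 3])
```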
Simulation System for Training in Laparoscopic Surgery
NASA Technical Reports Server (NTRS)
Basdogan, Cagatay; Ho, Chih-Hao
2003-01-01
A computer-based simulation system creates a visual and haptic virtual environment for training a medical practitioner in laparoscopic surgery. Heretofore, it has been common practice to perform training in partial laparoscopic surgical procedures by use of a laparoscopic training box that encloses a pair of laparoscopic tools, objects to be manipulated by the tools, and an endoscopic video camera. However, the surgical procedures simulated by use of a training box are usually poor imitations of the actual ones. The present computer-based system improves training by presenting a more realistic simulated environment to the trainee. The system includes a computer monitor that displays a real-time image of the affected interior region of the patient, showing laparoscopic instruments interacting with organs and tissues, as would be viewed by use of an endoscopic video camera and displayed to a surgeon during a laparoscopic operation. The system also includes laparoscopic tools that the trainee manipulates while observing the image on the computer monitor (see figure). The instrumentation on the tools consists of (1) position and orientation sensors that provide input data for the simulation and (2) actuators that provide force feedback to simulate the contact forces between the tools and tissues. The simulation software includes components that model the geometries of surgical tools, components that model the geometries and physical behaviors of soft tissues, and components that detect collisions between them. Using the measured positions and orientations of the tools, the software detects whether they are in contact with tissues. In the event of contact, the deformations of the tissues and contact forces are computed by use of the geometric and physical models. The image on the computer screen shows tissues deformed accordingly, while the actuators apply the corresponding forces to the distal ends of the tools. For the purpose of demonstration, the system has been set up to simulate the insertion of a flexible catheter in a bile duct. [As thus configured, the system can also be used to simulate other endoscopic procedures (e.g., bronchoscopy and colonoscopy) that include the insertion of flexible tubes into flexible ducts.] A hybrid approach has been followed in developing the software for real-time simulation of the visual and haptic interactions (1) between forceps and the catheter, (2) between the forceps and the duct, and (3) between the catheter and the duct. The deformations of the duct are simulated by finite-element and modal-analysis procedures, using only the most significant vibration modes of the duct for computing deformations and interaction forces. The catheter is modeled as a set of virtual particles uniformly distributed along the center line of the catheter and connected to each other via linear and torsional springs and damping elements. The interactions between the forceps and the duct as well as the catheter are simulated by use of a ray-based haptic-interaction-simulating technique in which the forceps are modeled as connected line segments.
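The particle-and-spring catheter model described above can be sketched as follows: particles along the centreline connected by linear spring-damper elements, advanced with one explicit integration step. The stiffness, damping, mass, and time-step values are assumptions for illustration (torsional springs are omitted for brevity), not the NASA system's implementation.

```python
# Minimal sketch (assumption): a chain of virtual particles along the catheter
# centreline coupled by linear springs and dampers; one explicit-Euler time step.
import numpy as np

n, rest_len = 20, 0.005                      # 20 particles, 5 mm spacing
k_lin, c_damp, dt, mass = 200.0, 0.5, 1e-3, 1e-3

pos = np.column_stack([np.arange(n) * rest_len, np.zeros(n), np.zeros(n)])
vel = np.zeros_like(pos)

def spring_forces(pos, vel):
    """Internal spring-damper forces acting on each particle."""
    f = np.zeros_like(pos)
    for i in range(n - 1):
        d = pos[i + 1] - pos[i]
        length = np.linalg.norm(d)
        direction = d / length
        stretch = length - rest_len
        rel_vel = np.dot(vel[i + 1] - vel[i], direction)
        fij = (k_lin * stretch + c_damp * rel_vel) * direction
        f[i] += fij
        f[i + 1] -= fij
    return f

vel += dt * spring_forces(pos, vel) / mass   # explicit Euler integration step
pos += dt * vel
```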
ERIC Educational Resources Information Center
White, Peter A.
2012-01-01
Forces are experienced in actions on objects. The mechanoreceptor system is stimulated by proximal forces in interactions with objects, and experiences of force occur in a context of information yielded by other sensory modalities, principally vision. These experiences are registered and stored as episodic traces in the brain. These stored…
Influence of surgical gloves on haptic perception thresholds.
Hatzfeld, Christian; Dorsch, Sarah; Neupert, Carsten; Kupnik, Mario
2018-02-01
Impairment of haptic perception by surgical gloves could reduce requirements on haptic systems for surgery. While grip forces and manipulation capabilities were not impaired in previous studies, no data is available for perception thresholds. Absolute and differential thresholds (20 dB above threshold) of 24 subjects were measured for frequencies of 25 and 250 Hz with a Ψ-method. Effects of wearing a surgical glove, moisture on the contact surface and subject's experience with gloves were incorporated in a full-factorial experimental design. Absolute thresholds of 12.8 dB and -29.6 dB (means for 25 and 250 Hz, respectively) and differential thresholds of -12.6 dB and -9.5 dB agree with previous studies. A relevant effect of the frequency on absolute thresholds was found. Comparisons of glove- and no-glove-conditions did not reveal a significant mean difference. Wearing a single surgical glove does not affect absolute and differential haptic perception thresholds. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Hassanzadeh, Iraj; Janabi-Sharifi, Farrokh
2005-12-01
In this paper, a new open architecture for visual servo control tasks is illustrated. A Puma 560 robotic manipulator is used to prove the concept. This design enables hybrid force/visual servo control in an unstructured environment in different modes. It can also be controlled over the Internet in teleoperation mode using a haptic device. Our proposed structure includes two major parts, hardware and software. In terms of hardware, it consists of a master (host) computer, a slave (target) computer, a Puma 560 manipulator, a CCD camera, a force sensor and a haptic device. There are five DAQ cards interfacing the Puma 560 and the slave computer. An open architecture package is developed using MATLAB, Simulink and the xPC Target toolbox. This package has the hardware-in-the-loop (HIL) property, i.e., it enables one to readily implement different configurations of force, visual or hybrid control in real time. The implementation includes the following stages. First of all, retrofitting of the Puma was carried out. Then a modular joint controller for the Puma 560 was realized using Simulink. A force sensor driver and force control implementation were written using S-function blocks of Simulink. Visual images were captured through the Image Acquisition Toolbox of MATLAB and processed using the Image Processing Toolbox. A haptic device interface was also written in Simulink. Thus, this setup can be readily reconfigured to accommodate any other robotic manipulator and/or other sensors without the trouble of external issues relevant to the control, interface and software, while providing flexibility in component modification.
Yasuda, Kazuhiro; Saichi, Kenta; Kaibuki, Naomi; Harashima, Hiroaki; Iwata, Hiroyasu
2018-05-01
Most individuals have sensory disturbances post stroke, and these deficits contribute to post-stroke balance impairment. The haptic-based biofeedback (BF) system appears to be one of the promising tools for balance rehabilitation in patients with stroke, and a BF system can increase the objectivity of the feedback and encouragement beyond that provided by a therapist alone. Studies in skill science indicate that feedback or encouragement from a coach or trainer enhances the motor learning effect. Nevertheless, an optimal BF system (or its concept) that would refine the interpersonal feedback between patient and therapist has not been proposed. Thus, the purpose of this study was to propose a haptic-based perception-empathy BF system which provides information regarding the patient's center-of-foot pressure (CoP) pattern to the patient and the physical therapist to enhance the motor learning effect, and to validate the feasibility of this balance-training regimen in patients with chronic stroke. This study used a pre-post design without a control group. Nine chronic stroke patients (mean age: 64.4 ± 9.2 years) received a balance-training regimen using this BF system twice a week for 4 weeks. Testing comprised quantitative measures (i.e., CoP) and clinical balance scales (Berg Balance Scale, BBS; Functional Reach Test, FRT; and Timed Up and Go test, TUG). Post training, patients demonstrated marginally reduced postural spatial variability (i.e., 95% confidence elliptical area), and clinical balance performance improved significantly. Although the changes in FRT and TUG exceeded the minimal detectable change (MDC), the changes in BBS did not reach clinical significance (i.e., they were smaller than the MDC). These results may provide initial knowledge (i.e., beneficial effects, utility and limitations) of the proposed BF system for designing effective motor learning strategies for stroke rehabilitation. More studies are required to address limitations due to the research design and training method before future clinical use. Copyright © 2018 Elsevier B.V. All rights reserved.
Petermeijer, Sebastiaan M; Abbink, David A; de Winter, Joost C F
2015-02-01
The aim of this study was to compare continuous versus bandwidth haptic steering guidance in terms of lane-keeping behavior, aftereffects, and satisfaction. An important human factors question is whether operators should be supported continuously or only when tolerance limits are exceeded. We aimed to clarify this issue for haptic steering guidance by investigating costs and benefits of both approaches in a driving simulator. Thirty-two participants drove five trials, each with a different level of haptic support: no guidance (Manual); guidance outside a 0.5-m bandwidth (Band1); a hysteresis version of Band1, which guided back to the lane center once triggered (Band2); continuous guidance (Cont); and Cont with double feedback gain (ContS). Participants performed a reaction time task while driving. Toward the end of each trial, the guidance was unexpectedly disabled to investigate aftereffects. All four guidance systems prevented large lateral errors (>0.7 m). Cont and especially ContS yielded smaller lateral errors and higher time to line crossing than Manual, Band1, and Band2. Cont and ContS yielded short-lasting aftereffects, whereas Band1 and Band2 did not. Cont yielded higher self-reported satisfaction and faster reaction times than Band1. Continuous and bandwidth guidance both prevent large driver errors. Continuous guidance yields improved performance and satisfaction over bandwidth guidance at the cost of aftereffects and variability in driver torque (indicating human-automation conflicts). The presented results are useful for designers of haptic guidance systems and support critical thinking about the costs and benefits of automation support systems.
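The Band2 condition described above (assist only after the predicted lateral error leaves a 0.5 m band, then keep assisting until the car is guided back toward the lane centre) can be sketched as a small hysteresis switch. The gain, release threshold, and torque law are assumptions for illustration, not the study's controller.

```python
# Minimal sketch (assumption): bandwidth haptic steering guidance with hysteresis.
def band2_torque(lateral_error_m, assisting, band=0.5, release=0.05, gain=3.0):
    """Returns (torque_Nm, assisting_flag). Positive error -> steer-back torque."""
    if not assisting and abs(lateral_error_m) > band:
        assisting = True                       # trigger: bandwidth exceeded
    elif assisting and abs(lateral_error_m) < release:
        assisting = False                      # release: back near the lane centre
    torque = -gain * lateral_error_m if assisting else 0.0
    return torque, assisting

state = False
for e in (0.2, 0.6, 0.4, 0.04):
    t, state = band2_torque(e, state)
    print(f"error={e:+.2f} m -> torque={t:+.2f} N*m, assisting={state}")
```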
Finger-Shaped GelForce: Sensor for Measuring Surface Traction Fields for Robotic Hand.
Sato, K; Kamiyama, K; Kawakami, N; Tachi, S
2010-01-01
It is believed that the use of haptic sensors to measure the magnitude, direction, and distribution of a force will enable a robotic hand to perform dexterous operations. Therefore, we develop a new type of finger-shaped haptic sensor using GelForce technology. GelForce is a vision-based sensor that can be used to measure the distribution of force vectors, or surface traction fields. The simple structure of the GelForce enables us to develop a compact finger-shaped GelForce for the robotic hand. A GelForce developed on the basis of elastic theory can be used to calculate surface traction fields using a conversion equation. However, this conversion equation cannot be solved analytically when the elastic body of the sensor has a complicated shape, such as the shape of a finger. Therefore, we propose an observational method and construct a prototype of the finger-shaped GelForce. Using this prototype, we evaluate the basic performance of the finger-shaped GelForce. We then conduct a field test by performing grasping operations using a robotic hand. The results of this test show that, using the observational method, the finger-shaped GelForce can be successfully used in a robotic hand.
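An "observational" calibration in the spirit described above can be sketched as learning a linear map from observed marker displacements to applied traction vectors using recorded training pairs, rather than solving the elastic conversion equation analytically. The sizes, the purely linear form, and the synthetic data below are assumptions for illustration, not the GelForce algorithm.

```python
# Minimal sketch (assumption): fit a conversion matrix from marker displacements
# to force vectors from recorded (displacement, force) pairs, then apply it online.
import numpy as np

n_markers, n_samples = 50, 400
rng = np.random.default_rng(1)
true_map = rng.standard_normal((3, 2 * n_markers)) * 0.1       # unknown "ground truth"
U = rng.standard_normal((n_samples, 2 * n_markers))            # marker (u, v) displacements
F = U @ true_map.T + 0.01 * rng.standard_normal((n_samples, 3))  # measured force vectors

A, *_ = np.linalg.lstsq(U, F, rcond=None)   # calibrated conversion matrix (least squares)
force_estimate = U[0] @ A                   # traction estimate for one camera frame
print(force_estimate)
```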
Parametric model of the scala tympani for haptic-rendered cochlear implantation.
Todd, Catherine; Naghdy, Fazel
2005-01-01
A parametric model of the human scala tympani has been designed for use in a haptic-rendered computer simulation of cochlear implant surgery. It will be the first surgical simulator of this kind. A geometric model of the scala tympani has been derived from measured data for this purpose. The model is compared with two existing descriptions of the cochlear spiral. A first approximation of the basilar membrane is also produced. The structures are imported into a force-rendering software application for system development.
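For orientation, a generic parametric helico-spiral of the kind often used to approximate the cochlear canal centreline is sketched below: radius decaying with angle while height rises along the modiolar axis. The functional form and all constants are assumptions for illustration and are not necessarily the model fitted in this paper.

```python
# Minimal sketch (assumption): a generic helico-spiral centreline approximation
# for the cochlear canal, sampled into a polyline for a haptic/graphic scene.
import numpy as np

def cochlear_centreline(theta, r0=5.0e-3, decay=0.12, pitch=0.9e-3):
    """Centreline point (x, y, z) in metres at unwrapped angle theta (radians)."""
    r = r0 * np.exp(-decay * theta)
    return np.array([r * np.cos(theta), r * np.sin(theta), pitch * theta / (2 * np.pi)])

thetas = np.linspace(0.0, 2.5 * 2 * np.pi, 200)   # roughly 2.5 cochlear turns
points = np.array([cochlear_centreline(t) for t in thetas])
print(points.shape)                               # (200, 3)
```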
Haptic device for telerobotic surgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salisbury, Curt; Salisbury, Jr., J. Kenneth
A haptic device for telerobotic surgery, including a base; a linkage system having first and second linkage members coupled to the base; a motor that provides a motor force; a transmission including first and second driving pulleys arranged such that their faces form an angle and their axes form a plane, first and second idler pulleys offset from the plane and arranged between the first and second driving pulleys such that their axes divide the angle between the first and second driving pulleys, and a cable that traverses the first and second driving pulleys and the set of idler pulleys and transfers the motor force to the linkage system; an end effector coupled to distal ends of the first and second linkage members and maneuverable relative to the base; and a controller that modulates the motor force to simulate a body part at a point portion of the end effector.
[Methods of resolution for haptic assistance during catheterization].
Kern, T A; Herrmann, J; Klages, S; Meiss, T; Werthschützky, R
2005-01-01
During catheterization, navigation within the patient depends mainly on a live X-ray image on the screen. Although methods for 3D visualisation and remote navigation of the catheter are being discussed and tested, precise positioning is still largely the result of intense training and the high skill level of the performing surgeon. This article describes a system that can be considered an add-on to existing catheterization procedures. It comprises a miniaturised force sensor located at the tip of the guide-wire, a prototype of which is shown here. The measured forces are presented to the surgeon, amplified by an external actuator described in this article. As a result, a haptic perception of the forces between the tip of the guide-wire and the vessel walls becomes available, enabling the surgeon to gain an impression comparable to palpating living vessels from the inside.
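The display side of such a system amounts to amplifying the small tip forces to a level the finger can feel, within a safe limit. The gain and saturation values below are hypothetical, chosen only to illustrate the idea of the sensor-to-actuator mapping.

```python
# Minimal sketch (assumption): scale the guide-wire tip force for the external
# haptic actuator, with a gain and a saturation limit to keep the display safe.
def display_force(tip_force_n, gain=20.0, f_max_n=3.0):
    """Force commanded to the haptic actuator (N) for a sensed tip force (N)."""
    amplified = gain * tip_force_n
    return max(-f_max_n, min(f_max_n, amplified))

print(display_force(0.02))   # 20 mN contact at the tip -> 0.4 N at the finger
```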
Fang, Te-Yung; Wang, Pa-Chun; Liu, Chih-Hsien; Su, Mu-Chun; Yeh, Shih-Ching
2014-02-01
Virtual reality simulation training may improve knowledge of anatomy and surgical skills. We evaluated a 3-dimensional, haptic, virtual reality temporal bone simulator for dissection training. The subjects were 7 otolaryngology residents (3 training sessions each) and 7 medical students (1 training session each). The virtual reality temporal bone simulation station included a computer with software that was linked to a force-feedback hand stylus, and the system recorded performance and collisions with vital anatomic structures. Subjects performed virtual reality dissections and completed questionnaires after the training sessions. Residents and students had favorable responses to most questions of the technology acceptance model (TAM) questionnaire. The average TAM scores were above neutral for residents and medical students in all domains, and the average TAM score for residents was significantly higher for the usefulness domain and lower for the playful domain than that of students. The satisfaction questionnaire showed that residents had greater overall satisfaction with cadaveric temporal bone dissection training than with training on the virtual reality simulator or a plastic temporal bone. For medical students, the average comprehension score increased significantly from before to after training for all anatomic structures. Medical students had significantly more collisions with the dura than residents. The residents had similar mean performance scores after the first and third training sessions for all dissection procedures. The virtual reality temporal bone simulator provided satisfactory training for otolaryngology residents and medical students. Copyright © 2013. Published by Elsevier Ireland Ltd.
Displaying Sensed Tactile Cues with a Fingertip Haptic Device.
Pacchierotti, Claudio; Prattichizzo, Domenico; Kuchenbecker, Katherine J
2015-01-01
Telerobotic systems enable humans to explore and manipulate remote environments for applications such as surgery and disaster response, but few such systems provide the operator with cutaneous feedback. This article presents a novel approach to remote cutaneous interaction; our method is compatible with any fingertip tactile sensor and any mechanical tactile display device, and it does not require a position/force or skin deformation model. Instead, it directly maps the sensed stimuli to the best possible input commands for the device's motors using a data set recorded with the tactile sensor inside the device. As a proof of concept, we considered a haptic system composed of a BioTac tactile sensor, in charge of measuring contact deformations, and a custom 3-DoF cutaneous device with a flat contact platform, in charge of applying deformations to the user's fingertip. To validate the proposed approach and discover its inherent tradeoffs, we carried out two remote tactile interaction experiments. The first one evaluated the error between the tactile sensations registered by the BioTac in a remote environment and the sensations created by the cutaneous device for six representative tactile interactions and 27 variations of the display algorithm. The normalized average errors in the best condition were 3.0 percent of the BioTac's full 12-bit scale. The second experiment evaluated human subjects' experiences for the same six remote interactions and eight algorithm variations. The average subjective rating for the best algorithm variation was 8.2 out of 10, where 10 is best.
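The model-free mapping described above (record what the tactile sensor feels for many motor commands of the display, then at run time issue the command whose recorded sensation best matches the remotely sensed one) can be sketched as a nearest-neighbour lookup. The channel counts, command dimensions, and random placeholder data are assumptions for illustration, not the published algorithm.

```python
# Minimal sketch (assumption): data-driven sensed-stimulus-to-motor-command mapping
# built from a recorded dataset, queried by nearest-neighbour search at run time.
import numpy as np

rng = np.random.default_rng(2)
recorded_sensations = rng.standard_normal((500, 19))   # e.g. 19 tactile channels
recorded_commands = rng.uniform(0, 1, (500, 3))        # matching 3-DoF motor inputs

def command_for(sensed):
    """Pick the motor command whose recorded sensation best matches `sensed`."""
    idx = np.argmin(np.linalg.norm(recorded_sensations - sensed, axis=1))
    return recorded_commands[idx]

print(command_for(rng.standard_normal(19)))
```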
Sharp, Ian; Patton, James; Listenberger, Molly; Case, Emily
2011-08-08
Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. The result can lead researchers across the globe to perform similar experiments using shared code. Therefore, modular "switching out" of one robot for another would not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot in the open-source H3DAPI, which integrates the software commands most commonly used by all robots.
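The wrapper-class idea can be sketched as a common interface that the haptic/graphic application codes against, with one thin subclass per robot. The class and method names below are hypothetical and this is not the H3DAPI wrapper itself; a real wrapper would call the vendor's device API inside the subclass.

```python
# Minimal sketch (assumption): a common robot interface so the rendering code can
# swap one haptic device for another without being rewritten.
from abc import ABC, abstractmethod

class HapticRobot(ABC):
    @abstractmethod
    def read_end_effector_position(self) -> tuple: ...
    @abstractmethod
    def command_force(self, fx: float, fy: float, fz: float) -> None: ...

class SimulatedRobot(HapticRobot):
    """Stand-in device; a real wrapper would call the vendor API here."""
    def read_end_effector_position(self):
        return (0.0, 0.0, 0.0)
    def command_force(self, fx, fy, fz):
        print(f"commanded force: ({fx:.2f}, {fy:.2f}, {fz:.2f}) N")

def render_spring(robot: HapticRobot, k=200.0):
    x, y, z = robot.read_end_effector_position()
    robot.command_force(-k * x, -k * y, -k * z)   # same code for any wrapped robot

render_spring(SimulatedRobot())
```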
Olsson, Pontus; Nysjo, Fredrik; Carlbom, Ingrid B; Johansson, Stefan
2016-01-01
Piezoelectric motors offer an attractive alternative to electromagnetic actuators in portable haptic interfaces: they are compact, have a high force-to-volume ratio, and can operate with limited or no gearing. However, the choice of a piezoelectric motor type is not obvious due to differences in performance characteristics. We present our evaluation of two commercial, operationally different, piezoelectric motors acting as actuators in two kinesthetic haptic grippers, a walking quasi-static motor and a traveling wave ultrasonic motor. We evaluate each gripper's ability to display common virtual objects including springs, dampers, and rigid walls, and conclude that the walking quasi-static motor is superior at low velocities. However, for applications where high velocity is required, traveling wave ultrasonic motors are a better option.
Measuring the Impact of Haptic Feedback Using the SOLO Taxonomy
ERIC Educational Resources Information Center
Minogue, James; Jones, Gail
2009-01-01
The application of Biggs' and Collis' Structure of Observed Learning Outcomes taxonomy in the evaluation of student learning about cell membrane transport via a computer-based learning environment is described in this study. Pre-test-post-test comparisons of student outcome data (n = 80) were made across two groups of randomly assigned students:…
NASA Astrophysics Data System (ADS)
Vrublevskis, J.; Duncan, S.; Berthoud, L.; Bowman, P.; Hills, R.; McCulloch, Y.; Pisla, D.; Vaida, C.; Gherman, B.; Hofbaur, M.; Dieber, B.; Neythalath, N.; Smith, C.; van Winnendael, M.; Duvet, L.
2018-04-01
In order to avoid the use of 'double walled' gloves, a haptic feedback Remote Manipulation (RM) system rather than a gloved isolator is needed inside a Double Walled Isolator (DWI) to handle a sample returned from Mars.
Error amplification to promote motor learning and motivation in therapy robotics.
Shirzad, Navid; Van der Loos, H F Machiel
2012-01-01
To study the effects of different feedback error amplification methods on a subject's upper-limb motor learning and affect during a point-to-point reaching exercise, we developed a real-time controller for a robotic manipulandum. The reaching environment was visually distorted by implementing a thirty-degree rotation between the coordinate systems of the robot's end-effector and the visual display. Feedback error amplification was provided to subjects as they trained to reach within the visually rotated environment. Error amplification was provided either visually or through both haptic and visual means, each method with two different amplification gains. Subjects' performance (i.e., trajectory error) and self-reports to a questionnaire were used to study the speed and amount of adaptation promoted by each error amplification method and subjects' emotional changes. We found that providing haptic and visual feedback promotes faster adaptation to the distortion and increases subjects' satisfaction with the task, leading to a higher level of attentiveness during the exercise. This finding can be used to design a novel exercise regimen, in which alternating between error amplification methods is used both to increase a subject's motor learning and to maintain a minimum level of motivational engagement in the exercise. In future experiments, we will test whether such exercise methods lead to a faster learning time and greater motivation to pursue a therapy exercise regimen.
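The two ingredients described above, a 30-degree visuomotor rotation and amplification of the trajectory error shown visually and optionally rendered as a haptic force, can be sketched as follows. The gains, the force law, and the variable names are assumptions for illustration, not the study's controller.

```python
# Minimal sketch (assumption): rotate the hand position by 30 degrees, amplify the
# deviation from the ideal path visually, and optionally feed it back as a force.
import numpy as np

ROT30 = np.array([[np.cos(np.pi / 6), -np.sin(np.pi / 6)],
                  [np.sin(np.pi / 6),  np.cos(np.pi / 6)]])

def feedback(hand_xy, ideal_xy, visual_gain=2.0, haptic_gain=30.0, use_haptics=True):
    cursor = ROT30 @ np.asarray(hand_xy)                       # rotated (distorted) cursor
    error = cursor - np.asarray(ideal_xy)                      # deviation from ideal path
    displayed_cursor = np.asarray(ideal_xy) + visual_gain * error  # visually amplified error
    force = haptic_gain * error if use_haptics else np.zeros(2)    # error-amplifying force
    return displayed_cursor, force

print(feedback(hand_xy=(0.10, 0.02), ideal_xy=(0.10, 0.00)))
```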
Review of surgical robotics user interface: what is the best way to control robotic surgery?
Simorov, Anton; Otte, R Stephen; Kopietz, Courtni M; Oleynikov, Dmitry
2012-08-01
As surgical robots begin to occupy a larger place in operating rooms around the world, continued innovation is necessary to improve our outcomes. A comprehensive review of current surgical robotic user interfaces was performed to describe the modern surgical platforms, identify the benefits, and address the issues of feedback and limitations of visualization. Most robots currently used in surgery employ a master/slave relationship, with the surgeon seated at a work-console, manipulating the master system and visualizing the operation on a video screen. Although enormous strides have been made to advance current technology to the point of clinical use, limitations still exist. A lack of haptic feedback to the surgeon and the inability of the surgeon to be stationed at the operating table are the most notable examples. The future of robotic surgery sees a marked increase in the visualization technologies used in the operating room, as well as in the robots' abilities to convey haptic feedback to the surgeon. This will allow unparalleled sensation for the surgeon and almost eliminate inadvertent tissue contact and injury. A novel design for a user interface will allow the surgeon to have access to the patient bedside, remaining sterile throughout the procedure, employ a head-mounted three-dimensional visualization system, and allow the most intuitive master manipulation of the slave robot to date.
Boos, Amy; Qiu, Qinyin; Fluet, Gerard G; Adamovich, Sergei V
2011-01-01
This study describes the design and feasibility testing of a hand rehabilitation system that provides haptic assistance for hand opening in moderate to severe hemiplegia while subjects attempt to perform bilateral hand movements. A cable-actuated exoskeleton robot assists the subjects in performing impaired finger movements but is controlled by movement of the unimpaired hand. In an attempt to combine the neurophysiological stimuli of bilateral movement and action observation during training, visual feedback of the impaired hand is replaced by feedback of the unimpaired hand, either by using a sagittally oriented mirror or a virtual reality setup with a pair of virtual hands presented on a flat screen and controlled by movement of the unimpaired hand, providing a visual image of the paretic hand moving normally. Joint angles for both hands are measured using data gloves. The system is programmed to maintain a symmetrical relationship between the two hands as they respond to commands to open and close simultaneously. Three persons with moderate to severe hemiplegia secondary to stroke trained with the system for eight 30- to 60-minute sessions without adverse events. Each demonstrated positive motor adaptations to training. The system was well tolerated by persons with moderate to severe upper extremity hemiplegia. Further testing of its effects on motor ability with a broader range of clinical presentations is indicated.
Doxon, Andrew J; Johnson, David E; Tan, Hong Z; Provancher, William R
2013-01-01
Many of the devices used in haptics research are over-engineered for the task and are designed with capabilities that go far beyond human perception levels. Designing devices that more closely match the limits of human perception will make them smaller, less expensive, and more useful. However, many device-centric perception thresholds have yet to be evaluated. To this end, three experiments were conducted, using a one-degree-of-freedom contact location feedback device in combination with a kinesthetic display, to provide a more explicit set of specifications for similar tactile-kinesthetic haptic devices. The first of these experiments evaluated the ability of humans to repeatedly localize tactile cues across the fingerpad. Subjects could localize cues to within 1.3 mm and showed bias toward the center of the fingerpad. The second experiment evaluated the minimum perceptible difference of backlash at the tactile element. Subjects were able to discriminate device backlash in excess of 0.46 mm on low-curvature models and 0.93 mm on high-curvature models. The last experiment evaluated the minimum perceptible difference of system delay between user action and device reaction. Subjects were able to discriminate delays in excess of 61 ms. The results from these studies can serve as the maximum (i.e., most demanding) device specifications for most tactile-kinesthetic haptic systems.
Haptic-assistive technologies for audition and vision sensory disabilities.
Sorgini, Francesca; Caliò, Renato; Carrozza, Maria Chiara; Oddo, Calogero Maria
2018-05-01
The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf-blind individuals. The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The database search for scientific publications was accompanied by a web search for commercial devices. Results were classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses were also carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory-disabled persons. The reviewed literature provides evidence that sensory substitution aids can mitigate, in part, the deficits in language learning, communication and navigation for deaf, blind and deaf-blind individuals, and that the tactile sense can be a means of communication to convey some kinds of information to sensory-disabled individuals. A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move towards miniaturized, custom-designed and low-cost haptic interfaces and towards integration with personal devices such as smartphones for a wider diffusion of sensory aids among disabled people. Implications for rehabilitation: Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. Sensory substitution systems for visual and hearing disabilities have a central role in the transmission of information for patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. Visual and auditory inputs are converted into haptic feedback via different actuation technologies. The information is presented in the form of static or dynamic stimulation of the skin. Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability. They constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems. Future research is oriented towards the optimization of the stimulation parameters together with the development of miniaturized, custom-designed and low-cost aids operating in synergy in networks, aiming to increase patients' acceptance of these technologies.
Palpation imaging using a haptic system for virtual reality applications in medicine.
Khaled, W; Reichling, S; Bruhns, O T; Boese, H; Baumann, M; Monkman, G; Egersdoerfer, S; Klein, D; Tunayar, A; Freimuth, H; Lorenz, A; Pessavento, A; Ermert, H
2004-01-01
In the field of medical diagnosis, there is a strong need to determine the mechanical properties of biological tissue, which are of histological and pathological relevance. Malignant tumors are significantly stiffer than the surrounding healthy tissue. One of the established diagnostic procedures is the palpation of body organs and tissue. Palpation is used to measure swelling, detect bone fracture, find and measure pulse, or locate changes in the pathological state of tissue and organs. Current medical practice routinely uses sophisticated diagnostic tests through magnetic resonance imaging (MRI), computed tomography (CT) and ultrasound (US) imaging. However, these cannot provide a direct measure of tissue elasticity. Last year we presented the concept of the first haptic sensor-actuator system to visualize and reconstruct the mechanical properties of tissue using ultrasonic elastography and a haptic display with electrorheological fluids. We developed a real-time strain imaging system for tumor diagnosis. It allows biopsies to be performed simultaneously with conventional ultrasound B-mode and strain imaging investigations. We deduce the relative mechanical properties by using finite element simulations and numerical solution models that solve the inverse problem. Various modifications of the haptic sensor-actuator system have been investigated. This haptic system has the potential to render substantial forces in real time, using a compact lightweight mechanism that can be applied to numerous areas including intraoperative navigation, telemedicine, teaching and telecommunication.
Culbertson, Heather; Kuchenbecker, Katherine J
2017-01-01
Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.
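The three model components named above (surface friction, a tapping transient at impact, and a speed- and force-scaled texture vibration) can be combined as in the sketch below. The functional forms and parameter values are assumptions for illustration, not the published data-driven rendering pipeline, which plays back models fit to recorded interaction data.

```python
# Minimal sketch (assumption): combine friction, a decaying tapping transient, and
# a texture vibration into the force rendered for a tool stroking a virtual surface.
import math

def surface_force(f_normal, tangential_speed, t_since_contact,
                  mu=0.3, transient_amp=1.5, transient_decay=80.0,
                  texture_gain=0.02, texture_freq=220.0):
    friction = mu * f_normal                                    # opposes tangential motion
    tapping = transient_amp * f_normal * math.exp(-transient_decay * t_since_contact)
    texture = (texture_gain * f_normal * tangential_speed *
               math.sin(2 * math.pi * texture_freq * t_since_contact))
    return friction, tapping + texture                          # (tangential, vibrotactile)

print(surface_force(f_normal=2.0, tangential_speed=0.05, t_since_contact=0.01))
```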
Evaluation of upper extremity robot-assistances in subacute and chronic stroke subjects.
Ziherl, Jaka; Novak, Domen; Olenšek, Andrej; Mihelj, Matjaž; Munih, Marko
2010-10-18
Robotic systems are becoming increasingly common in upper extremity stroke rehabilitation. Recent studies have already shown that the use of rehabilitation robots can improve recovery. This paper evaluates the effect of different modes of robot-assistances in a complex virtual environment on the subjects' ability to complete the task as well as on various haptic parameters arising from the human-robot interaction. The MIMICS multimodal system that includes the haptic robot HapticMaster and a dynamic virtual environment is used. The goal of the task is to catch a ball that rolls down a sloped table and place it in a basket above the table. Our study examines the influence of catching assistance, pick-and-place movement assistance and grasping assistance on the catching efficiency, placing efficiency and on movement-dependent parameters: mean reaching forces, deviation error, mechanical work and correlation between the grasping force and the load force. The results with groups of subjects (23 subacute hemiparetic subjects, 10 chronic hemiparetic subjects and 23 control subjects) showed that the assistance raises the catching efficiency and pick-and-place efficiency. The pick-and-place movement assistance greatly limits the movements of the subject and results in decreased work toward the basket. The correlation between the load force and the grasping force exists in a certain phase of the movement. The results also showed that the stroke subjects without assistance and the control subjects performed similarly. The robot-assistances used in the study were found to be a possible way to raise the catching efficiency and efficiency of the pick-and-place movements in subacute and chronic subjects. The observed movement parameters showed that robot-assistances we used for our virtual task should be improved to maximize physical activity.
The role of haptic feedback in laparoscopic training using the LapMentor II.
Salkini, Mohamad W; Doarn, Charles R; Kiehl, Nicholai; Broderick, Timothy J; Donovan, James F; Gaitonde, Krishnanath
2010-01-01
Laparoscopic surgery has become the standard of care for many surgical diseases. Haptic (tactile) feedback (HFB) is considered an important component of laparoscopic surgery. Virtual reality simulation (VRS) is an alternative method to teach surgical skills to surgeons in training. Newer VRS trainers such as the Simbionix Lap Mentor II provide significantly improved tactile feedback. However, VRSs are expensive, and adding HFB software adds an estimated cost of $30,000 to the commercial price. The HFB provided by the Lap Mentor II has not been validated by an independent party. We used the Simbionix Lap Mentor II in this study to demonstrate the effect of adding an HFB mechanism to the VRS trainer. The study was approved by the University of Cincinnati Institutional Review Board. Twenty laparoscopically novice medical students were enrolled. Each student was asked to perform three different tasks on the Lap Mentor II and repeat each one five times. The chosen tasks demanded a significant amount of traction and counter-traction. The first task was to pull leaking tubes sufficiently and clip them. The second task was stretching a jelly plate enough to see its attachments to the floor and cut these attachments. In the third task, the trainee had to separate the gallbladder from its bed on the liver. The students were randomized into two groups to perform the tasks with and without HFB. We used accuracy, speed, and economy of movement as scales to compare performance between the two groups. The participants also completed a simple questionnaire covering age, sex, and experience with videogames. The two groups were comparable in age, sex, and videogame playing. No differences in accuracy, economy, or speed of hand movement were observed; adding HFB to the Lap Mentor II simulator did not contribute to any improvement in the performance of the trainees. Interestingly, we found that expert videogame players tended to have faster and more economical motion in their dominant hands, although performance accuracy was not significantly affected. The presence of HFB has less effect on the performance of novice trainees than previously thought. This may suggest that better HFB is still needed; alternatively, visual cues may compensate for the lack of haptics. Playing videogames has a positive impact on the economy and speed of dominant-hand motion without affecting its accuracy. Further research is needed to clarify the value of haptics to the expert surgeon and compare it to new trainees.
Haptic device for colonoscopy training simulator.
Kwon, Jun Yong; Woo, Hyun Soo; Lee, Doo Yong
2005-01-01
A new 2-DOF haptic device for a colonoscopy training simulator employing flexible endoscopes is developed. The user operates the device in the translational and roll directions. The device's folding guides keep the endoscope tube straight, which helps transmit large decoupled forces of the colonoscopy simulation to the user. The device also includes a mechanism to detect jiggling motion of the scope, allowing users to practice this important colonoscopy skill. A PD controller compensates for inertia and friction effects, providing users with a more transparent sensation of the simulation.
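The abstract does not give the controller details, so the following is only a generic sketch of a PD force loop with inertia and Coulomb-friction feedforward of the kind that could make such a device feel transparent; the gains, inertia estimate, and friction estimate are illustrative assumptions rather than the device's identified parameters.

```python
def pd_compensated_motor_force(desired_force, measured_force, velocity, acceleration,
                               kp=1.2, kd=0.05, inertia_est=0.04, coulomb_est=0.3):
    """One control tick of a PD force loop with inertia/friction feedforward.

    desired_force  : force commanded by the colonoscopy simulation [N]
    measured_force : force sensed at the user interface [N]
    velocity, acceleration: measured scope translation [m/s, m/s^2]
    """
    error = desired_force - measured_force
    # Feedforward terms cancel the (estimated) device inertia and Coulomb friction
    sign_v = (velocity > 0) - (velocity < 0)
    feedforward = inertia_est * acceleration + coulomb_est * sign_v
    # Proportional action on the force error plus velocity damping
    return kp * error - kd * velocity + feedforward
```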
Anthro-Centric Multisensory Interface for Sensory Augmentation of Tele-Surgery (ACMI-SATS)
2010-09-01
...surgeon from perceiving useful kinesthetic feedback from direct interaction with the tissues present in traditional "open" procedures. Additionally... Kinesthetic and haptic signals in surgical applications are critical, and prior work with VEs has shown that errors increase without realistic... telepresence-related kinesthetic sensory interactions, while tactile will refer to more general or abstract tactual interactions.
ERIC Educational Resources Information Center
Lahav, Orly; Schloerb, David W.; Srinivasan, Mandayam A.
2015-01-01
Introduction: The BlindAid, a virtual system developed for orientation and mobility (O&M) training of people who are blind or have low vision, allows interaction with different virtual components (structures and objects) via auditory and haptic feedback. This research examined if and how the BlindAid that was integrated within an O&M…
Validation and learning in the Procedicus KSA virtual reality surgical simulator.
Ström, P; Kjellin, A; Hedman, L; Johnson, E; Wredmark, T; Felländer-Tsai, L
2003-02-01
Advanced simulator training within medicine is a rapidly growing field. Virtual reality simulators are being introduced as cost-saving educational tools that can also increase patient safety. Fifteen medical students were included in the study. For 10 medical students, performance was monitored before and after 1 h of training in two endoscopic simulators (the Procedicus KSA with haptic feedback and anatomical graphics, and the established MIST simulator without this haptic feedback and graphics). Five medical students performed 50 tests in the Procedicus KSA in order to analyze learning curves. One of these five medical students performed multiple training sessions during 2 weeks and completed more than 300 tests. There was a significant improvement after 1 h of training regarding time, movement economy, and total score. The results in the two simulators were highly correlated. Our results on the use of surgical simulators as a pedagogical tool in medical student training are encouraging, showing rapid learning curves. Our suggestion is to introduce endoscopic simulator training in undergraduate medical education during the course in surgery, when motivation is high and before the development of "negative stereotypes" and incorrect practices.
Bortone, Ilaria; Leonardis, Daniele; Solazzi, Massimiliano; Procopio, Caterina; Crecchi, Alessandra; Bonfiglio, Luca; Frisoli, Antonio
2017-07-01
The past decade has seen the emergence of rehabilitation treatments using virtual reality environments. One of the advantages of using this technology is the potential to create positive motivation by means of engaging environments and tasks shaped in the form of serious games. In this work, we propose a novel Neuro Rehabilitation System for children with movement disorders, based on serious games in immersive virtual reality with haptic feedback. The system design aims to enhance the involvement and engagement of patients, to provide congruent multisensory afferent feedback during motor exercises, and to benefit from the flexibility of virtual reality in adapting exercises to the patient's needs. We present a feasibility study of the method, conducted through an experimental rehabilitation session in a group of 4 children with Cerebral Palsy and Developmental Dyspraxia, 4 typically developing children and 4 healthy adults. Subjects and patients were able to accomplish the proposed rehabilitation session, and the average performance of the motor exercises in patients was lower than, although comparable to, that of healthy subjects. Together with positive comments reported by the children after the rehabilitation session, the results are encouraging for application of the method in a prolonged rehabilitation treatment.
Force Exertion Capacity Measurements in Haptic Virtual Environments
ERIC Educational Resources Information Center
Munih, Marko; Bardorfer, Ales; Ceru, Bojan; Bajd, Tadej; Zupan, Anton
2010-01-01
An objective test for evaluating the functional status of the upper limbs (ULs) in patients with muscular dystrophy (MD) is presented. The method allows for quantitative assessment of the UL functional state with an emphasis on force exertion capacity. The experimental measurement setup and the methodology for the assessment of maximal exertable force…
Intuitive tactile zooming for graphics accessed by individuals who are blind and visually impaired.
Rastogi, Ravi; Pawluk, T V Dianne; Ketchum, Jessica
2013-07-01
One possibility of providing access to visual graphics for those who are visually impaired is to present them tactually: unfortunately, details easily available to vision need to be magnified to be accessible through touch. For this, we propose an "intuitive" zooming algorithm to solve potential problems with directly applying visual zooming techniques to haptic displays that sense the current location of a user on a virtual diagram with a position sensor and, then, provide the appropriate local information either through force or tactile feedback. Our technique works by determining and then traversing the levels of an object tree hierarchy of a diagram. In this manner, the zoom steps adjust to the content to be viewed, avoid clipping and do not zoom when no object is present. The algorithm was tested using a small, "mouse-like" display with tactile feedback on pictures representing houses in a community and boats on a lake. We asked the users to answer questions related to details in the pictures. Comparing our technique to linear and logarithmic step zooming, we found a significant increase in the correctness of the responses (odds ratios of 2.64:1 and 2.31:1, respectively) and usability (differences of 36% and 19%, respectively) using our "intuitive" zooming technique.
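The zooming algorithm works by traversing an object-tree hierarchy of the diagram rather than applying fixed zoom steps. The sketch below illustrates the general idea under simplifying assumptions (each node stores a bounding box and its children; zooming in descends to the child under the probed position and is a no-op over empty space, zooming out ascends to the parent). It is a toy reconstruction, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DiagramNode:
    name: str
    bbox: Tuple[float, float, float, float]          # (xmin, ymin, xmax, ymax)
    children: List["DiagramNode"] = field(default_factory=list)
    parent: Optional["DiagramNode"] = None

def contains(bbox, x, y):
    xmin, ymin, xmax, ymax = bbox
    return xmin <= x <= xmax and ymin <= y <= ymax

def zoom_in(current: DiagramNode, x: float, y: float) -> DiagramNode:
    """Descend to the child under the probed position; no-op if none is present,
    so the display never zooms into empty space or clips a single object."""
    for child in current.children:
        if contains(child.bbox, x, y):
            return child
    return current

def zoom_out(current: DiagramNode) -> DiagramNode:
    """Ascend one level of the hierarchy (towards the whole diagram)."""
    return current.parent if current.parent is not None else current

# Tiny usage: a lake scene containing one boat
root = DiagramNode("lake", (0, 0, 100, 100))
boat = DiagramNode("boat", (40, 40, 60, 55), parent=root)
root.children.append(boat)
view = zoom_in(root, 50, 45)      # -> boat
view = zoom_out(view)             # -> back to the whole lake
```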
Percutaneous spinal fixation simulation with virtual reality and haptics.
Luciano, Cristian J; Banerjee, P Pat; Sorenson, Jeffery M; Foley, Kevin T; Ansari, Sameer A; Rizzi, Silvio; Germanwala, Anand V; Kranzler, Leonard; Chittiboina, Prashant; Roitberg, Ben Z
2013-01-01
In this study, we evaluated the use of a part-task simulator with 3-dimensional and haptic feedback as a training tool for percutaneous spinal needle placement. The objective was to evaluate learning effectiveness, in terms of entry-point/target-point accuracy of percutaneous spinal needle placement, on a high-performance augmented-reality and haptic technology workstation with the ability to control the duration of computer-simulated fluoroscopic exposure, thereby simulating an actual clinical situation. Sixty-three fellows and residents performed needle placement on the simulator. A virtual needle was percutaneously inserted into a virtual patient's thoracic spine derived from an actual patient computed tomography data set. Ten of 126 needle placement attempts by 63 participants ended in failure, for a failure rate of 7.93%. Across all 126 needle insertions, the average error (15.69 vs 13.91), average fluoroscopy exposure (4.6 vs 3.92), and average individual performance score (32.39 vs 30.71) improved from the first to the second attempt. Performance accuracy yielded P = .04 from a 2-sample t test in which the rejected null hypothesis assumes no improvement in performance accuracy from the first to the second attempt in the test session. The experiments showed evidence (P = .04) of improvement in performance accuracy from the first to the second percutaneous needle placement attempt. This result, combined with previous learning retention and/or face validity results of using the simulator for open thoracic pedicle screw placement and ventriculostomy catheter placement, supports the efficacy of augmented reality and haptics simulation as a learning tool.
A recursive Bayesian updating model of haptic stiffness perception.
Wu, Bing; Klatzky, Roberta L
2018-06-01
Stiffness of many materials follows Hooke's Law, but the mechanism underlying the haptic perception of stiffness is not as simple as it seems in the physical definition. The present experiments support a model by which stiffness perception is adaptively updated during dynamic interaction. Participants actively explored virtual springs and estimated their stiffness relative to a reference. The stimuli were simulations of linear springs or nonlinear springs created by modulating a linear counterpart with low-amplitude, half-cycle (Experiment 1) or full-cycle (Experiment 2) sinusoidal force. Experiment 1 showed that subjective stiffness increased (decreased) as a linear spring was positively (negatively) modulated by a half-sinewave force. In Experiment 2, an opposite pattern was observed for full-sinewave modulations. Modeling showed that the results were best described by an adaptive process that sequentially and recursively updated an estimate of stiffness using the force and displacement information sampled over trajectory and time. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
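A minimal way to express such sequential, recursive updating for a Hookean observation model (force = stiffness x displacement, plus noise) is a scalar Kalman-style update of the stiffness estimate and its uncertainty, sketched below. This is only an illustration of the recursive-estimation idea; it is not the authors' fitted model, and the prior and noise values are assumptions.

```python
def update_stiffness(k_est, k_var, displacement, force, obs_noise_var=0.05):
    """One recursive Bayesian (scalar Kalman) update of a stiffness estimate.

    Observation model: force = k * displacement + noise.
    Returns the posterior mean and variance of k.
    """
    if abs(displacement) < 1e-9:        # no information when the spring is not deflected
        return k_est, k_var
    innovation = force - k_est * displacement
    s = displacement**2 * k_var + obs_noise_var    # innovation variance
    gain = k_var * displacement / s                # Kalman gain for k
    k_new = k_est + gain * innovation
    var_new = (1.0 - gain * displacement) * k_var
    return k_new, var_new

# Usage: fold in (displacement, force) samples gathered along an exploration trajectory
k, v = 100.0, 50.0          # prior mean [N/m] and variance
for x, f in [(0.01, 1.3), (0.02, 2.5), (0.015, 1.8)]:
    k, v = update_stiffness(k, v, x, f)
print(k, v)
```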
Analysis of hand contact areas and interaction capabilities during manipulation and exploration.
Gonzalez, Franck; Gosselin, Florian; Bachta, Wael
2014-01-01
Manual human-computer interfaces for virtual reality are designed to allow an operator to interact with a computer simulation as naturally as possible. Dexterous haptic interfaces are the best suited for this goal: they give intuitive and efficient control of the environment with haptic and tactile feedback. This paper is aimed at helping in the choice of the interaction areas to be taken into account in the design of such interfaces. The literature dealing with hand interactions is first reviewed in order to identify the contact areas involved in exploration and manipulation tasks. Their frequencies of use are then extracted from existing recordings. The results are gathered in an original graphical interaction map that allows simple visualization of the way the hand is used, and compared with a map of mechanoreceptor densities. An interaction tree, mapping the relative amount of actions made available through the use of a given contact area, is then built and correlated with the losses of hand function induced by amputations. Finally, a rating of some existing haptic interfaces and guidelines for their design are provided to illustrate a possible use of the developed graphical tools.
Hu, Zhongwei; Sun, Wei; Zhang, Bi
2012-01-01
Understanding biomechanical responses during soft tissue cutting is important for developing surgical simulators and robot-assisted surgery with haptic feedback. The biomechanics involved in the aortic tissue cutting process is largely unknown. In this study, porcine ascending aorta was selected as a representative aortic tissue, and tissue cutting experiments were performed using a novel tissue cutting apparatus. The tissue cutting responses under various cutting conditions were investigated, including differing initial tissue lateral holding force and distance, cutting speed, cutter inclination angle, tissue anatomical orientation and thickness. The results from this study suggest that a “break-in” cutting force of about 4 – 12 N, a cutter “break-in” distance of 5 – 15 mm, and a continuous cutting force of 2 – 4 N were needed to cut through the porcine ascending aorta tissue. For all testing conditions investigated in this study, the cutting force vs. the cutter displacement curves exhibited similar characteristics. More importantly, this study demonstrated that tissue cutting involving one or more of the following conditions: a larger lateral holding force, a smaller lateral hold distance, a higher cutting speed or a larger inclination angle, could result in a smaller “break in” cutting force and a smaller “break-in” distance. In addition, it was found that the cutting force in the vessel longitudinal direction was larger than that in the circumferential direction. There was a strong correlation between the tissue thickness and the cutting force. The experimental results reported in this study could provide a basis for understanding the characteristic response of aortic tissue to scalpel cutting, and offer insight into the development of surgical simulators. PMID:23262306
Interpersonal synergies: static prehension tasks performed by two actors.
Solnik, Stanislaw; Reschechtko, Sasha; Wu, Yen-Hsun; Zatsiorsky, Vladimir M; Latash, Mark L
2016-08-01
We investigated multidigit synergies stabilizing components of the resultant force vector during joint performance of a static prehension task by two persons, as compared to similar tasks performed by a single person using both hands. Subjects transferred the instrumented handle from the right hand to the left hand (one-person condition) or passed the handle to another person (two-person condition) while keeping the handle's position and orientation stationary. Only three digits were involved per hand (the thumb, the index finger, and the middle finger); the forces and moments produced by the digits were measured by six-component sensors. We estimated the performance-stabilizing synergies within the uncontrolled manifold framework by quantifying the intertrial variance structure of digit forces and moments. The analysis was performed at three levels: between hands, between the virtual finger and virtual thumb (imagined digits producing the same mechanical variables as the corresponding actual digits combined) produced by the two hands (in both interpersonal and intrapersonal conditions), and between the thumb and virtual finger for one hand only. Additionally, we performed correlation and phase synchronization analyses of resultant tangential forces and internal normal forces. Overall, the one-person conditions were characterized by a higher amount of intertrial variance that did not affect resultant normal force components, higher internal components of the normal forces, and stronger synchronization of the normal forces generated by the two hands. Our observations suggest that in two-person tasks, when participants try to achieve a common mechanical outcome, the performance-stabilizing synergies depend on non-visual information exchange, possibly via the haptic and proprioceptive systems. Therefore, synergies quantified in tasks using visual feedback only may not be generalizable to more natural tasks.
The Perception of Cooperativeness Without Any Visual or Auditory Communication.
Chang, Dong-Seon; Burger, Franziska; Bülthoff, Heinrich H; de la Rosa, Stephan
2015-12-01
Perceiving social information such as the cooperativeness of another person is an important part of human interaction. But can people perceive the cooperativeness of others even without any visual or auditory information? In a novel experimental setup, we connected two people with a rope and made them accomplish a point-collecting task together while they could not see or hear each other. We observed a consistently emerging turn-taking behavior in the interactions and installed a confederate in a subsequent experiment who either minimized or maximized this behavior. Participants experienced this only through the haptic force-feedback of the rope and made evaluations about the confederate after each interaction. We found that perception of cooperativeness was significantly affected only by the manipulation of this turn-taking behavior. Gender- and size-related judgments also significantly differed. Our results suggest that people can perceive social information such as the cooperativeness of other people even in situations where possibilities for communication are minimal.
Virtual suturing simulation based on commodity physics engine for medical learning.
Choi, Kup-Sze; Chan, Sze-Ho; Pang, Wai-Man
2012-06-01
Development of virtual-reality medical applications is usually a complicated and labour intensive task. This paper explores the feasibility of using commodity physics engine to develop a suturing simulator prototype for manual skills training in the fields of nursing and medicine, so as to enjoy the benefits of rapid development and hardware-accelerated computation. In the prototype, spring-connected boxes of finite dimension are used to simulate soft tissues, whereas needle and thread are modelled with chained segments. Spherical joints are used to simulate suture's flexibility and to facilitate thread cutting. An algorithm is developed to simulate needle insertion and thread advancement through the tissue. Two-handed manipulations and force feedback are enabled with two haptic devices. Experiments on the closure of a wound show that the prototype is able to simulate suturing procedures at interactive rates. The simulator is also used to study a curvature-adaptive suture modelling technique. Issues and limitations of the proposed approach and future development are discussed.
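The tissue model above is a lattice of spring-connected boxes, so the force the physics engine evaluates each step between two linked nodes is essentially a spring-damper law. A minimal sketch of that per-link computation follows; the stiffness, damping, and rest-length values are illustrative, not the prototype's parameters.

```python
import numpy as np

def spring_damper_force(p_a, p_b, v_a, v_b, rest_length, k=200.0, c=2.0):
    """Force on node A from the spring-damper linking nodes A and B.

    p_a, p_b: 3-D positions; v_a, v_b: 3-D velocities (numpy arrays).
    """
    d = p_b - p_a
    length = np.linalg.norm(d)
    if length < 1e-9:
        return np.zeros(3)
    direction = d / length
    stretch = length - rest_length                    # positive when extended
    rel_speed = np.dot(v_b - v_a, direction)          # rate of change of spring length
    return (k * stretch + c * rel_speed) * direction  # points from A towards B when stretched
```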
Hongyi Xu; Barbic, Jernej
2017-01-01
We present an algorithm for fast continuous collision detection between points and signed distance fields, and demonstrate how to robustly use it for 6-DoF haptic rendering of contact between objects with complex geometry. Continuous collision detection is often needed in computer animation, haptics, and virtual reality applications, but has so far only been investigated for polygon (triangular) geometry representations. We demonstrate how to robustly and continuously detect intersections between points and level sets of the signed distance field. We suggest using an octree subdivision of the distance field for fast traversal of distance field cells. We also give a method to resolve continuous collisions between point clouds organized into a tree hierarchy and a signed distance field, enabling rendering of contact between rigid objects with complex geometry. We investigate and compare two 6-DoF haptic rendering methods now applicable to point-versus-distance field contact for the first time: continuous integration of penalty forces, and a constraint-based method. An experimental comparison to discrete collision detection demonstrates that the continuous method is more robust and can correctly resolve collisions even under high velocities and during complex contact.
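For the penalty-based rendering variant, the contact force on a penetrating point can be derived directly from the signed distance value and its gradient. The sketch below shows that basic computation for a single point against an analytic sphere SDF standing in for the paper's octree-accelerated field; the continuous collision detection and the constraint-based alternative are not reproduced, and the stiffness value is an assumption.

```python
import numpy as np

def sdf_sphere(p, center=np.zeros(3), radius=0.05):
    """Example signed distance field: negative inside a sphere of the given radius."""
    return np.linalg.norm(p - center) - radius

def penalty_force(point, sdf=sdf_sphere, stiffness=2000.0, eps=1e-5):
    """Penalty force pushing a penetrating point out along the SDF gradient."""
    d = sdf(point)
    if d >= 0.0:                        # outside or on the surface: no contact force
        return np.zeros(3)
    # Finite-difference gradient approximates the outward surface normal
    grad = np.array([
        sdf(point + np.array([eps, 0, 0])) - sdf(point - np.array([eps, 0, 0])),
        sdf(point + np.array([0, eps, 0])) - sdf(point - np.array([0, eps, 0])),
        sdf(point + np.array([0, 0, eps])) - sdf(point - np.array([0, 0, eps])),
    ]) / (2 * eps)
    n = grad / (np.linalg.norm(grad) + 1e-12)
    return -stiffness * d * n           # -d > 0 inside, so the force points outward

print(penalty_force(np.array([0.0, 0.0, 0.03])))   # point 20 mm inside the sphere
```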
The role of three-dimensional visualization in robotics-assisted cardiac surgery
NASA Astrophysics Data System (ADS)
Currie, Maria; Trejos, Ana Luisa; Rayman, Reiza; Chu, Michael W. A.; Patel, Rajni; Peters, Terry; Kiaii, Bob
2012-02-01
Objectives: The purpose of this study was to determine the effect of three-dimensional (3D) versus two-dimensional (2D) visualization on the amount of force applied to mitral valve tissue during robotics-assisted mitral valve annuloplasty, and the time to perform the procedure in an ex vivo animal model. In addition, we examined whether these effects are consistent between novices and experts in robotics-assisted cardiac surgery. Methods: A cardiac surgery test-bed was constructed to measure forces applied by the da Vinci surgical system (Intuitive Surgical, Sunnyvale, CA) during mitral valve annuloplasty. Both experts and novices completed robotics-assisted mitral valve annuloplasty with 2D and 3D visualization. Results: The mean time for both experts and novices to suture the mitral valve annulus and to tie sutures using 3D visualization was significantly less than that required to suture the mitral valve annulus and to tie sutures using 2D vision (p < 0.01). However, there was no significant difference in the maximum force applied by novices to the mitral valve during suturing (p = 0.3) and suture tying (p = 0.6) using either 2D or 3D visualization. Conclusion: This finding suggests that 3D visualization does not fully compensate for the absence of haptic feedback in robotics-assisted cardiac surgery. Keywords: robotics-assisted surgery, visualization, cardiac surgery
Development and control of a magnetorheological haptic device for robot assisted surgery.
Shokrollahi, Elnaz; Goldenberg, Andrew A; Drake, James M; Eastwood, Kyle W; Kang, Matthew
2017-07-01
A prototype magnetorheological (MR) fluid-based actuator has been designed for tele-robotic surgical applications. This device is capable of generating forces up to 47 N, with input currents ranging from 0 to 1.5 A. We begin by outlining the physical design of the device, and then discuss a novel nonlinear model of the device's behavior. The model was developed using the Hammerstein-Wiener (H-W) nonlinear black-box technique and is intended to accurately capture the hysteresis behavior of the MR-fluid. Several experiments were conducted on the device to collect estimation and validation datasets to construct the model and assess its performance. Different estimating functions were used to construct the model, and their effectiveness is assessed based on goodness-of-fit and final-prediction-error measurements. A sigmoid network was found to have a goodness-of-fit of 95%. The model estimate was then used to tune a PID controller. Two control schemes were proposed to eliminate the hysteresis behavior present in the MR fluid device. One method uses a traditional force feedback control loop and the other is based on measuring the magnetic field using a Hall-effect sensor embedded within the device. The Hall-effect sensor scheme was found to be superior in terms of cost, simplicity and real-time control performance compared to the force control strategy.
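A Hammerstein-Wiener model sandwiches a linear dynamic block between static input and output nonlinearities. The toy structure below illustrates that composition for a current-to-force actuator (a sigmoid input block, a first-order discrete lag, and a polynomial output block scaled to an approximately 0-47 N range); the sigmoid-network blocks and coefficients identified in the paper are not reproduced, so every number here is an illustrative assumption.

```python
import numpy as np

def hammerstein_wiener(current_amps, a=0.9, b=0.1):
    """Simulate a toy Hammerstein-Wiener force model for an input current sequence.

    Structure: static sigmoid nonlinearity -> first-order linear dynamics ->
    static polynomial nonlinearity. All coefficients are illustrative.
    """
    forces = []
    x = 0.0                                          # state of the linear block
    for u in current_amps:
        w = 1.0 / (1.0 + np.exp(-4.0 * (u - 0.75)))  # input nonlinearity (saturation)
        x = a * x + b * w                            # linear dynamics (first-order lag)
        y = 47.0 * (0.2 * x + 0.8 * x**2)            # output nonlinearity
        forces.append(y)
    return np.array(forces)

print(hammerstein_wiener(np.linspace(0.0, 1.5, 5)))
```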
NASA Astrophysics Data System (ADS)
Oyaga Landa, Francisco Javier; Deán-Ben, Xosé Luís.; Montero de Espinosa, Francisco; Razansky, Daniel
2017-03-01
Lack of haptic feedback during laser surgery hampers control of the incision depth, leading to a high risk of undesired tissue damage. Here we present a new feedback sensing method that accomplishes non-contact, real-time monitoring of laser ablation procedures by detecting shock waves emanating from the ablation spot with air-coupled transducers. Experiments in soft and hard tissue samples attained high reproducibility in real-time depth estimation of the laser-induced cuts. The advantages derived from the non-contact nature of the suggested monitoring approach are expected to greatly promote the general applicability of laser-based surgeries.
Ranky, Richard G; Sivak, Mark L; Lewis, Jeffrey A; Gade, Venkata K; Deutsch, Judith E; Mavroidis, Constantinos
2014-06-05
Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges in implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling, specifically recruitment of both extremities and exercising at a high intensity. In this paper, a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition. The complete bicycle system includes: a) handlebars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing for the handle and pedal systems is presented for calibration of the sensors detecting force and angle. The modular mechatronic kit for exercise bicycles was evaluated in bench tests and human tests. Bench tests performed on the sensorized handlebars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the riders' lower extremities. The VRACK system, a modular virtual reality mechatronic rehabilitation kit, was designed to convert most stationary bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders.
Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.
Kim, Donghun; Kim, Kwangtaek; Lee, Sangyoun
2014-06-13
In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger-pointing gesture. The system is developed in two stages: development of a visual information delivery assistant (VIDA) with a stereo camera, and the addition of a tactile feedback interface with dual actuators for guidance and distance feedback. In the first stage, the user's pointing finger is automatically detected using color and disparity data from stereo images, and a 3D pointing direction of the finger is then estimated from its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and its distance is estimated in real time. For the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger-pointing gesture and the tactile distance feedback is fully identifiable to blind users.
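The distance estimate underlying such a stereo-based virtual cane typically comes from triangulating the disparity between the two rectified views, Z = f * B / d, after which the distance can be quantized into a small set of identifiable tactile patterns. The sketch below shows those two steps; the focal length, baseline, and distance bins are illustrative assumptions, not the parameters of the VIDA system.

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Triangulated depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        return float("inf")                # no valid stereo match
    return focal_px * baseline_m / disparity_px

def tactile_code(distance_m):
    """Map the estimated distance to one of a few identifiable tactile patterns."""
    if distance_m < 1.0:
        return "fast_pulse"                # imminent obstacle
    if distance_m < 2.5:
        return "slow_pulse"
    return "single_tick"

print(tactile_code(depth_from_disparity(60)))   # 700 * 0.12 / 60 = 1.4 m -> "slow_pulse"
```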
Immersion of virtual reality for rehabilitation - Review.
Rose, Tyler; Nam, Chang S; Chen, Karen B
2018-05-01
Virtual reality (VR) shows promise in healthcare applications because it presents patients with an immersive, often entertaining, approach to accomplishing the goal of improved performance. Eighteen studies were reviewed to understand human performance and health outcomes after utilizing VR rehabilitation systems. We aimed to understand: (1) the influence of immersion on VR performance and health outcomes; (2) the relationship between enjoyment and potential patient adherence to a VR rehabilitation routine; and (3) the influence of haptic feedback on performance in VR. Performance measures including postural stability, navigation task performance, and joint mobility showed varying relations to immersion. Limited data did not allow a solid conclusion on the link between enjoyment and adherence, but patient enjoyment and willingness to participate were reported in care plans that incorporate VR. Finally, different haptic devices such as gloves and controllers provided both strengths and weaknesses in areas such as movement velocity, movement accuracy, and path efficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.
Programmable prostate palpation simulator using property-changing pneumatic bladder.
Talhan, Aishwari; Jeon, Seokhee
2018-05-01
The currently available prostate palpation simulators are based on either a physical mock-up or pure virtual simulation. Both cases have their inherent limitations. The former lacks flexibility in presenting abnormalities and scenarios because of the static nature of the mock-up and has usability issues because the prostate model must be replaced in different scenarios. The latter has realism issues, particularly in haptic feedback, because of the very limited performance of haptic hardware and inaccurate haptic simulation. This paper presents a highly flexible and programmable simulator with high haptic fidelity. Our new approach is based on a pneumatic-driven, property-changing, silicone prostate mock-up that can be embedded in a human torso mannequin. The mock-up has seven pneumatically controlled, multi-layered bladder cells to mimic the stiffness, size, and location changes of nodules in the prostate. The size is controlled by inflating the bladder with positive pressure in the chamber, and a hard nodule can be generated using the particle jamming technique; the fine sand in the bladder becomes stiff when it is vacuumed. The programmable valves and system identification process enable us to precisely control the size and stiffness, which results in a simulator that can realistically generate many different diseases without replacing anything. The three most common abnormalities in a prostate are selected for demonstration, and multiple progressive stages of each abnormality are carefully designed based on medical data. A human perception experiment is performed by actual medical professionals and confirms that our simulator exhibits higher realism and usability than do the conventional simulators. Copyright © 2018 Elsevier Ltd. All rights reserved.
Yim, Sunghoon; Jeon, Seokhee; Choi, Seungmoon
2016-01-01
In this paper, we present an extended data-driven haptic rendering method capable of reproducing force responses during pushing and sliding interaction on a large surface area. The main part of the approach is a novel input variable set for the training of an interpolation model, which incorporates the position of a proxy - an imaginary contact point on the undeformed surface. This allows us to estimate friction in both sliding and sticking states in a unified framework. Estimating the proxy position is done in real-time based on simulation using a sliding yield surface - a surface defining a border between the sliding and sticking regions in the external force space. During modeling, the sliding yield surface is first identified via an automated palpation procedure. Then, through manual palpation on a target surface, input data and resultant force data are acquired. The data are used to build a radial basis interpolation model. During rendering, this input-output mapping interpolation model is used to estimate force responses in real-time in accordance with the interaction input. Physical performance evaluation demonstrates that our approach achieves reasonably high estimation accuracy. A user study also shows plausible perceptual realism under diverse and extensive exploration.
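At rendering time, the interaction inputs (including the proxy position) are fed to a radial basis function interpolation model that returns the force estimate. The sketch below shows a generic Gaussian-RBF fit-and-query pipeline for such an input-to-force mapping; the actual input variable set, kernel, and training procedure of the paper are not reproduced, and the synthetic data are purely illustrative.

```python
import numpy as np

class RBFForceModel:
    """Gaussian radial basis interpolation from interaction inputs to a force value."""

    def __init__(self, inputs, forces, sigma=1.0, reg=1e-6):
        # inputs: (n_samples, n_features); forces: (n_samples,)
        self.centers = np.asarray(inputs, dtype=float)
        self.sigma = sigma
        phi = self._kernel(self.centers)
        self.weights = np.linalg.solve(phi + reg * np.eye(len(forces)),
                                       np.asarray(forces, dtype=float))

    def _kernel(self, x):
        d2 = ((x[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def predict(self, x):
        return self._kernel(np.atleast_2d(x)) @ self.weights

# Toy usage: inputs could be (probe position, proxy position, penetration depth, ...)
rng = np.random.default_rng(0)
train_x = rng.random((50, 4))
train_f = train_x @ np.array([2.0, -1.0, 0.5, 3.0])     # synthetic force responses
model = RBFForceModel(train_x, train_f, sigma=0.5)
print(model.predict(train_x[0]))
```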
Simple display system of mechanical properties of cells and their dispersion.
Shimizu, Yuji; Kihara, Takanori; Haghparast, Seyed Mohammad Ali; Yuba, Shunsuke; Miyake, Jun
2012-01-01
The mechanical properties of cells are unique indicators of their states and functions. However, it is difficult to appreciate the magnitude of these mechanical properties, owing to the small size of cells and the broad distribution of their mechanical properties. Here, we developed a simple virtual reality system for presenting the mechanical properties of cells and their dispersion using a haptic device and a PC. The system simulates atomic force microscopy (AFM) nanoindentation experiments on floating cells in a virtual environment. An operator can virtually position the AFM spherical probe over a round cell with the haptic handle on the PC monitor and feel the force interaction. The Young's modulus of mesenchymal stem cells and HEK293 cells in the floating state was measured by AFM. The distribution of the Young's modulus of these cells was broad and followed a log-normal pattern. To represent the mechanical properties together with the cell-to-cell variance, we used a log-normally distributed random number determined by the mode and variance of the Young's modulus of these cells. The represented Young's modulus was fixed for each touching event between the probe surface and the cell object, and the force generated by the haptic device was calculated using a Hertz model from the indentation depth and that Young's modulus value. Using this system, we can feel the mechanical properties and their dispersion in each cell type in real time. This system will help us not only to recognize the degrees of mechanical properties of diverse cells but also to share them with others.
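The displayed force in such a system follows the Hertz contact model for a rigid sphere indenting an elastic half-space, F = (4/3) * E / (1 - nu^2) * sqrt(R) * d^(3/2), with the Young's modulus E drawn from a log-normal distribution at each touch event. The sketch below shows those two steps; the probe radius, Poisson's ratio, and distribution parameters are illustrative assumptions (the paper derives the distribution from the measured mode and variance).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_youngs_modulus(log_mean=np.log(1000.0), log_sigma=0.6):
    """Draw a Young's modulus [Pa] from a log-normal distribution.
    (Parametrized here by the underlying normal's mean/sigma for simplicity.)"""
    return rng.lognormal(mean=log_mean, sigma=log_sigma)

def hertz_force(indentation_m, youngs_pa, probe_radius_m=2.5e-6, poisson=0.5):
    """Hertz contact force of a rigid sphere indenting an elastic half-space."""
    if indentation_m <= 0.0:
        return 0.0
    e_eff = youngs_pa / (1.0 - poisson**2)
    return (4.0 / 3.0) * e_eff * np.sqrt(probe_radius_m) * indentation_m**1.5

# One touch event: fix E at contact, then scale the force with indentation depth
E = sample_youngs_modulus()
for depth in (0.2e-6, 0.5e-6, 1.0e-6):              # metres
    print(f"{depth*1e6:.1f} um -> {hertz_force(depth, E)*1e9:.2f} nN")
```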
A review of invasive and non-invasive sensory feedback in upper limb prostheses.
Svensson, Pamela; Wijk, Ulrika; Björkman, Anders; Antfolk, Christian
2017-06-01
The constant challenge of restoring sensory feedback in prosthetic hands has produced several research solutions, but virtually none has reached clinical fruition. A prosthetic hand with sensory feedback that closely imitates an intact hand and provides a natural feeling may induce the prosthetic hand to be included in the body image and also reinforce control of the prosthesis. Areas covered: This review presents non-invasive sensory feedback systems such as mechanotactile, vibrotactile, electrotactile and combinational systems that combine these modalities (multi-haptic feedback). Invasive sensory feedback has been tried less often, because of the inherent risk, but it has been shown to successfully restore some afferent channels. In this review, invasive methods are also discussed, including both extraneural and intraneural electrodes, such as cuff electrodes and transverse intrafascicular multichannel electrodes. The focus of the review is on non-invasive methods of providing sensory feedback to upper-limb amputees. Expert commentary: Invoking embodiment has been shown to be important for the control of a prosthesis and its acceptance by prosthesis wearers. It is a challenge to provide conscious feedback that covers the lost sensibility of a hand without overwhelming or confusing the user, and to integrate the technology within the constraints of a wearable prosthesis.
Opportunities of hydrostatically coupled dielectric elastomer actuators for haptic interfaces
NASA Astrophysics Data System (ADS)
Carpi, Federico; Frediani, Gabriele; De Rossi, Danilo
2011-04-01
As a means to improve the versatility and safety of dielectric elastomer actuators (DEAs) for several fields of application, so-called 'hydrostatically coupled' DEAs (HC-DEAs) have recently been described. HC-DEAs are based on an incompressible fluid that mechanically couples a DE-based active part to a passive part interfaced to the load, so as to enable hydrostatic transmission. This paper presents ongoing developments of HC-DEAs and potential applications in the field of haptics. Three specific examples are considered. The first deals with a wearable tactile display used to provide users with tactile feedback during electronic navigation in virtual environments. The display consists of HC-DEAs arranged in contact with the fingertips. As a second example, an up-scaled prototype version of an 8-dot refreshable cell for dynamic Braille displays is shown. Each Braille dot consists of a miniature HC-DEA with a diameter lower than 2 mm. The third example refers to a device for finger rehabilitation, conceived to work as a sort of active version of a rehabilitation squeezing ball. The device is designed to dynamically change its compliance according to an electric control signal. The three examples of applications are intended to show the potential of the new technology and the prospective opportunities for haptic interfaces.
ProMIS augmented reality training of laparoscopic procedures face validity.
Botden, Sanne M B I; Buzink, Sonja N; Schijven, Marlies P; Jakimowicz, Jack J
2008-01-01
Conventional video trainers lack the ability to assess the trainee objectively, but offer modalities that are often missing in virtual reality simulation, such as realistic haptic feedback. The ProMIS augmented reality laparoscopic simulator retains the benefits of a traditional box trainer, by using original laparoscopic instruments and tactile tasks, but additionally generates objective measures of performance. Fifty-five participants performed a "basic skills" and a "suturing and knot-tying" task on ProMIS, after which they filled out a questionnaire regarding the realism, haptics, and didactic value of the simulator on a 5-point Likert scale. The participants were allocated to 2 experience groups: "experienced" (>50 procedures and >5 sutures; N = 27) and "moderately experienced" (<50 procedures and <5 sutures; N = 28). The general consensus among all participants, particularly the experienced, was that ProMIS is a useful tool for training (mean: 4.67, SD: 0.48). It was considered very realistic (mean: 4.44, SD: 0.66), with good haptics (mean: 4.10, SD: 0.97) and didactic value (mean: 4.10, SD: 0.65). This study established the face validity of the ProMIS augmented reality simulator for "basic skills" and "suturing and knot-tying" tasks. ProMIS was considered a good tool for training laparoscopic skills for surgical residents and surgeons.
The role of haptic cues from rough and slippery surfaces in human postural control
NASA Technical Reports Server (NTRS)
Jeka, J. J.; Lackner, J. R.
1995-01-01
Haptic information is critically important in complex sensory-motor tasks such as manipulating objects. Its comparable importance in spatial orientation is only beginning to be recognized. We have shown that postural sway in humans is significantly reduced by lightly touching a stable surface with a fingertip at contact force levels far below those physically necessary to stabilize the body. To investigate further the functional relationship between contact forces at the hand and postural equilibrium, we had subjects stand in the tandem Romberg stance while being allowed physically supportive (force contact) and non-physically supportive (touch contact) amounts of index fingertip force on surfaces with different frictional characteristics. Mean sway amplitude (MSA) was reduced by over 50% with both touch and force contact of the fingertip, compared to standing without fingertip contact. No differences in MSA were observed when touching rough or slippery surfaces. The amplitude of EMG activity in the peroneal muscles and the timing relationships between fingertip forces, body sway and EMG activity suggested that with touch contact of the finger or with force contact on a slippery surface long-loop "reflexes" involving postural muscles were stabilizing sway. With force contact of the fingertip on a rough surface, MSA reduction was achieved primarily through physical support of the body. This pattern of results indicates that light touch contact cues from the fingertip in conjunction with proprioceptive signals about arm configuration are providing information about body sway that can be used to reduce MSA through postural muscle activation.
A novel method of fabricating laminated silicone stack actuators with pre-strained dielectric layers
NASA Astrophysics Data System (ADS)
Hinitt, Andrew D.; Conn, Andrew T.
2014-03-01
In recent studies, stack-based Dielectric Elastomer Actuators (DEAs) have been successfully used in haptic feedback and sensing applications. However, limitations in the fabrication method and materials used to construct stack actuators constrain their force and displacement output per unit volume. This paper focuses on a fabrication process enabling a stacked elastomer actuator to withstand the high tensile forces needed for high power applications, such as mimetics for mammalian muscle contraction (i.e. prostheses), whilst requiring low voltage for thickness-mode contractile actuation. Spun elastomer layers are bonded together in a pre-strained state using a conductive adhesive filler, forming a Laminated Inter-Penetrating Network (L-IPN) with repeatable and uniform electrode thickness. The resulting structure utilises the stored strain energy of the dielectric elastomer to compress the cured electrode composite material. The method is used to fabricate an L-IPN example, which demonstrated that the bonded L-IPN has high tensile strength normal to the lamination. Additionally, the uniformity and retained dielectric layer pre-strain of the L-IPN are confirmed. The described method is envisaged to be used in a semi-automated assembly of large-scale multi-layer stacks of pre-strained dielectric layers possessing a tensile strength in the range generated by mammalian muscle.
Force sharing and other collaborative strategies in a dyadic force perception task
Tatti, Fabio
2018-01-01
When several persons perform a physical task jointly, such as transporting an object together, the interaction force that each person experiences is the sum of the forces applied by all other persons on the same object. Therefore, there is a fundamental ambiguity about the origin of the force that each person experiences. This study investigated the ability of a dyad (two persons) to identify the direction of a small force produced by a haptic device and applied to a jointly held object. In this particular task, the dyad might split the force produced by the haptic device (the external force) in an infinite number of ways, depending on how the two partners interacted physically. A major objective of this study was to understand how the two partners coordinated their action to perceive the direction of the third force that was applied to the jointly held object. This study included a condition where each participant responded independently and another one where the two participants had to agree upon a single negotiated response. The results showed a broad range of behaviors. In general, the external force was not split in a way that would maximize the joint performance. In fact, the external force was often split very unequally, leaving one person without information about the external force. However, the performance was better than expected in this case, which led to the discovery of an unanticipated strategy whereby the person who took all the force transmitted this information to the partner by moving the jointly held object. When the dyad could negotiate the response, we found that the participant with less force information tended to switch his or her response more often. PMID:29474433
Hadavand, Mostafa; Mirbagheri, Alireza; Behzadipour, Saeed; Farahmand, Farzam
2014-06-01
An effective master robot for haptic tele-surgery applications needs to provide a solution for the inverted movements of the surgical tool, in addition to sufficient workspace and manipulability, with minimal moving inertia. A novel 4 + 1-DOF mechanism was proposed, based on a triple parallelogram linkage, which provided a Remote Center of Motion (RCM) at the back of the user's hand. The kinematics of the robot was analyzed and a prototype was fabricated and evaluated by experimental tests. With the RCM at the back of the user's hand and the actuators far from the end effector, the robot could produce the sensation of hand-inside surgery with minimal moving inertia. The target workspace was achieved with acceptable manipulability. The trajectory tracking experiments revealed small errors due to backlash at the joints. The proposed mechanism meets the basic requirements of an effective master robot for haptic tele-surgery applications. Copyright © 2013 John Wiley & Sons, Ltd.
a New ER Fluid Based Haptic Actuator System for Virtual Reality
NASA Astrophysics Data System (ADS)
Böse, H.; Baumann, M.; Monkman, G. J.; Egersdörfer, S.; Tunayar, A.; Freimuth, H.; Ermert, H.; Khaled, W.
The concept and some steps in the development of a new actuator system which enables the haptic perception of mechanically inhomogeneous virtual objects are introduced. The system consists of a two-dimensional planar array of actuator elements containing an electrorheological (ER) fluid. When a user presses his fingers onto the surface of the actuator array, he perceives locally variable resistance forces generated by vertical pistons which slide in the ER fluid through the gaps between electrode pairs. The voltage in each actuator element can be individually controlled by a novel sophisticated switching technology based on optoelectric gallium arsenide elements. The haptic information which is represented at the actuator array can be transferred from a corresponding sensor system based on ultrasonic elastography. The combined sensor-actuator system may serve as a technology platform for various applications in virtual reality, like telemedicine where the information on the consistency of tissue of a real patient is detected by the sensor part and recorded by the actuator part at a remote location.
Vigaru, Bogdan; Sulzer, James; Gassert, Roger
2016-01-01
Our hands and fingers are involved in almost all activities of daily living and, as such, have a disproportionately large neural representation. Functional magnetic resonance imaging (fMRI) investigations into the neural control of the hand have yielded great advances, but the harsh MRI environment has proven to be a challenge for devices capable of delivering the large variety of stimuli necessary for well-controlled studies. This paper presents an fMRI-compatible haptic interface to investigate the neural mechanisms underlying precision grasp control. The interface, located at the scanner bore, is controlled remotely through a shielded electromagnetic actuation system positioned at the end of the scanner bed and a high-stiffness, low-inertia cable transmission. We present the system design, taking into account requirements defined by the biomechanics and dynamics of the human hand, as well as the fMRI environment. Performance evaluation revealed a structural stiffness of 3.3 N/mm, renderable forces up to 94 N, and a position control bandwidth of at least 19 Hz. MRI-compatibility tests showed no degradation in the operation of the haptic interface or the image quality. A preliminary fMRI experiment during a pilot study validated the usability of the haptic interface, illustrating the possibilities offered by this device. PMID:26441454
Looking and touching: What extant approaches reveal about the structure of early word knowledge
Hendrickson, Kristi; Mitsven, Samantha; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret
2014-01-01
The goal of the current study is to assess the temporal dynamics of vision and action to evaluate the underlying word representations that guide infants’ responses. Sixteen-month-old infants participated in a two-alternative forced-choice word-picture matching task. We conducted a moment-by-moment analysis of looking and reaching behaviors as they occurred in tandem to assess the speed with which a prompted word was processed (visual reaction time) as a function of the type of haptic response: Target, Distractor, or No Touch. Visual reaction times (visual RTs) were significantly slower during No Touches compared to Distractor and Target Touches, which were statistically indistinguishable. The finding that visual RTs were significantly faster during Distractor Touches compared to No Touches suggests that incorrect and absent haptic responses appear to index distinct knowledge states: incorrect responses are associated with partial knowledge whereas absent responses appear to reflect a true failure to map lexical items to their target referents. Further, we found that those children who were faster at processing words were also those children who exhibited better haptic performance. This research provides a methodological clarification on knowledge measured by the visual and haptic modalities and new evidence for a continuum of word knowledge in the second year of life. PMID:25444711
Turolla, Andrea; Daud Albasini, Omar A; Oboe, Roberto; Agostini, Michela; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Venneri, Annalena; Piron, Lamberto
2013-01-01
Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented in a real-time MATLAB/Simulink environment, controlling a five-bar linkage able to provide forces up to 20 N at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole pegboard test) and kinematic (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, fMRI results suggest that the proposed approach may promote better cortical activation in the brain.
Atashzar, Seyed Farokh; Shahbazi, Mahya; Ward, Christopher; Samotus, Olivia; Delrobaei, Mehdi; Rahimi, Fariborz; Lee, Jack; Jackman, Mallory; Jog, Mandar S; Patel, Rajni V
2016-01-01
Abnormality of sensorimotor integration in the basal ganglia and cortex has been reported in the literature for patients with task-specific focal hand dystonia (FHD). In this study, we investigate the effect of manipulating kinesthetic input in people living with writer's cramp disorder (a major form of FHD). For this purpose, the severity of dystonia was studied for 11 participants, while the symptoms of seven participants were tracked during five sessions of assessment and botulinum toxin (BoNT-A) injection therapy (one of the currently suggested therapies for dystonia). BoNT-A therapy was delivered in the first and third sessions. The goal was to analyze the effect of haptic manipulation as a potential assistive technique during BoNT-A therapy. The trial included writing, hovering, and spiral/sinusoidal drawing subtasks. In each session, the subtasks were repeated twice: (a) when a participant used a normal pen, and (b) when the participant used a robotics-assisted system (supporting the pen) that provides a compliant virtual writing surface and manipulates the kinesthetic sensory input. The results (one-sample t-tests) show that reducing the writing surface rigidity significantly decreases the severity of dystonia and results in better control of grip pressure (an indicator of dystonic cramping). It is also shown (paired-samples t-test) that the proposed haptic manipulation strategy can augment the effectiveness of BoNT-A therapy. The outcome of this study was then used in the design of an actuated pen as a writing-assistance tool that can provide compliant haptic interaction during writing for FHD patients.
Currie, Maria E; Trejos, Ana Luisa; Rayman, Reiza; Chu, Michael W A; Patel, Rajni; Peters, Terry; Kiaii, Bob B
2013-01-01
The purpose of this study was to determine the effect of three-dimensional (3D) binocular, stereoscopic, and two-dimensional (2D) monocular visualization on robotics-assisted mitral valve annuloplasty versus conventional techniques in an ex vivo animal model. In addition, we sought to determine whether these effects were consistent between novices and experts in robotics-assisted cardiac surgery. A cardiac surgery test-bed was constructed to measure forces applied during mitral valve annuloplasty. Sutures were passed through the porcine mitral valve annulus by the participants with different levels of experience in robotics-assisted surgery and tied in place using both robotics-assisted and conventional surgery techniques. The mean time for both the experts and the novices using 3D visualization was significantly less than that required using 2D vision (P < 0.001). However, there was no significant difference in the maximum force applied by the novices to the mitral valve during suturing (P = 0.7) and suture tying (P = 0.6) using either 2D or 3D visualization. The mean time required and forces applied by both the experts and the novices were significantly less using the conventional surgical technique than when using the robotic system with either 2D or 3D vision (P < 0.001). Despite high-quality binocular images, both the experts and the novices applied significantly more force to the cardiac tissue during 3D robotics-assisted mitral valve annuloplasty than during conventional open mitral valve annuloplasty. This finding suggests that 3D visualization does not fully compensate for the absence of haptic feedback in robotics-assisted cardiac surgery.
Look and Feel: Haptic Interaction for Biomedicine
1995-10-01
Simulation of the DSM state is accomplished by a multi-step algorithm that is evaluated within the topology of the model. During each time step, forces are summed for each mobile atom based on external forces ... volumetric properties; (b) conserving computation power by rendering media local to the interaction point; and (c) evaluating the simulation within ... alteration of the model topology.
Force and torque modelling of drilling simulation for orthopaedic surgery.
MacAvelia, Troy; Ghasempoor, Ahmad; Janabi-Sharifi, Farrokh
2014-01-01
The advent of haptic simulation systems for orthopaedic surgery procedures has provided surgeons with an excellent tool for training and preoperative planning purposes. This is especially true for procedures involving the drilling of bone, which require a great amount of adroitness and experience due to difficulties arising from vibration and drill bit breakage. One of the potential difficulties with the drilling of bone is the lack of consistent material evacuation from the drill's flutes as the material tends to clog. This clogging leads to significant increases in force and torque experienced by the surgeon. Clogging was observed for feed rates greater than 0.5 mm/s and spindle speeds less than 2500 rpm. The drilling simulation systems that have been created to date do not address the issue of drill flute clogging. This paper presents force and torque prediction models that account for this phenomenon. The two coefficients of friction required by these models were determined via a set of calibration experiments. The accuracy of both models was evaluated by an additional set of validation experiments resulting in average R² regression correlation values of 0.9546 and 0.9209 for the force and torque prediction models, respectively. The resulting models can be adopted by haptic simulation systems to provide a more realistic tactile output.
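For illustration only, a generic power-law thrust/torque estimate with a simple clogging multiplier is sketched below; the coefficients and functional form are hypothetical placeholders, not the calibrated models reported in the paper.

```python
def drilling_force_torque(feed_rate_mm_s, spindle_rpm, diameter_mm=3.2,
                          k_f=80.0, k_m=0.02, mu_clog=0.6, depth_mm=0.0):
    """Generic thrust force (N) and torque (N*m) estimate for bone drilling.

    Illustrative power-law model with a clogging multiplier that grows with
    drilled depth; all coefficients are hypothetical.
    """
    feed_per_rev = feed_rate_mm_s / (spindle_rpm / 60.0)   # mm per revolution
    thrust = k_f * diameter_mm * feed_per_rev ** 0.8
    torque = k_m * diameter_mm ** 2 * feed_per_rev ** 0.8
    clog = 1.0 + mu_clog * max(depth_mm, 0.0) / diameter_mm
    return thrust * clog, torque * clog

# Example: a feed rate and spindle speed in the clogging-prone regime
print(drilling_force_torque(feed_rate_mm_s=0.8, spindle_rpm=2000, depth_mm=6.0))
```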
Hing, James T; Brooks, Ari D; Desai, Jaydev P
2007-02-01
A methodology for modeling the needle and soft-tissue interaction during needle insertion is presented. The approach consists of the measurement of needle and tissue motion using a dual C-arm fluoroscopy system. Our dual C-arm fluoroscopy setup allows real-time 3-D extraction of the displacement of implanted fiducials in the soft tissue during needle insertion to obtain the necessary parameters for accurate modeling of needle and soft-tissue interactions. The needle and implanted markers in the tissue are tracked during the insertion and withdrawal of the needle at speeds of 1.016 mm/s, 12.7 mm/s and 25.4 mm/s. Both image and force data are utilized to determine important parameters such as the approximate cutting force, puncture force, the local effective modulus (LEM) during puncture, and the relaxation of tissue. We have also validated the LEM computed from our finite element model with arbitrary needle puncture tasks. Based on these measurements, we developed a model for needle insertion and withdrawal that can be used to generate a 1-DOF force versus position profile that can be experienced by a user operating a haptic device. This profile was implemented on a 7-DOF haptic device designed in our laboratory.
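A hedged sketch of the kind of 1-DOF force-versus-position profile such a model can feed to a haptic device is shown below; the piecewise structure (spring-like pre-puncture phase, puncture event, cutting plus friction) follows the usual decomposition, and all coefficients are hypothetical rather than those identified in the paper.

```python
def needle_force(depth_mm, punctured):
    """1-DOF needle force (N) versus insertion depth for haptic rendering.

    Illustrative piecewise profile with hypothetical coefficients. Returns
    (force, punctured) so the caller can track the puncture event across calls.
    """
    k_pre = 0.35        # N/mm, effective pre-puncture stiffness (LEM-like)
    f_puncture = 2.5    # N, peak force at which the capsule ruptures
    f_cut = 0.8         # N, roughly constant cutting force
    mu_fric = 0.05      # N/mm, friction along the inserted shaft

    if not punctured:
        force = k_pre * depth_mm
        if force >= f_puncture:
            punctured = True
            force = f_cut + mu_fric * depth_mm   # force drops after puncture
    else:
        force = f_cut + mu_fric * depth_mm
    return force, punctured

# Example: sweep the needle inward in 0.5 mm steps and print the rendered force
state = False
for step in range(31):
    f, state = needle_force(step * 0.5, state)
    print(round(step * 0.5, 1), "mm ->", round(f, 2), "N")
```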
Defense applications of the CAVE (CAVE automatic virtual environment)
NASA Astrophysics Data System (ADS)
Isabelle, Scott K.; Gilkey, Robert H.; Kenyon, Robert V.; Valentino, George; Flach, John M.; Spenny, Curtis H.; Anderson, Timothy R.
1997-07-01
The CAVE is a multi-person, room-sized, high-resolution, 3D video and auditory environment, which can be used to present very immersive virtual environment experiences. This paper describes the CAVE technology and the capability of the CAVE system as originally developed at the Electronics Visualization Laboratory of the University of Illinois-Chicago and as more recently implemented by Wright State University (WSU) in the Armstrong Laboratory at Wright-Patterson Air Force Base (WPAFB). One planned use of the WSU/WPAFB CAVE is research addressing the appropriate design of display and control interfaces for controlling uninhabited aerial vehicles. The WSU/WPAFB CAVE has a number of features that make it well-suited to this work: (1) 360 degrees surround, plus floor, high resolution visual displays, (2) virtual spatialized audio, (3) the ability to integrate real and virtual objects, and (4) rapid and flexible reconfiguration. However, even though the CAVE is likely to have broad utility for military applications, it does have certain limitations that may make it less well-suited to applications that require 'natural' haptic feedback, vestibular stimulation, or an ability to interact with close detailed objects.
Kinematic/Dynamic Characteristics for Visual and Kinesthetic Virtual Environments
NASA Technical Reports Server (NTRS)
Bortolussi, Michael R. (Compiler); Adelstein, B. D.; Gold, Miriam
1996-01-01
Work was carried out on two topics of principal importance to current progress in virtual environment research at NASA Ames and elsewhere. The first topic was directed at maximizing the temporal dynamic response of visually presented Virtual Environments (VEs) through reorganization and optimization of system hardware and software. The final result of this portion of the work was a VE system in the Advanced Display and Spatial Perception Laboratory at NASA Ames capable of updating at 60 Hz (the maximum hardware refresh rate) with latencies approaching 30 msec. In the course of achieving this system performance, specialized hardware and software tools for the measurement of VE latency, as well as analytic models correlating update rate and latency for different system configurations, were developed. The second area of activity was the preliminary development and analysis of a novel kinematic architecture for three Degree Of Freedom (DOF) haptic interfaces--devices that provide force feedback for manipulative interaction with virtual and remote environments. An invention disclosure was filed on this work and a patent application is being pursued by NASA Ames. Activities in these two areas are expanded upon below.
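As a back-of-the-envelope illustration of how update rate and latency interact (this is not the analytic model developed in the report), one can sum assumed component delays and the frame-timing terms:

```python
def expected_latency_ms(refresh_hz, pipeline_frames=1,
                        tracker_ms=2.0, app_ms=5.0):
    """Rough end-to-end visual latency estimate for a VE system (ms).

    Assumed decomposition: tracker delay + application/render time + frames
    buffered in the graphics pipeline + half a frame of average scan-out
    delay. The component values are hypothetical.
    """
    frame_ms = 1000.0 / refresh_hz
    return tracker_ms + app_ms + (pipeline_frames + 0.5) * frame_ms

# Example: at 60 Hz the assumed delays land near 30 ms, and the frame-timing
# terms shrink as the update rate rises.
for hz in (30.0, 60.0, 90.0):
    print(hz, round(expected_latency_ms(hz), 1))
```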
Bouquet de Joliniere, Jean; Librino, Armando; Dubuisson, Jean-Bernard; Khomsi, Fathi; Ben Ali, Nordine; Fadhlaoui, Anis; Ayoubi, J. M.; Feki, Anis
2016-01-01
Minimally invasive surgery (MIS) can be considered the greatest surgical innovation of the past 30 years. It revolutionized surgical practice with well-proven advantages over traditional open surgery: reduced surgical trauma and incision-related complications, such as surgical-site infections, postoperative pain and hernia, reduced hospital stay, and improved cosmetic outcome. Nonetheless, proficiency in MIS can be technically challenging, as conventional laparoscopy is associated with several limitations, such as the two-dimensional (2D) monitor's reduction in depth perception, camera instability, limited range of motion, and steep learning curves. The surgeon has low force feedback, which allows simple gestures, respect for tissues, and more effective treatment of complications. Since the 1980s, several computer science and robotics projects have been set up to overcome the difficulties encountered with conventional laparoscopy, to augment the surgeon's skills, achieve accuracy and high precision during complex surgery, and facilitate the widespread adoption of MIS. Surgical instruments are guided by haptic interfaces that replicate and filter hand movements. Robotically assisted technology offers advantages that include improved three-dimensional stereoscopic vision, wristed instruments that improve dexterity, and tremor-canceling software that improves surgical precision. PMID:27200358
Validation of the PASSPORT V2 training environment for arthroscopic skills.
Stunt, J J; Kerkhoffs, G M M J; Horeman, T; van Dijk, C N; Tuijthof, G J M
2016-06-01
Virtual reality simulators used in the education of orthopaedic residents often lack realistic haptic feedback. To solve this, the PASSPORT (Practice Arthroscopic Surgical Skills for Perfect Operative Real-life Treatment) simulator was developed and subjected to fundamental changes: improved realism and an improved user interface. The purpose was to demonstrate its face and construct validity. Thirty-one participants were divided into three groups with different levels of arthroscopic experience. Participants answered questions regarding general information and the outer appearance of the simulator for face validity. Construct validity was assessed with one standardized navigation task, which was timed. Face validity, educational value and user-friendliness were determined with two representative exercises and by asking participants to fill out the questionnaire. A value of 7 or greater was considered sufficient. Construct validity was demonstrated between experts and novices. Median task time for the fifth trial was 55 s (range 17-139 s) for the novices, 33 s (range 17-59 s) for the intermediates, and 26 s (range 14-52 s) for the experts. Median task times of three trials were not significantly different between the novices and intermediates, and none of the trials differed significantly between intermediates and experts. Face validity, educational value and user-friendliness were perceived as sufficient (median >7). The presence of realistic tactile feedback was considered the biggest asset of the simulator. Proper preparation for arthroscopic operations will increase the quality of real-life surgery and patients' safety. The PASSPORT simulator can assist in achieving this, as it showed construct and face validity, and its physical nature offered adequate haptic feedback during training. This indicates that PASSPORT has the potential to evolve into a valuable training modality.
Neodymium:YAG laser cutting of intraocular lens haptics in vitro and in vivo.
Feder, J M; Rosenberg, M A; Farber, M D
1989-09-01
Various complications following intraocular lens (IOL) surgery result in explantation of the lenses. Haptic fibrosis may necessitate cutting the IOL haptics prior to removal. In this study we used the neodymium:YAG (Nd:YAG) laser to cut polypropylene and poly(methyl methacrylate) (PMMA) haptics in vitro and in rabbit eyes. In vitro we were able to cut 100% of both haptic types successfully (28 PMMA and 30 polypropylene haptics). In rabbit eyes we were able to cut 50% of the PMMA haptics and 43% of the polypropylene haptics. Poly(methyl methacrylate) haptics were easier to cut in vitro and in vivo than polypropylene haptics, requiring fewer shots for transection. Complications of Nd:YAG laser use frequently interfered with haptic transections in rabbit eyes. Haptic transection may be more easily accomplished in human eyes.
Looking and touching: what extant approaches reveal about the structure of early word knowledge.
Hendrickson, Kristi; Mitsven, Samantha; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret
2015-09-01
The goal of the current study is to assess the temporal dynamics of vision and action to evaluate the underlying word representations that guide infants' responses. Sixteen-month-old infants participated in a two-alternative forced-choice word-picture matching task. We conducted a moment-by-moment analysis of looking and reaching behaviors as they occurred in tandem to assess the speed with which a prompted word was processed (visual reaction time) as a function of the type of haptic response: Target, Distractor, or No Touch. Visual reaction times (visual RTs) were significantly slower during No Touches compared to Distractor and Target Touches, which were statistically indistinguishable. The finding that visual RTs were significantly faster during Distractor Touches compared to No Touches suggests that incorrect and absent haptic responses appear to index distinct knowledge states: incorrect responses are associated with partial knowledge whereas absent responses appear to reflect a true failure to map lexical items to their target referents. Further, we found that those children who were faster at processing words were also those children who exhibited better haptic performance. This research provides a methodological clarification on knowledge measured by the visual and haptic modalities and new evidence for a continuum of word knowledge in the second year of life. © 2014 The Authors Developmental Science Published by John Wiley & Sons Ltd.
Yovanoff, Mary; Pepley, David; Mirkin, Katelin; Moore, Jason; Han, David; Miller, Scarlett
2017-01-01
While Virtual Reality (VR) has emerged as a viable method for training new medical residents, it has not yet reached all areas of training. One area lacking such development is surgical residency programs, where there are large learning curves associated with skill development. In order to address this gap, a Dynamic Haptic Robotic Trainer (DHRT) was developed to help train surgical residents in the placement of ultrasound-guided internal jugular central venous catheters and to incorporate personalized learning. In order to accomplish this, a 2-part study was conducted to: (1) systematically analyze the feedback given to 18 third-year medical students by trained professionals to identify the items necessary for a personalized learning system and (2) develop and experimentally test the usability of the personalized learning interface within the DHRT system. The results can be used to inform the design of VR and personalized learning systems within the medical community. PMID:29123361
Development of Remote-Type Haptic Catheter Sensor System using Piezoelectric Transducer
NASA Astrophysics Data System (ADS)
Haruta, Mineyuki; Murayama, Yoshinobu; Omata, Sadao
This study describes the development of a Remote-Type Haptic Catheter Sensor System that enables the mechanical property evaluation of a blood vessel. The system consists of a feedback circuit and a piezoelectric ultrasound transducer, and is operated based on a phase-shift method so that the entire system oscillates at its inherent resonance frequency. Ultrasound reflected by the blood vessel produces a phase shift in the resonance system that depends on the acoustic impedance of the reflector. The phase shift is then measured as a change in the resonance frequency of the system; therefore, the detection resolution is highly improved. The correlation between the acoustic impedance and the resonance frequency change of the sensor system was demonstrated using silicone rubbers, metals and actual blood vessels from a pig. The performance of the sensor was also examined using a vessel-shaped phantom model. Finally, the discussion surveys the possibility of applying the novel sensor system to intravascular diagnosis.
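To illustrate the sensing principle, the sketch below computes the pressure reflection coefficient at the transducer/vessel interface and maps it to a resonance-frequency change through a purely hypothetical linear sensitivity; the real relation would come from calibrating the phase-shift circuit of the actual sensor.

```python
def reflection_coefficient(z_medium, z_target):
    """Pressure reflection coefficient at a planar acoustic interface."""
    return (z_target - z_medium) / (z_target + z_medium)

def resonance_shift_hz(z_target, z_medium=1.5e6, sensitivity_hz=250.0):
    """Hypothetical mapping from target acoustic impedance (Rayl, i.e.
    Pa*s/m) to the resonance-frequency change of the feedback loop.

    The linear 'sensitivity_hz' gain is a placeholder, not a calibrated value.
    """
    return sensitivity_hz * reflection_coefficient(z_medium, z_target)

# Example: a soft vessel-wall-like reflector versus a much stiffer,
# higher-impedance (e.g., calcified) region gives a larger shift.
print(resonance_shift_hz(1.7e6))
print(resonance_shift_hz(7.0e6))
```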
Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.
Kling-Petersen, T; Rydmark, M
2000-01-01
The Computer Laboratory of the medical faculty in Goteborg (Mednet) has, since the end of 1998, been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at Siggraph in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scrape, etc. via a SensAble Technologies PHANToM haptic arm. The model deforms in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is proportional to the force applied. By getting instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bump maps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm's tools enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves, etc. By using FreeForm's different tools, these defects are easily corrected and gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections, etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply any given amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: the modification and creation of non-geometric 3D objects.
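FreeForm's own smoothing algorithm is proprietary; as an illustration of the general idea of region-selective smoothing, a minimal Laplacian-style pass over a selected set of vertices is sketched below (the vertex array and neighbor map are hypothetical data structures).

```python
import numpy as np

def laplacian_smooth(vertices, neighbors, selected, strength=0.5, iterations=10):
    """Move each selected vertex toward the average of its neighbors.

    Illustrative only; `neighbors` maps a vertex index to a list of adjacent
    vertex indices, and `selected` is the user-chosen region to smooth.
    """
    v = np.array(vertices, dtype=float)
    for _ in range(iterations):
        updated = v.copy()
        for i in selected:
            centroid = v[neighbors[i]].mean(axis=0)
            updated[i] = (1.0 - strength) * v[i] + strength * centroid
        v = updated
    return v

# Example: smooth the jagged interior of a short polyline ("accordion" ridges)
verts = [[0, 0, 0], [1, 1, 0], [2, -1, 0], [3, 1, 0], [4, 0, 0]]
nbrs = {1: [0, 2], 2: [1, 3], 3: [2, 4]}
print(laplacian_smooth(verts, nbrs, selected=[1, 2, 3]))
```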
Development master arm of 2-DOF planar parallel manipulator for In-Vitro Fertilization
NASA Astrophysics Data System (ADS)
Thamrongaphichartkul, Kitti; Vongbunyong, Supachai; Nuntakarn, Lalana
2018-01-01
A micromanipulator is a mechanical device used for manipulating miniature objects on the order of microns. It is widely used in In-Vitro Fertilization (IVF), in which sperm are held in a micro-needle and penetrate an oocyte for fertilization. IVF needs to be performed by highly skilled embryologists who can control the movement of the needle accurately despite the lack of tactile perception. A haptic device can transmit and simulate position, velocity and force in order to enhance the interaction between the user and the system. However, commercially available haptic devices have unnecessary degrees of freedom and a limited workspace, which are inappropriate for the IVF process. This paper focuses on the development of a haptic device for use in the IVF process. It will be used as the master arm of a master-slave system for the IVF process in order to enhance the ability of users to control the micromanipulator. As a result, the embryologist is able to carry out the IVF process more effectively while having tactile perception.
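For a symmetric 2-DOF planar parallel (five-bar) master arm of the kind described, the forward kinematics reduce to a circle-circle intersection. The sketch below uses hypothetical link lengths and base separation and returns one of the two assembly modes; it is illustrative, not the kinematics of the device in the paper.

```python
import numpy as np

def five_bar_fk(theta1, theta2, l1=0.10, l2=0.14, base=0.08):
    """Forward kinematics of a symmetric 2-DOF planar five-bar linkage.

    theta1/theta2 are the actuated base-joint angles (rad); link lengths and
    base separation are hypothetical values in metres. Returns the 'upper'
    intersection of the two distal-link circles.
    """
    a = np.array([-base / 2.0, 0.0]) + l1 * np.array([np.cos(theta1), np.sin(theta1)])
    b = np.array([ base / 2.0, 0.0]) + l1 * np.array([np.cos(theta2), np.sin(theta2)])
    d = np.linalg.norm(b - a)
    if d == 0.0 or d > 2 * l2:
        raise ValueError("pose not reachable for these link lengths")
    h = np.sqrt(l2**2 - (d / 2.0)**2)
    mid = (a + b) / 2.0
    perp = np.array([-(b - a)[1], (b - a)[0]]) / d   # unit normal to the line A->B
    return mid + h * perp

# Example: a symmetric pose with the proximal links at 120 and 60 degrees
print(five_bar_fk(np.deg2rad(120), np.deg2rad(60)))
```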
Performance evaluation of haptic hand-controllers in a robot-assisted surgical system.
Zareinia, Kourosh; Maddahi, Yaser; Ng, Canaan; Sepehri, Nariman; Sutherland, Garnette R
2015-12-01
This paper presents an experimental evaluation of three commercially available haptic hand-controllers to determine which was most suitable for the participants. Two surgeons and seven engineers performed two peg-in-hole tasks with different levels of difficulty. Each operator guided the end-effector of a Kuka manipulator that held surgical forceps and was equipped with a surgical microscope. Sigma 7, HD(2) and PHANToM Premium 3.0 hand-controllers were compared. Ten measures were adopted to evaluate operators' performance with respect to effort, speed and accuracy in completing a task, operator improvement during the tests, and the force applied by each haptic device. The best performance was observed with the Premium 3.0; its hand-piece could be held in a way similar to that used by surgeons to hold conventional tools. Hand-controllers with a linkage structure similar to the human upper extremity take advantage of the inherent human brain connectome, resulting in improved surgeon performance during robotic-assisted surgery. Copyright © 2015 John Wiley & Sons, Ltd.
Wave impedance selection for passivity-based bilateral teleoperation
NASA Astrophysics Data System (ADS)
D'Amore, Nicholas John
When a task must be executed in a remote or dangerous environment, teleoperation systems may be employed to extend the influence of the human operator. In the case of manipulation tasks, haptic feedback of the forces experienced by the remote (slave) system is often highly useful in improving an operator's ability to perform effectively. In many of these cases (especially teleoperation over the internet and ground-to-space teleoperation), substantial communication latency exists in the control loop and has the strong tendency to cause instability of the system. The first viable solution to this problem in the literature was based on a scattering/wave transformation from transmission line theory. This wave transformation requires the designer to select a wave impedance parameter appropriate to the teleoperation system. It is widely recognized that a small value of wave impedance is well suited to free motion and a large value is preferable for contact tasks. Beyond this basic observation, however, very little guidance exists in the literature regarding the selection of an appropriate value. Moreover, prior research on impedance selection generally fails to account for the fact that in any realistic contact task there will simultaneously exist contact considerations (perpendicular to the surface of contact) and quasi-free-motion considerations (parallel to the surface of contact). The primary contribution of the present work is to introduce an approximate linearized optimum for the choice of wave impedance and to apply this quasi-optimal choice to the Cartesian reality of such a contact task, in which it cannot be expected that a given joint will be either perfectly normal to or perfectly parallel to the motion constraint. The proposed scheme selects a wave impedance matrix that is appropriate to the conditions encountered by the manipulator. This choice may be implemented as a static wave impedance value or as a time-varying choice updated according to the instantaneous conditions encountered. A Lyapunov-like analysis is presented demonstrating that time variation in wave impedance will not violate the passivity of the system. Experimental trials, both in simulation and on a haptic feedback device, are presented validating the technique. Consideration is also given to the case of an uncertain environment, in which an a priori impedance choice may not be possible.
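The scattering/wave transformation referred to above has a standard form in the bilateral teleoperation literature; the sketch below encodes velocity and force into wave variables for a few candidate impedances and verifies the power identity that underlies the passivity argument (the numerical values are illustrative).

```python
import numpy as np

def encode_waves(velocity, force, b):
    """Wave-variable transformation with wave impedance b > 0.

    u is the wave travelling toward the remote side, w the returning wave.
    """
    u = (b * velocity + force) / np.sqrt(2.0 * b)
    w = (b * velocity - force) / np.sqrt(2.0 * b)
    return u, w

# Power flowing into the channel equals (u^2 - w^2)/2 = force * velocity,
# independent of the chosen impedance, which is the identity behind the
# passivity argument for delayed transmission of wave variables.
v, f = 0.05, 3.0                      # m/s, N (illustrative operating point)
for b in (10.0, 100.0, 1000.0):       # candidate wave impedances
    u, w = encode_waves(v, f, b)
    print(b, 0.5 * (u**2 - w**2), v * f)
```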
2014-01-01
Background. Cycling has been used in the rehabilitation of individuals with both chronic and post-surgical conditions. Among the challenges with implementing bicycling for rehabilitation is the recruitment of both extremities, in particular when one is weaker or less coordinated. Feedback embedded in virtual reality (VR) augmented cycling may serve to address the requirements for efficacious cycling, specifically recruitment of both extremities and exercising at a high intensity. Methods. In this paper a mechatronic rehabilitation bicycling system with an interactive virtual environment, called the Virtual Reality Augmented Cycling Kit (VRACK), is presented. Novel hardware components embedded with sensors were implemented on a stationary exercise bicycle to monitor physiological and biomechanical parameters of participants while immersing them in an augmented reality simulation providing the user with visual, auditory and haptic feedback. This modular and adaptable system attaches to commercially available stationary bicycle systems and interfaces with a personal computer for simulation and data acquisition. The complete bicycle system includes: a) handlebars based on hydraulic pressure sensors; b) pedals that monitor pedal kinematics with an inertial measurement unit (IMU) and forces on the pedals while providing vibratory feedback; c) off-the-shelf electronics to monitor heart rate; and d) customized software for rehabilitation. Bench testing of the handle and pedal systems is presented for calibration of the sensors detecting force and angle. Results. The modular mechatronic kit for exercise bicycles was evaluated in bench tests and human tests. Bench tests performed on the sensorized handlebars and the instrumented pedals validated the measurement accuracy of these components. Rider tests with the VRACK system focused on the pedal system and successfully monitored kinetic and kinematic parameters of the rider's lower extremities. Conclusions. The VRACK system, a modular virtual reality mechatronic bicycle rehabilitation system, was designed to convert most bicycles into virtual reality (VR) cycles. Preliminary testing of the augmented reality bicycle system successfully demonstrated that a modular mechatronic kit can monitor and record kinetic and kinematic parameters of several riders. PMID:24902780
Chamorro-Moriana, Gema; Moreno, Antonio José
2018-01-01
This systematic review synthesized and analyzed clinical findings related to the effectiveness of innovative technological feedback for tackling functional gait recovery. An electronic search of PUBMED, PEDro, WOS, CINAHL, and DIALNET was conducted from January 2011 to December 2016. The main inclusion criteria were: patients with modified or abnormal gait; application of technology-based feedback to deal with functional recovery of gait; any comparison between different kinds of feedback applied by means of technology, or any comparison between technological and non-technological feedback; and randomized controlled trials. Twenty papers were included. The populations were neurological patients (75%), orthopedic and healthy subjects. All participants were adults, bar one. Four studies used exoskeletons, 6 load platforms and 5 pressure sensors. The breakdown of the type of feedback used was as follows: 60% visual, 40% acoustic and 15% haptic. 55% used terminal feedback versus 65% simultaneous feedback. Prescriptive feedback was used in 60% of cases, while 50% used descriptive feedback. 62.5% and 58.33% of the trials showed a significant effect in improving step length and speed, respectively. Efficacy in improving other gait parameters such as balance or range of movement is observed in more than 75% of the studies with significant outcomes. Conclusion: Treatments based on feedback using innovative technology in patients with abnormal gait are mostly effective in improving gait parameters and therefore useful for the functional recovery of patients. The most frequently highlighted types of feedback were immediate visual feedback followed by terminal and immediate acoustic feedback. PMID:29316645
Chamorro-Moriana, Gema; Moreno, Antonio José; Sevillano, José Luis
2018-01-06
This systematic review synthesized and analyzed clinical findings related to the effectiveness of innovative technological feedback for tackling functional gait recovery. An electronic search of PUBMED, PEDro, WOS, CINAHL, and DIALNET was conducted from January 2011 to December 2016. The main inclusion criteria were: patients with modified or abnormal gait; application of technology-based feedback to deal with functional recovery of gait; any comparison between different kinds of feedback applied by means of technology, or any comparison between technological and non-technological feedback; and randomized controlled trials. Twenty papers were included. The populations were neurological patients (75%), orthopedic and healthy subjects. All participants were adults, bar one. Four studies used exoskeletons, 6 load platforms and 5 pressure sensors. The breakdown of the type of feedback used was as follows: 60% visual, 40% acoustic and 15% haptic. 55% used terminal feedback versus 65% simultaneous feedback. Prescriptive feedback was used in 60% of cases, while 50% used descriptive feedback. 62.5% and 58.33% of the trials showed a significant effect in improving step length and speed, respectively. Efficacy in improving other gait parameters such as balance or range of movement is observed in more than 75% of the studies with significant outcomes. Treatments based on feedback using innovative technology in patients with abnormal gait are mostly effective in improving gait parameters and therefore useful for the functional recovery of patients. The most frequently highlighted types of feedback were immediate visual feedback followed by terminal and immediate acoustic feedback.
Baillie, Sarah; Crossan, Andrew; Brewster, Stephen A; May, Stephen A; Mellor, Dominic J
2010-10-01
Simulators provide a potential solution to some of the challenges faced when teaching internal examinations to medical or veterinary students. A virtual reality simulator, the Haptic Cow, has been developed to teach bovine rectal palpation to veterinary students, and significant training benefits have been demonstrated. However, the training needs to be delivered by an instructor, a requirement that limits availability. This article describes the development and evaluation of an automated version that students could use on their own. An automated version was developed based on a recording of an expert's examination. The performance of two groups of eight students was compared. All students had undergone the traditional training in the course, namely lectures and laboratory practicals, and then group S used the simulator whereas group R had no additional training. The students were set the task of finding the uterus when examining cows. The simulator was then made available to students, and feedback about the "usability" was gathered with a questionnaire. The group whose training had been supplemented with a simulator session were significantly better at finding the uterus. The questionnaire feedback was positive and indicated that students found the simulator easy to use. The automated simulator equipped students with useful skills for examining cows. In addition, a simulator that does not need the presence of an instructor will increase the availability of training for students and be a more sustainable option for institutions.
Gibo, Tricia L; Bastian, Amy J; Okamura, Allison M
2014-03-01
When grasping and manipulating objects, people are able to efficiently modulate their grip force according to the experienced load force. Effective grip force control involves providing enough grip force to prevent the object from slipping, while avoiding excessive force to avoid damage and fatigue. During indirect object manipulation via teleoperation systems or in virtual environments, users often receive limited somatosensory feedback about objects with which they interact. This study examines the effects of force feedback, accuracy demands, and training on grip force control during object interaction in a virtual environment. The task required subjects to grasp and move a virtual object while tracking a target. When force feedback was not provided, subjects failed to couple grip and load force, a capability fundamental to direct object interaction. Subjects also exerted larger grip force without force feedback and when accuracy demands of the tracking task were high. In addition, the presence or absence of force feedback during training affected subsequent performance, even when the feedback condition was switched. Subjects' grip force control remained reminiscent of their employed grip during the initial training. These results motivate the use of force feedback during telemanipulation and highlight the effect of force feedback during training.
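A common rule of thumb from the grip-force literature is that grip force tracks the load force scaled by the inverse friction coefficient plus a safety margin; the sketch below uses illustrative values and is not the model analysed in the study.

```python
def required_grip_force(load_force, mu=0.7, safety_margin=1.0):
    """Minimum grip force (N) to prevent slip, plus a safety margin.

    Rule of thumb: grip must exceed |load| / mu, where mu is the friction
    coefficient at the fingertips; mu and the margin here are illustrative.
    """
    return abs(load_force) / mu + safety_margin

# Example: grip force rises in proportion to the experienced load
for load in (0.5, 1.0, 2.0, 4.0):     # N
    print(load, "N load ->", round(required_grip_force(load), 2), "N grip")
```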
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Bittner, Rachel M.; Anderson, Mark R.
2012-01-01
Auditory communication displays within the NextGen data link system may use multiple synthetic speech messages replacing traditional ATC and company communications. The design of an interface for selecting amongst multiple incoming messages can impact both performance (time to select, audit and release a message) and preference. Two design factors were evaluated: physical pressure-sensitive switches versus flat panel "virtual switches", and the presence or absence of auditory feedback from switch contact. Performance with stimuli using physical switches was 1.2 s faster than virtual switches (2.0 s vs. 3.2 s); auditory feedback provided a 0.54 s performance advantage (2.33 s vs. 2.87 s). There was no interaction between these variables. Preference data were highly correlated with performance.
Force-controlled automatic microassembly of tissue engineering scaffolds
NASA Astrophysics Data System (ADS)
Zhao, Guoyong; Teo, Chee Leong; Hutmacher, Dietmar Werner; Burdet, Etienne
2010-03-01
This paper presents an automated system for the 3D assembly of tissue engineering (TE) scaffolds made from biocompatible microscopic building blocks with relatively large fabrication errors. It focuses on the pin-into-hole force control developed for this demanding microassembly task. A beam-like gripper with integrated force sensing at a 3 mN resolution over a 500 mN measuring range is designed, and is used to implement an admittance force-controlled insertion using commercial precision stages. Vision-based alignment followed by insertion is complemented by a haptic exploration strategy using force and position information. The system demonstrates fully automated construction of TE scaffolds with 50 microparts whose dimensional error is larger than 5%.
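A minimal sketch of one control cycle of an admittance-type insertion of the kind described is given below; the gains, reference force and step limit are hypothetical, not the controller parameters used in the paper.

```python
def admittance_insertion_step(f_measured, z_current, f_ref=20e-3,
                              admittance=2e-3, dt=0.01, step_limit=2e-3):
    """One control cycle of an admittance-type pin-into-hole insertion.

    The stage is commanded to move at a rate proportional to the error
    between a reference contact force and the force measured by the gripper
    (units: N, m, s; all values are illustrative).
    """
    velocity = admittance * (f_ref - f_measured)                 # m/s
    dz = max(min(velocity * dt, step_limit), -step_limit)        # limited step
    return z_current + dz

# Example: the commanded position backs off when the sensed force overshoots
z = 0.0
for f in (0.0, 5e-3, 18e-3, 35e-3):   # N, successive gripper readings
    z = admittance_insertion_step(f, z)
    print(round(z * 1e6, 2), "um")
```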
Grasping trajectories in a virtual environment adhere to Weber's law.
Ozana, Aviad; Berman, Sigal; Ganel, Tzvi
2018-06-01
Virtual-reality and telerobotic devices simulate local motor control of virtual objects within computerized environments. Here, we explored grasping kinematics within a virtual environment and tested whether, as in normal 3D grasping, trajectories in the virtual environment are performed analytically, violating Weber's law with respect to the object's size. Participants were asked to grasp a series of 2D objects using a haptic system, which projected their movements into a virtual space presented on a computer screen. The apparatus also provided object-specific haptic information upon "touching" the edges of the virtual targets. The results showed that grasping movements performed within the virtual environment did not produce the typical analytical trajectory pattern obtained during 3D grasping. Unlike in 3D grasping, grasping trajectories in the virtual environment adhered to Weber's law, which indicates relative resolution in size processing. In addition, the trajectory patterns differed from typical trajectories obtained during 3D grasping, with longer times to complete the movement and with maximum grip apertures appearing relatively early in the movement. The results suggest that grasping movements within a virtual environment can differ from those performed in real space and are subject to irrelevant effects of perceptual information. Such an atypical pattern of visuomotor control may be mediated by the lack of complete transparency between the interface and the virtual environment in terms of the provided visual and haptic feedback. Possible implications of the findings for movement control within robotic and virtual environments are further discussed.
Vertigo in virtual reality with haptics: case report.
Viirre, Erik; Ellisman, Mark
2003-08-01
A researcher was working with a desktop virtual environment system. The system was displaying vector fields of a cyclonic weather system, and it incorporated a haptic display of the forces in the cyclonic field. As the subject viewed the rotating cyclone field, they would move a handle "through" the representation of the moving winds and "feel" the forces buffeting the handle as it moved. Upon stopping after about 10 min of use, the user experienced an immediate sensation of postural instability lasting several minutes. Several hours later, there was the onset of vertigo with head turns. This vertigo lasted several hours and was accompanied by nausea and motion illusions that were exacerbated by head movements. Symptoms persisted mildly the next day and were still present on the third and fourth days, but by then were only provoked by head movements. There were no accompanying symptoms or history to suggest an inner ear disorder. Physical examination of inner ear and associated neurologic function was normal. No other users of this system have reported similar symptoms. This case suggests that some individuals may be susceptible to the interaction of displays with motion and movement forces and, as a result, experience motion illusions. Operators of such systems should be aware of this potential and minimize exposure if vertigo occurs.
The effects of perceptual priming on 4-year-olds' haptic-to-visual cross-modal transfer.
Kalagher, Hilary
2013-01-01
Four-year-old children often have difficulty visually recognizing objects that were previously experienced only haptically. This experiment attempts to improve their performance in these haptic-to-visual transfer tasks. Sixty-two 4-year-old children participated in priming trials in which they explored eight unfamiliar objects visually, haptically, or visually and haptically together. Subsequently, all children participated in the same haptic-to-visual cross-modal transfer task. In this task, children haptically explored the objects that were presented in the priming phase and then visually identified a match from among three test objects, each matching the object on only one dimension (shape, texture, or color). Children in all priming conditions predominantly made shape-based matches; however, the most shape-based matches were made in the Visual and Haptic condition. All kinds of priming provided the necessary memory traces upon which subsequent haptic exploration could build a strong enough representation to enable subsequent visual recognition. Haptic exploration patterns during the cross-modal transfer task are discussed and the detailed analyses provide a unique contribution to our understanding of the development of haptic exploratory procedures.
Braccio di Ferro: a new haptic workstation for neuromotor rehabilitation.
Casadio, Maura; Sanguineti, Vittorio; Morasso, Pietro G; Arrichiello, Vincenzo
2006-01-01
This technical note describes a new robotic workstation for neurological rehabilitation, named Braccio di Ferro for short. It was designed by taking into account the range of forces and the frequency bandwidth that characterize the interaction between a patient and a physical therapist, as well as a number of requirements that we think are essential for allowing a natural haptic interaction: back-driveability, very low friction and inertia, mechanical robustness, the possibility to operate in different planes, and an open software environment that allows the operator to add new functionalities and design personalized rehabilitation protocols. Braccio di Ferro is an open system and, in the spirit of open-source design, is intended to foster the dissemination of robot therapy. Moreover, its combination of features is not present in commercially available systems.
ERIC Educational Resources Information Center
Ausburn, Floyd B.
A U.S. Air Force study was designed to develop instruction based on the supplantation theory, in which tasks are performed (supplanted) for individuals who are unable to perform them due to their cognitive style. The study examined the effects of linear and multiple imagery in presenting a task requiring visual comparison and location to…
Alió, Jorge L; Plaza-Puche, Ana B; Javaloy, Jaime; Ayala, María José; Vega-Estrada, Alfredo
2013-04-01
To compare the visual and intraocular optical quality outcomes of different designs of the refractive rotationally asymmetric multifocal intraocular lens (MFIOL) (Lentis Mplus; Oculentis GmbH, Berlin, Germany) with or without capsular tension ring (CTR) implantation. One hundred thirty-five consecutive eyes of 78 patients with cataract (ages 36 to 82 years) were divided into three groups: 43 eyes implanted with the C-Loop haptic design without CTR (C-Loop haptic only group); 47 eyes implanted with the C-Loop haptic design with CTR (C-Loop haptic with CTR group); and 45 eyes implanted with the plate-haptic design (plate-haptic group). Visual acuity, contrast sensitivity, defocus curve, and ocular and intraocular optical quality were evaluated at 3 months postoperatively. Significant differences in the postoperative sphere were found (P = .01), with a more myopic postoperative refraction in the C-Loop haptic only group. No significant differences were detected in photopic and scotopic contrast sensitivity among groups (P ⩾ .05). Significantly better visual acuities were present in the C-Loop haptic with CTR group for the defocus levels of -2.0, -1.5, -1.0, and -0.50 D (P ⩽ .03). Statistically significant differences among groups were found in total intraocular root mean square (RMS), high-order intraocular RMS, and intraocular coma-like RMS aberrations (P ⩽ .04), with lower values in the plate-haptic group. The plate-haptic design and the C-Loop haptic design with CTR implantation both allow good visual rehabilitation. However, better refractive predictability and intraocular optical quality were obtained with the plate-haptic design without CTR implantation. The plate-haptic design seems to be a better design to support rotationally asymmetric MFIOL optics. Copyright 2013, SLACK Incorporated.