Sample records for creating graphical haptics

  1. G2H--graphics-to-haptic virtual environment development tool for PC's.

    PubMed

    Acosta, E; Temkin, B; Krummel, T M; Heinrichs, W L

    2000-01-01

    Existing surgical virtual environments have improved greatly for surgical training and preparation; however, these improvements are mostly visual. Incorporating haptics into virtual reality-based surgical simulations would greatly enhance the sense of realism. To aid in the development of haptic surgical virtual environments, we have created a graphics-to-haptic (G2H) virtual environment developer tool. G2H transforms graphical virtual environments (created or imported) into haptic virtual environments without programming. The G2H capability has been demonstrated using the complex 3D pelvic model of Lucy 2.0, the Stanford Visible Female. The pelvis was made haptic using G2H without any further programming effort.

  2. Augmented reality and haptic interfaces for robot-assisted surgery.

    PubMed

    Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N

    2012-03-01

    Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation.
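A forbidden-region virtual fixture of the kind mentioned above can be illustrated with a minimal penalty-based sketch. This is not the authors' implementation: the planar region, the stiffness gain `k`, and the function name are all assumptions for illustration. When the tracked tool tip crosses into the forbidden half-space, a spring force proportional to penetration depth pushes it back out.

```python
import math

def fixture_force(tip, plane_point, plane_normal, k=500.0):
    """Penalty force for a planar forbidden-region virtual fixture."""
    # Normalize the plane normal (points out of the forbidden region).
    mag = math.sqrt(sum(c * c for c in plane_normal))
    n = tuple(c / mag for c in plane_normal)
    # Penetration depth: positive when the tip is inside the forbidden region.
    depth = sum((p - t) * c for p, t, c in zip(plane_point, tip, n))
    if depth <= 0.0:
        return (0.0, 0.0, 0.0)
    # Penalty spring force along the normal, pushing the tool back out.
    return tuple(k * depth * c for c in n)
```

In a vision-based system such as the one described, the plane would be derived from the reconstructed 3D surface rather than defined by hand.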

  3. A prototype percutaneous transhepatic cholangiography training simulator with real-time breathing motion.

    PubMed

    Villard, P F; Vidal, F P; Hunt, C; Bello, F; John, N W; Johnson, S; Gould, D A

    2009-11-01

    We present here a simulator for interventional radiology focusing on percutaneous transhepatic cholangiography (PTC). This procedure consists of inserting a needle into the biliary tree using fluoroscopy for guidance. The requirements of the simulator have been driven by a task analysis. Three main components have been identified: the respiration, the real-time X-ray display (fluoroscopy) and the haptic rendering (sense of touch). The framework for modelling the respiratory motion is based on kinematic laws and on the ChainMail algorithm. The fluoroscopic simulation is performed on the graphics card and uses the Beer-Lambert law to compute the X-ray attenuation. Finally, the haptic rendering is integrated into the virtual environment and takes into account the soft-tissue reaction forces and the maintenance of the initial needle direction during insertion. Five training scenarios have been created using patient-specific data. Each of these provides the user with variable breathing behaviour, a fluoroscopic display tuneable to any device parameters, and needle force feedback. A detailed task analysis has been used to design and build the PTC simulator described in this paper. The simulator includes real-time respiratory motion with two independent parameters (rib kinematics and diaphragm action), on-line fluoroscopy implemented on the graphics processing unit (GPU), and haptic feedback to feel the soft-tissue behaviour of the organs during the needle insertion.
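The fluoroscopic rendering described above rests on the Beer-Lambert law: transmitted intensity falls off exponentially with the sum of attenuation-coefficient times path-length products along each ray. A minimal CPU sketch (illustrative only; the function name and coefficient values are assumptions):

```python
import math

def transmitted_intensity(i0, segments):
    """Beer-Lambert attenuation along one X-ray path.

    segments: (mu, d) pairs, where mu is the linear attenuation
    coefficient (1/cm) of a tissue and d the path length (cm) of
    the ray through that tissue.
    """
    total = sum(mu * d for mu, d in segments)
    # I = I0 * exp(-sum(mu_i * d_i))
    return i0 * math.exp(-total)
```

In a GPU implementation like the one described, the per-ray sums would be accumulated in a shader rather than on the CPU.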

  4. Algorithms for Haptic Rendering of 3D Objects

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay; Ho, Chih-Hao; Srinivasan, Mandayam

    2003-01-01

    Algorithms have been developed to provide haptic rendering of three-dimensional (3D) objects in virtual (that is, computationally simulated) environments. The goal of haptic rendering is to generate tactual displays of the shapes, hardnesses, surface textures, and frictional properties of 3D objects in real time. Haptic rendering is a major element of the emerging field of computer haptics, which invites comparison with computer graphics. We have already seen various applications of computer haptics in the areas of medicine (surgical simulation, telemedicine, haptic user interfaces for blind people, and rehabilitation of patients with neurological disorders), entertainment (3D painting, character animation, morphing, and sculpting), mechanical design (path planning and assembly sequencing), and scientific visualization (geophysical data analysis and molecular manipulation).
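A common building block of such haptic-rendering algorithms is penalty-based force computation: when the haptic interface point penetrates a surface, a restoring force along the surface normal, proportional to penetration depth, is sent to the device at roughly 1 kHz. A minimal sketch for a rigid sphere (illustrative, not the algorithms from this report; the stiffness value and names are assumptions, and practical renderers use god-object/proxy methods to avoid pop-through on thin objects):

```python
import math

def sphere_contact_force(probe, center, radius, k=800.0):
    """Penalty-based haptic force for a rigid sphere."""
    dx = [p - c for p, c in zip(probe, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    depth = radius - dist
    if depth <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)  # no contact (or degenerate centre case)
    n = [d / dist for d in dx]  # outward surface normal
    # Spring force proportional to penetration depth, along the normal.
    return tuple(k * depth * c for c in n)
```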

  5. Toward an improved haptic zooming algorithm for graphical information accessed by individuals who are blind and visually impaired.

    PubMed

    Rastogi, Ravi; Pawluk, Dianne T V

    2013-01-01

    An increasing amount of the information content used in school, work, and everyday living is presented in graphical form. Unfortunately, it is difficult for people who are blind or visually impaired to access this information, especially when many diagrams are needed. One problem is that details, even in relatively simple visual diagrams, can be very difficult to perceive using touch. With manually created tactile diagrams, these details are often presented in separate diagrams which must be selected from among others. Being able to actively zoom in on an area of a single diagram, so that the details can be presented at a reasonable size for exploration, seems a simpler approach for the user. However, directly using visual zooming methods has some limitations when they are used haptically. Therefore, a new zooming method is proposed to avoid these pitfalls. A preliminary experiment was performed to examine the usefulness of the algorithm compared to not using zooming. The results showed that the number of correct responses improved with the developed zooming algorithm, and participants found it to be more usable than exploring without zooming a floor map.

  6. Using the PhysX engine for physics-based virtual surgery with force feedback.

    PubMed

    Maciel, Anderson; Halic, Tansel; Lu, Zhonghua; Nedel, Luciana P; De, Suvranu

    2009-09-01

    The development of modern surgical simulators is highly challenging, as they must support complex simulation environments. The demand for higher realism in such simulators has driven researchers to adopt physics-based models, which are computationally very demanding. This poses a major problem, since real-time interaction requires graphical updates at 30 Hz and a much higher rate of 1 kHz for force feedback (haptics). Recently, several physics engines have been developed that offer multi-physics simulation capabilities, including rigid and deformable bodies, cloth and fluids. While such physics engines provide unique opportunities for the development of surgical simulators, their latencies, which exceed what is necessary for real-time graphics and haptics, pose significant barriers to their use in interactive simulation environments. In this work, we propose solutions to this problem and demonstrate how a multimodal surgical simulation environment may be developed based on NVIDIA's PhysX physics library: models that undergo relatively low-frequency updates in PhysX can exist in an environment that demands much higher-frequency updates for haptics. We use a collision handling layer to interface between the physical response provided by PhysX and the haptic rendering device, providing both real-time tissue response and force feedback. Our simulator integrates a bimanual haptic interface for force feedback and per-pixel shaders for graphical realism in real time. To demonstrate the effectiveness of our approach, we present the simulation of the laparoscopic adjustable gastric banding (LAGB) procedure as a case study. Developing complex and realistic surgical trainers with realistic organ geometries and tissue properties demands stable physics-based deformation methods, which are not always compatible with the interaction rates required for such trainers. We have shown that combining different modelling strategies for behaviour, collision and graphics is possible and desirable. Such multimodal environments enable suitable rates for simulating the major steps of the LAGB procedure.
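The rate-bridging idea above, a slow physics engine feeding a fast haptic loop through an intermediate local model, can be sketched as follows. This is a simplification, not the paper's collision-handling layer: the plane-based local model and all names are assumptions.

```python
class LocalContactModel:
    """Intermediate representation between a slow physics engine and a
    fast (~1 kHz) haptic loop: the engine periodically publishes a local
    contact plane (point, normal, stiffness), and the haptic loop
    evaluates forces against that plane every millisecond."""

    def __init__(self):
        self.point, self.normal, self.k = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 300.0

    def update_from_physics(self, point, normal, k):
        # Called at the physics rate (e.g. ~60 Hz).
        self.point, self.normal, self.k = point, normal, k

    def haptic_force(self, probe):
        # Called at the haptic rate (~1 kHz) with the device position.
        depth = sum((p - q) * n for p, q, n in zip(self.point, probe, self.normal))
        if depth <= 0.0:
            return (0.0, 0.0, 0.0)
        return tuple(self.k * depth * n for n in self.normal)
```

The haptic loop thus never blocks on the engine; it only reads the most recently published local model.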

  7. Improving manual skills in persons with disabilities (PWD) through a multimodal assistance system.

    PubMed

    Covarrubias, Mario; Gatti, Elia; Bordegoni, Monica; Cugini, Umberto; Mansutti, Alessandro

    2014-07-01

    In this research work, we present a Multimodal Guidance System (MGS) whose aim is to provide dynamic assistance to persons with disabilities (PWD) while performing manual activities such as drawing, coloring-in and foam-cutting tasks. The MGS provides robotic assistance in the execution of 2D tasks through haptic and sound interactions. Haptic technology provides the virtual path of 2D shapes through a point-based approach, while sound technology provides audio feedback related to the hand's velocity during sketching, filling or cutting operations. By combining this multimodal system with haptic assistance, we have created a new approach with possible applications in fields as diverse as physical rehabilitation, scientific investigation of sensorimotor learning, and assessment of hand movements in PWD. The MGS has been tested by people with specific disorders affecting coordination, such as Down syndrome and developmental disabilities, under the supervision of their teachers and care assistants inside their learning environment. A graphical user interface has been designed for teachers and care assistants in order to provide training during the test sessions. Our results provide conclusive evidence that using the MGS increases accuracy in task performance. The MGS is an interface that offers haptic and sound feedback while performing manual tasks. Several studies have demonstrated that haptic guidance systems can help people recover cognitive function at different levels of complexity and impairment. The applications supported by our device could also play an important role in supporting physical therapists and cognitive psychologists in helping patients recover motor and visuo-spatial abilities.

  8. Haptic/graphic rehabilitation: integrating a robot into a virtual environment library and applying it to stroke therapy.

    PubMed

    Sharp, Ian; Patton, James; Listenberger, Molly; Case, Emily

    2011-08-08

    Recent research testing interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open-source community have proposed a wrapper-class approach that can elicit nearly identical responses regardless of the robot used. The result can lead researchers across the globe to perform similar experiments using shared code. Therefore, modular "switching out" of one robot for another would not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.

  9. Input and output for surgical simulation: devices to measure tissue properties in vivo and a haptic interface for laparoscopy simulators.

    PubMed

    Ottensmeyer, M P; Ben-Ur, E; Salisbury, J K

    2000-01-01

    Current efforts in surgical simulation very often focus on creating realistic graphical feedback but neglect some or all of the tactile and force (haptic) feedback that a surgeon would normally receive. Simulations that do include haptic feedback do not typically use real tissue compliance properties, favoring estimates and user feedback to determine realism. When tissue compliance data are used, there are virtually no in vivo property measurements to draw upon. Together with the Center for Innovative Minimally Invasive Therapy at the Massachusetts General Hospital, the Haptics Group is developing tools to introduce more comprehensive haptic feedback in laparoscopy simulators and to provide biological tissue material property data for our software simulation. The platform for providing haptic feedback is a PHANToM Haptic Interface, produced by SensAble Technologies, Inc. Our devices supplement the PHANToM to provide for grasping and, optionally, for the roll axis of the tool. Together with feedback from the PHANToM, which provides the pitch, yaw and thrust axes of a typical laparoscopy tool, we can recreate all of the haptic sensations experienced during laparoscopy. The devices integrate real laparoscopy tool handles and a compliant torso model to complete the set of visual and tactile sensations. Biological tissues are known to exhibit non-linear mechanical properties and to change their properties dramatically when removed from a living organism. To measure the properties in vivo, two devices are being developed. The first is a small-displacement, 1-D indenter. It will measure the linear tissue compliance (stiffness and damping) over a wide range of frequencies. These data will be used as inputs to a finite-element or other model. The second device will be able to deflect tissues in 3-D over a larger range, so that the non-linearities due to changes in the tissue geometry can be measured. This will allow us to validate the performance of the model on large tissue deformations. Both devices are designed to pass through standard 12 mm laparoscopy trocars and will be suitable for use during open or minimally invasive procedures. We plan to acquire data from pigs used by surgeons for training purposes, but conceivably the tools could be refined for use on humans undergoing surgery. Our work will provide the necessary data input for surgical simulations to accurately model the force interactions that a surgeon would have with tissue, and will provide the force output to create a truly realistic simulation of minimally invasive surgery.
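Indenter data of the kind described would typically be fit to a linear viscoelastic (Kelvin-Voigt) model. A minimal sketch, where the parameter values are placeholders rather than measured tissue data:

```python
def indentation_force(x, v, k=150.0, b=2.5):
    """Kelvin-Voigt tissue model: reaction force = stiffness *
    indentation depth + damping * indentation velocity.
    k (N/m) and b (N*s/m) would be fit to the frequency-swept
    indenter measurements; the values here are placeholders."""
    return k * x + b * v
```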

  10. Co-located haptic and 3D graphic interface for medical simulations.

    PubMed

    Berkelman, Peter; Miyasaka, Muneaki; Bozlee, Sebastian

    2013-01-01

    We describe a system which provides high-fidelity haptic feedback in the same physical location as a 3D graphical display, in order to enable realistic physical interaction with virtual anatomical tissue during modelled procedures such as needle driving, palpation, and other interventions performed using handheld instruments. The haptic feedback is produced by the interaction between an array of coils located behind a thin flat LCD screen, and permanent magnets embedded in the instrument held by the user. The coil and magnet configuration permits arbitrary forces and torques to be generated on the instrument in real time according to the dynamics of the simulated tissue by activating the coils in combination. A rigid-body motion tracker provides position and orientation feedback of the handheld instrument to the computer simulation, and the 3D display is produced using LCD shutter glasses and a head-tracking system for the user.

  11. Haptic interface of the KAIST-Ewha colonoscopy simulator II.

    PubMed

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2008-11-01

    This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the profiles of the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the result shows that the developed haptic interface provides realistic haptic feedback for colonoscopy simulations.

  12. Improved haptic interface for colonoscopy simulation.

    PubMed

    Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young

    2007-01-01

    This paper presents an improved haptic interface for the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures profiles of the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack switches are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction, as well as the corresponding deformation of the colon.

  13. Modeling light

    NASA Astrophysics Data System (ADS)

    Dawson, P.; Gage, J.; Takatsuka, M.; Goyette, S.

    2009-02-01

    To compete with other digital images, holograms must go beyond the current range of source-image types, such as sequences of photographs, laser scans, and 3D computer graphics (CG) scenes made with software designed for other applications. This project develops a set of innovative techniques for creating 3D digital content specifically for digital holograms, with virtual tools that enable the direct hand-crafting of subjects, mark by mark, analogous to Michelangelo's practice in drawing, painting and sculpture. The haptic device, the PHANToM Premium 1.5, is used to draw against three-dimensional laser-scan templates of Michelangelo's sculpture placed within the holographic viewing volume.

  14. Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.

    PubMed

    Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid

    2015-12-01

    Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate for the non-linear position error caused by inaccuracies in the joint-angle sensors. In this article, we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for estimating a reference for position calibration, resulting in increased robustness against haptic-device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher-fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, and rapid prototyping applications.

  15. Haptic Feedback in Robot-Assisted Minimally Invasive Surgery

    PubMed Central

    Okamura, Allison M.

    2009-01-01

    Purpose of Review: Robot-assisted minimally invasive surgery (RMIS) holds great promise for improving the accuracy and dexterity of a surgeon while minimizing trauma to the patient. However, widespread clinical success with RMIS has been marginal. It is hypothesized that the lack of haptic (force and tactile) feedback presented to the surgeon is a limiting factor. This review explains the technical challenges of creating haptic feedback for robot-assisted surgery and provides recent results that evaluate the effectiveness of haptic feedback in mock surgical tasks. Recent Findings: Haptic feedback systems for RMIS are still under development and evaluation. Most provide only force feedback, with limited fidelity. The major challenge at this time is sensing forces applied to the patient. A few tactile feedback systems for RMIS have been created, but their practicality for clinical implementation needs to be shown. It is particularly difficult to sense and display spatially distributed tactile information. The cost-benefit ratio for haptic feedback in RMIS has not been established. Summary: The designs of existing commercial RMIS systems are not conducive to force feedback, and creative solutions are needed to create compelling tactile feedback systems. Surgeons, engineers, and neuroscientists should work together to develop effective solutions for haptic feedback in RMIS. PMID:19057225

  16. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback.

    PubMed

    Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T

    2007-07-01

    Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical residents' operative exposure. There is a need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient-safety constraints. Computer-based virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic three-dimensional stereoscopic visualization and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof of concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders a changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.

  17. Visual and haptic integration in the estimation of softness of deformable objects

    PubMed Central

    Cellini, Cristiano; Kaim, Lukas; Drewing, Knut

    2013-01-01

    Softness perception intrinsically relies on haptic information. However, through everyday experiences we learn correspondences between felt softness and the visual effects of exploratory movements that are executed to feel softness. Here, we studied how visual and haptic information is integrated to assess the softness of deformable objects. Participants discriminated between the softness of two softer or two harder objects using only-visual, only-haptic or both visual and haptic information. We assessed the reliabilities of the softness judgments using the method of constant stimuli. In visuo-haptic trials, discrepancies between the two senses' information allowed us to measure the contribution of the individual senses to the judgments. Visual information (finger movement and object deformation) was simulated using computer graphics; input in visual trials was taken from previous visuo-haptic trials. Participants were able to infer softness from vision alone, and vision considerably contributed to bisensory judgments (∼35%). The visual contribution was higher than predicted from models of optimal integration (senses are weighted according to their reliabilities). Bisensory judgments were less reliable than predicted from optimal integration. We conclude that the visuo-haptic integration of softness information is biased toward vision, rather than being optimal, and might even be guided by a fixed weighting scheme. PMID:25165510
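The optimal-integration benchmark referred to above weights each cue by its reliability (inverse variance), and the fused estimate then has lower variance than either cue alone. A minimal sketch of this standard maximum-likelihood model (not the authors' analysis code; the function name is an assumption):

```python
def optimal_integration(est_v, var_v, est_h, var_h):
    """Maximum-likelihood cue combination: each sense is weighted by its
    reliability (inverse variance); the fused estimate has lower variance
    than either cue alone."""
    r_v, r_h = 1.0 / var_v, 1.0 / var_h
    w_v = r_v / (r_v + r_h)          # predicted visual weight
    est = w_v * est_v + (1.0 - w_v) * est_h
    var = 1.0 / (r_v + r_h)          # combined variance
    return est, var, w_v
```

The study's finding was that the observed visual weight (about 35%) exceeded the `w_v` predicted by this model from the single-cue reliabilities.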

  18. Virtual rounds: simulation-based education in procedural medicine

    NASA Astrophysics Data System (ADS)

    Shaffer, David W.; Meglan, Dwight A.; Ferrell, Margaret; Dawson, Steven L.

    1999-07-01

    Computer-based simulation is a goal for training physicians in specialties where traditional training puts patients at risk. Intuitively, interactive simulation of anatomy, pathology, and therapeutic actions should lead to shortening of the learning curve for novice or inexperienced physicians. Effective transfer of knowledge acquired in simulators must be shown for such devices to be widely accepted in the medical community. We have developed an Interventional Cardiology Training Simulator which incorporates real-time graphic interactivity coupled with haptic response, and an embedded curriculum permitting rehearsal, hypertext links, personal archiving and instructor review and testing capabilities. This linking of purely technical simulation with educational content creates a more robust educational purpose for procedural simulators.

  19. Tactile Data Entry System

    NASA Technical Reports Server (NTRS)

    Adams, Richard J.

    2015-01-01

    The patent-pending Glove-Enabled Computer Operations (GECO) design leverages extravehicular activity (EVA) glove design features as platforms for instrumentation and tactile feedback, enabling the gloves to function as human-computer interface devices. Flexible sensors in each finger enable control inputs that can be mapped to any number of functions (e.g., a mouse click, a keyboard strike, or a button press). Tracking of hand motion is interpreted alternatively as movement of a mouse (change in cursor position on a graphical user interface) or a change in hand position on a virtual keyboard. Programmable vibro-tactile actuators aligned with each finger enrich the interface by creating the haptic sensations associated with control inputs, such as recoil of a button press.

  20. A kinesthetic washout filter for force-feedback rendering.

    PubMed

    Danieau, Fabien; Lecuyer, Anatole; Guillotel, Philippe; Fleureau, Julien; Mollet, Nicolas; Christie, Marc

    2015-01-01

    Today, haptic feedback can be designed and associated with audiovisual content (haptic-audiovisuals, or HAV). Although there are multiple means of creating individual haptic effects, the issue of how to properly adapt such effects to force-feedback devices has not been addressed and remains mostly a manual endeavor. We propose a new approach to the haptic rendering of HAV, based on a washout filter for force-feedback devices. A body model and an inverse kinematics algorithm simulate the user's kinesthetic perception. The haptic rendering is then adapted to handle transitions between haptic effects and to optimize the amplitude of effects with respect to the device's capabilities. Results of a user study show that this new haptic rendering can successfully improve the HAV experience.
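A washout filter of the kind proposed can be approximated by a discrete one-pole high-pass: transient force changes reach the device, while sustained offsets decay so the actuator drifts back toward its neutral range. A minimal sketch (the coefficient and class name are assumptions; the paper's filter additionally uses a body model and device limits):

```python
class WashoutFilter:
    """Discrete first-order high-pass 'washout' for force-feedback:
    y[n] = alpha * (y[n-1] + x[n] - x[n-1]).  Transients pass through;
    a constant input washes out to zero."""

    def __init__(self, alpha=0.95):
        self.alpha = alpha      # closer to 1 -> slower washout
        self.prev_in = 0.0
        self.prev_out = 0.0

    def step(self, f):
        out = self.alpha * (self.prev_out + f - self.prev_in)
        self.prev_in, self.prev_out = f, out
        return out
```

Applied to a sustained haptic effect, the output force decays each sample, freeing the device's limited workspace and force range for the next effect.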

  21. An augmented reality haptic training simulator for spinal needle procedures.

    PubMed

    Sutherland, Colin; Hashtrudi-Zaad, Keyvan; Sellens, Rick; Abolmaesumi, Purang; Mousavi, Parvin

    2013-11-01

    This paper presents the prototype of an augmented reality haptic simulation system with potential for spinal needle insertion training. The proposed system is composed of a torso mannequin, a MicronTracker2 optical tracking system, a PHANToM haptic device, and a graphical user interface that provides visual feedback. The system allows users to perform simulated needle insertions on a physical mannequin overlaid with an augmented reality cutaway of patient anatomy. A finite-element tissue model provides reaction forces during the insertion. The system allows for training without the presence of a trained clinician or access to live patients or cadavers. A pilot user study demonstrates the potential and functionality of the system.

  22. Analysis of hand contact areas and interaction capabilities during manipulation and exploration.

    PubMed

    Gonzalez, Franck; Gosselin, Florian; Bachta, Wael

    2014-01-01

    Manual human-computer interfaces for virtual reality are designed to allow an operator to interact with a computer simulation as naturally as possible. Dexterous haptic interfaces are best suited for this goal: they give intuitive and efficient control of the environment with haptic and tactile feedback. This paper aims to help in the choice of the interaction areas to be taken into account in the design of such interfaces. The literature dealing with hand interactions is first reviewed in order to point out the contact areas involved in exploration and manipulation tasks. Their frequencies of use are then extracted from existing recordings. The results are gathered in an original graphical interaction map that allows simple visualization of the way the hand is used, and compared with a map of mechanoreceptor densities. Then an interaction tree, mapping the relative number of actions made available through the use of a given contact area, is built and correlated with the losses of hand function induced by amputations. A rating of some existing haptic interfaces and guidelines for their design are finally presented to illustrate a possible use of the developed graphical tools.

  23. Haptics – Touchfeedback Technology Widening the Horizon of Medicine

    PubMed Central

    Kapoor, Shalini; Arora, Pallak; Kapoor, Vikas; Jayachandran, Mahesh; Tiwari, Manish

    2014-01-01

    Haptics, or touch-sense technology, is a major breakthrough in medical and dental interventions. Haptic perception is the process of recognizing objects through touch. Haptic sensations are created by actuators or motors which generate vibrations for the user and are controlled by embedded software integrated into the device. It takes advantage of a combination of the somatosensory pattern of the skin and proprioception of hand position. Anatomical and diagnostic knowledge, when combined with this touch-sense technology, has revolutionized medical education. This amalgamation of the worlds of diagnosis and surgical intervention adds a precise robotic touch to the skill of the surgeon. A systematic literature review was done using MEDLINE, Google Search, and PubMed. The aim of this article is to introduce the fundamentals of haptic technology, its current applications in medical training and robotic surgeries, the limitations of haptics, and future aspects of haptics in medicine. PMID:24783164

  4. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it were a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine, as well as other tasks that require hand-eye coordination. A distinctive characteristic of HUVR is that users can place their hands inside the virtual environment without occluding the 3D image. Built using open-source software and consumer-level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  5. Haptic display for the VR arthroscopy training simulator

    NASA Astrophysics Data System (ADS)

    Ziegler, Rolf; Brandt, Christoph; Kunstmann, Christian; Mueller, Wolfgang; Werkhaeuser, Holger

    1997-05-01

    A specific desire for new training methods arose from the field of minimally invasive surgery. With technical advances, video arthroscopy became the standard procedure in operating rooms: holding the optical system with the video camera in one hand and watching the operative field on the monitor, the surgeon's other hand is free to guide, e.g., a probe. As arthroscopy became more common, it became obvious that special training was necessary to guarantee a certain level of surgeon qualification. A hospital in Frankfurt, Germany therefore approached the Fraunhofer Institute for Computer Graphics to develop a training system for arthroscopy based on VR techniques. The main drawback of the developed simulator is the lack of haptic perception, especially force feedback. In cooperation with the Department of Electro-Mechanical Construction at Darmstadt Technical University, we have designed and built a haptic display for the VR arthroscopy training simulator. In parallel, we developed a concept for integrating the haptic display in a configurable way.

  6. Open Touch/Sound Maps: A system to convey street data through haptic and auditory feedback

    NASA Astrophysics Data System (ADS)

    Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios

    2013-08-01

    The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for people with visual impairments or blindness because of their inability to interpret graphical information. Alternative forms of map presentation therefore have to be explored to improve the accessibility of maps. Other sensory modalities, such as touch and hearing, may substitute for vision in the exploration of maps, and multimodal virtual environments seem a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback using OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during haptic exploration of the map.
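    The abstract does not describe how its elevation maps are built; one plausible reading is that street geometry is rasterized into a height grid whose grooves the haptic stylus can follow. The sketch below illustrates that idea only; the function name and parameters are hypothetical, not the Open Touch/Sound Maps API.

```python
# Hypothetical sketch: rasterize street segments into a coarse "elevation map"
# grid for haptic exploration. Cells crossed by a street are recessed (low
# height) so a haptic stylus dragged across the map can feel the groove.

def make_elevation_map(width, height, segments, street_depth=0.0, ground=1.0):
    """segments: list of ((x0, y0), (x1, y1)) endpoints in grid coordinates."""
    grid = [[ground] * width for _ in range(height)]
    for (x0, y0), (x1, y1) in segments:
        # Walk along the segment, marking every grid cell it crosses.
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = round(x0 + (x1 - x0) * i / steps)
            y = round(y0 + (y1 - y0) * i / steps)
            if 0 <= x < width and 0 <= y < height:
                grid[y][x] = street_depth
    return grid

# One horizontal street across a 5x5 map: row 2 becomes a groove.
emap = make_elevation_map(5, 5, [((0, 2), (4, 2))])
```

    A real implementation would take segment coordinates from OpenStreetMap way data and a much finer grid, but the groove-in-a-height-field principle is the same.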

  7. Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.

    PubMed

    Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk

    2013-08-01

    Virtual reality simulation is a promising alternative for training surgical residents outside the operating room. It is also a useful aid in anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality simulation has three main components: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experience with their own virtual simulation of endoscopic third ventriculostomy.

  8. Haptic Technologies for MEMS Design

    NASA Astrophysics Data System (ADS)

    Calis, Mustafa; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototypes. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of such a methodology include productivity gains and the capability to include manufacturing costs within the design cycle.

  9. Touch-screen technology for the dynamic display of 2D spatial information without vision: promise and progress.

    PubMed

    Klatzky, Roberta L; Giudice, Nicholas A; Bennett, Christopher R; Loomis, Jack M

    2014-01-01

    Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface.

  10. Let the Force Be with Us: Dyads Exploit Haptic Coupling for Coordination

    ERIC Educational Resources Information Center

    van der Wel, Robrecht P. R. D.; Knoblich, Guenther; Sebanz, Natalie

    2011-01-01

    People often perform actions that involve a direct physical coupling with another person, such as when moving furniture together. Here, we examined how people successfully coordinate such actions with others. We tested the hypothesis that dyads amplify their forces to create haptic information to coordinate. Participants moved a pole (resembling a…

  11. Designing Haptic Assistive Technology for Individuals Who Are Blind or Visually Impaired.

    PubMed

    Pawluk, Dianne T V; Adams, Richard J; Kitada, Ryo

    2015-01-01

    This paper considers issues relevant to the design and use of haptic technology for assistive devices for individuals who are blind or visually impaired in some of the major areas of importance: Braille reading, tactile graphics, and orientation and mobility. We show that there is a wealth of behavioral research that is highly applicable to assistive technology design. In a few cases, conclusions from behavioral experiments have been directly applied to design with positive results. Differences in brain organization and performance capabilities between individuals who are "early blind" and "late blind" when using the same tactile/haptic accommodations, such as Braille, suggest the importance of training and assessing these groups individually. Practical restrictions on device design, such as performance limitations of the technology and cost, raise questions as to which aspects of these restrictions are truly important to overcome to achieve high performance. In general, this raises the question of what it means to provide functional equivalence as opposed to sensory equivalence.

  12. Training Surgical Residents With a Haptic Robotic Central Venous Catheterization Simulator.

    PubMed

    Pepley, David F; Gordon, Adam B; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z

    Ultrasound-guided central venous catheterization (CVC) is a common surgical procedure with complication rates ranging from 5 to 21 percent. Training is typically performed on manikins that do not simulate anatomical variations such as obesity and abnormal vessel positioning. The goal of this study was to develop and validate the effectiveness of a new virtual reality and force-haptic-based simulation platform for CVC of the right internal jugular vein. A CVC simulation platform was developed using a haptic robotic arm, a 3D position tracker, and computer visualization. The haptic robotic arm simulated needle insertion forces based on cadaver experiments. The 3D position tracker was used as a mock ultrasound device with realistic visualization on a computer screen. Upon completion of a practice simulation, performance feedback was given to the user through a graphical user interface, including scoring factors based on good CVC practice. The effectiveness of the system was evaluated by training 13 first-year surgical residents on the virtual reality haptic-based training system over a 3-month period. The participants' performance increased from 52% to 96% on the baseline training scenario, approaching the average score of an expert surgeon (98%). Training also improved positive CVC practices, including a 61% decrease in the distance between final needle tip position and vein center, a decrease in mean insertion attempts from 1.92 to 1.23, and a 12% increase in time spent aspirating the syringe throughout the procedure. A virtual reality haptic robotic simulator for CVC was successfully developed. Surgical residents training on the simulator improved to near-expert levels after three robotic training sessions, suggesting that the system could act as an effective training device for CVC. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Haptic-Based Neurorehabilitation in Poststroke Patients: A Feasibility Prospective Multicentre Trial for Robotics Hand Rehabilitation

    PubMed Central

    Daud Albasini, Omar A.; Oboe, Roberto; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Piron, Lamberto

    2013-01-01

    Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented in a real-time MATLAB/Simulink environment, controlling a five-bar linkage able to provide forces up to 20 N at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole pegboard test) and kinematic (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, fMRI results suggest that the proposed approach may promote better cortical activation in the brain. PMID:24319496

  14. Haptic-based neurorehabilitation in poststroke patients: a feasibility prospective multicentre trial for robotics hand rehabilitation.

    PubMed

    Turolla, Andrea; Daud Albasini, Omar A; Oboe, Roberto; Agostini, Michela; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Venneri, Annalena; Piron, Lamberto

    2013-01-01

    Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented in a real-time MATLAB/Simulink environment, controlling a five-bar linkage able to provide forces up to 20 N at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole pegboard test) and kinematic (i.e., time; velocity; jerk metric; normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, fMRI results suggest that the proposed approach may promote better cortical activation in the brain.

  15. Integration of Haptics in Agricultural Robotics

    NASA Astrophysics Data System (ADS)

    Kannan Megalingam, Rajesh; Sreekanth, M. M.; Sivanantham, Vinu; Sai Kumar, K.; Ghanta, Sriharsha; Surya Teja, P.; Reddy, Rajesh G.

    2017-08-01

    Robots can be classified as open-loop or closed-loop systems, and many problems arise when there is no feedback from the robot. In this research paper, we discuss possibilities for achieving a complete closed-loop system for a multiple-DOF robotic arm, used in a coconut tree climbing and cutting robot, by introducing a haptic device. We work with various sensors, such as tactile, vibration, force and proximity sensors, to obtain feedback. Monitoring of the robotic arm is achieved by graphical user interface software that simulates the working of the arm, sends back the real-time analog values produced by the various sensors, and provides real-time graphs for estimating the efficiency of the robot.

  16. Feel, imagine and learn! - Haptic augmented simulation and embodied instruction in physics learning

    NASA Astrophysics Data System (ADS)

    Han, In Sook

    The purpose of this study was to investigate the potential and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included haptic augmented educational simulation as an instructional tool to provide perceptual experiences, as well as further instruction to activate those experiences through perceptual simulation. To verify the effectiveness of this instructional model, haptic augmented simulations with three different haptic levels (force and kinesthetic, kinesthetic, and non-haptic) and instructional materials (narrative and expository) were developed and tested. 220 fifth-grade students were recruited from three elementary schools located in lower-SES neighborhoods in the Bronx, New York. The study was conducted over three consecutive weeks during regular class periods, and the data were analyzed using ANCOVA, ANOVA, and MANOVA. The results indicate that the haptic augmented simulations, both the force-and-kinesthetic and the kinesthetic versions, were more effective than the non-haptic simulation in providing perceptual experiences and helping elementary students create multimodal representations of machines' movements. In most cases, however, force feedback was needed to construct a fully loaded multimodal representation that could be activated when instruction with fewer sensory modalities was given. In addition, the force-and-kinesthetic simulation was effective in providing cognitive grounding for comprehending new learning content based on the multimodal representation created with enhanced force feedback. Regarding instruction type, the narrative and expository instructions did not differ in activating previous perceptual experiences. These findings suggest that it is important to help students build a solid cognitive ground with a perceptual anchor. Also, a sequential abstraction process would deepen students' understanding by providing an opportunity to practice mental simulation, removing the sensory modalities one by one and gradually reaching an abstract level of understanding at which students can imagine the machine's movements and working mechanisms through abstract language alone, without any perceptual support.

  17. Prototype of haptic device for sole of foot using magnetic field sensitive elastomer

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Masuda, Y.; Sugiyama, M.; Mitsumata, T.; Ohori, S.

    2013-02-01

    Walking is one of the most popular activities and a healthy aerobic exercise for the elderly. However, for those with physical and/or cognitive disabilities, it can be challenging to go somewhere unfamiliar. The final goal of this study is to develop a virtual reality walking system that allows users to walk through virtual worlds fabricated with computer graphics. We focus on a haptic device that can apply various plantar pressures to the soles of the user's feet as an additional sense during virtual reality walking. In this study, we discuss the use of a magnetic field sensitive elastomer (MSE) as the working material for the haptic interface on the sole. The first prototype with MSE was developed and evaluated in this work. Measurements of plantar pressure showed that the device can apply different pressures to the sole of a light-weight user by applying a magnetic field to the MSE. The results also indicated the need to improve the magnetic circuit and the basic structure of the device's mechanism.

  18. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    PubMed Central

    Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele

    2017-01-01

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source. PMID:28961198

  19. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.

    PubMed

    Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea

    2017-09-29

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.

  20. Shadow-driven 4D haptic visualization.

    PubMed

    Zhang, Hui; Hanson, Andrew

    2007-01-01

    Just as we can work with two-dimensional floor plans to communicate 3D architectural design, we can exploit reduced-dimension shadows to manipulate the higher-dimensional objects generating the shadows. In particular, by taking advantage of physically reactive 3D shadow-space controllers, we can transform the task of interacting with 4D objects to a new level of physical reality. We begin with a teaching tool that uses 2D knot diagrams to manipulate the geometry of 3D mathematical knots via their projections; our unique 2D haptic interface allows the user to become familiar with sketching, editing, exploration, and manipulation of 3D knots rendered as projected images on a 2D shadow space. By combining graphics and collision-sensing haptics, we can enhance the 2D shadow-driven editing protocol to successfully leverage 2D pen-and-paper or blackboard skills. Building on the reduced-dimension 2D editing tool for manipulating 3D shapes, we develop the natural analogy to produce a reduced-dimension 3D tool for manipulating 4D shapes. By physically modeling the correct properties of 4D surfaces, their bending forces, and their collisions in the 3D haptic controller interface, we can support full-featured physical exploration of 4D mathematical objects in a manner that is otherwise far beyond the experience accessible to human beings. As far as we are aware, this paper reports the first interactive system with force feedback that provides "4D haptic visualization," permitting the user to model and interact with 4D cloth-like objects.
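    The "shadow" analogy in this abstract has a standard mathematical core: a 4D point can be projected into 3D the same way a 3D point is projected onto a 2D image plane, by dividing by its distance along the extra axis. The sketch below shows only that generic perspective projection; the function name and the viewpoint parameter are illustrative assumptions, not the authors' implementation.

```python
# Generic perspective projection of a 4D point (x, y, z, w) into 3D,
# viewed from a point on the w-axis at w = eye_w. The 3D result is the
# "shadow" that a reduced-dimension controller would manipulate.

def project_4d_to_3d(point, eye_w=4.0):
    x, y, z, w = point
    d = eye_w - w          # distance from the 4D viewpoint along w
    if d <= 0:
        raise ValueError("point is at or behind the 4D viewpoint")
    s = 1.0 / d            # perspective scale: nearer points cast bigger shadows
    return (x * s, y * s, z * s)

# A corner of the unit tesseract:
shadow = project_4d_to_3d((1.0, 1.0, 1.0, 1.0))   # -> (1/3, 1/3, 1/3)
```

    Applying this to every vertex of a 4D mesh yields the 3D shadow geometry that graphics and haptics can then render.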

  1. A haptic device for guide wire in interventional radiology procedures.

    PubMed

    Moix, Thomas; Ilic, Dejan; Bleuler, Hannes; Zoethout, Jurjen

    2006-01-01

    Interventional Radiology (IR) is a minimally invasive procedure where thin tubular instruments, guide wires and catheters, are steered through the patient's vascular system under X-ray imaging. In order to perform these procedures, a radiologist has to be trained to master hand-eye coordination, instrument manipulation and procedure protocols. The existing simulation systems all have major drawbacks: the use of modified instruments, unrealistic insertion lengths, high inertia of the haptic device that creates a noticeably degraded dynamic behavior or excessive friction that is not properly compensated for. In this paper we propose a quality training environment dedicated to IR. The system is composed of a virtual reality (VR) simulation of the patient's anatomy linked to a robotic interface providing haptic force feedback. This paper focuses on the requirements, design and prototyping of a specific haptic interface for guide wires.

  2. An Enhanced Soft Vibrotactile Actuator Based on ePVC Gel with Silicon Dioxide Nanoparticles.

    PubMed

    Park, Won-Hyeong; Shin, Eun-Jae; Yun, Sungryul; Kim, Sang-Youn

    2018-01-01

    In this paper, we propose a soft vibrotactile actuator made by mixing silicon dioxide nanoparticles into plasticized PVC gel. The effect of the silicon dioxide nanoparticles in the plasticized PVC gel on haptic performance is investigated in terms of electric, dielectric, and mechanical properties. Furthermore, eight soft vibrotactile actuators are prepared as a function of nanoparticle content. Experiments are conducted to examine the haptic performance of the eight prepared actuators and to find the best weight ratio of plasticized PVC gel to nanoparticles. The experiments show that the silicon dioxide nanoparticles improve the haptic performance of the plasticized PVC gel-based vibrotactile actuator, and that the proposed actuator can create a variety of haptic sensations over a wide frequency range.

  3. Validation and learning in the Procedicus KSA virtual reality surgical simulator.

    PubMed

    Ström, P; Kjellin, A; Hedman, L; Johnson, E; Wredmark, T; Felländer-Tsai, L

    2003-02-01

    Advanced simulator training within medicine is a rapidly growing field. Virtual reality simulators are being introduced as cost-saving educational tools that also lead to increased patient safety. Fifteen medical students were included in the study. For 10 medical students, performance was monitored before and after 1 h of training in two endoscopic simulators (the Procedicus KSA, with haptic feedback and anatomical graphics, and the established MIST simulator, without haptic feedback and graphics). Five medical students performed 50 tests in the Procedicus KSA in order to analyze learning curves; one of them performed multiple training sessions over 2 weeks, completing more than 300 tests. There was a significant improvement after 1 h of training in time, movement economy, and total score, and the results in the two simulators were highly correlated. Our results on the use of surgical simulators as a pedagogical tool in medical student training are encouraging: learning curves are rapid, and we suggest introducing endoscopic simulator training in undergraduate medical education during the course in surgery, when motivation is high and before the development of "negative stereotypes" and incorrect practices.

  4. Visual-haptic integration with pliers and tongs: signal “weights” take account of changes in haptic sensitivity caused by different tools

    PubMed Central

    Takahashi, Chie; Watt, Simon J.

    2014-01-01

    When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the “weight” given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different “gains” between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modeled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. 
These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimizing the design of visual-haptic devices. PMID:24592245
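    The statistically optimal cue combination described in this abstract has a compact textbook form: each cue is weighted by its relative reliability (inverse variance), and the combined estimate is more precise than either cue alone. The sketch below restates that standard model; the function and the numbers are illustrative, not the authors' experimental code.

```python
# Reliability-weighted (maximum-likelihood) combination of a visual and a
# haptic size estimate. Reliability is the inverse of each cue's variance;
# the weight given to each cue is its share of the total reliability.

def combine_cues(visual_size, visual_var, haptic_size, haptic_var):
    r_v = 1.0 / visual_var            # visual reliability
    r_h = 1.0 / haptic_var            # haptic reliability
    w_v = r_v / (r_v + r_h)           # weight given to vision
    w_h = 1.0 - w_v                   # weight given to haptics
    estimate = w_v * visual_size + w_h * haptic_size
    combined_var = 1.0 / (r_v + r_h)  # never larger than either single-cue variance
    return estimate, combined_var

# Equal reliabilities -> a simple average of the two size estimates:
est, var = combine_cues(10.0, 1.0, 12.0, 1.0)   # est = 11.0, var = 0.5
```

    A tool gain that degrades haptic sensitivity raises `haptic_var`, which shifts the weight toward vision, mirroring the behavioral pattern the study reports.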

  5. Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.

    PubMed

    Kling-Petersen, T; Rydmark, M

    2000-01-01

    The Computer Laboratory of the medical faculty in Goteborg (Mednet) has since the end of 1998 been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at SIGGRAPH in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools, such as a ball, square block, or scraper, via a SensAble Technologies PHANToM haptic arm. The model deforms in 3D as a result of touching the "clay" with the selected tool, and the amount of deformation is linear in the force applied. With instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bump maps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch; FreeForm's tools, however, enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices and deep grooves. Using FreeForm's different tools, these defects are easily corrected and the gaps filled. Similarly, traditional 3D reconstruction (based on serial sections, etc.) often shows artifacts resulting from the triangulation and/or tessellation processes; these usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that lets the user select an area to be modified and apply any amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: the modification and creation of non-geometric 3D objects.

  6. Graphic and haptic simulation system for virtual laparoscopic rectum surgery.

    PubMed

    Pan, Jun J; Chang, Jian; Yang, Xiaosong; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas

    2011-09-01

    Medical simulators with vision and haptic feedback techniques offer a cost-effective and efficient alternative to traditional medical training. They have been used to train doctors in many specialties of medicine, allowing tasks to be practised in a safe and repetitive manner. This paper describes a virtual-reality (VR) system which will help to influence surgeons' learning curves in the technically challenging field of laparoscopic surgery of the rectum. Data from MRI of the rectum and videos of real operations are used to construct the virtual models. A haptic force filter based on radial basis functions is designed to offer realistic and smooth force feedback. To handle collision detection efficiently, a hybrid model is presented to compute the deformation of the intestines. Finally, a real-time mesh-based cutting technique is employed to represent the incision operation. Despite numerous research efforts, fast and realistic simulation of soft tissues with large deformation, such as the intestines, remains extremely challenging. This paper introduces our latest contribution to this endeavour. With this system, the user can operate on the virtual rectum with haptic feedback while simultaneously watching the soft tissue deform. Our system has been tested by colorectal surgeons who believe that the simulated tactile and visual feedback is realistic. It could replace the traditional training process and effectively transfer surgical skills to novices. Copyright © 2011 John Wiley & Sons, Ltd.
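    The record gives no implementation details for its radial-basis-function force filter. A minimal sketch of the idea, smoothing noisy contact-force samples with Gaussian RBFs fitted by least squares (the basis count, width, and data are assumptions, not the authors' values):

```python
import numpy as np

def rbf_smooth_force(t_samples, f_samples, t_query, centers=10, width=0.08):
    """Fit noisy force samples with Gaussian RBFs and evaluate a smooth force.

    t_samples, f_samples: measured time stamps and force values (1-D arrays).
    t_query: times at which the smoothed force feedback is needed.
    """
    c = np.linspace(t_samples.min(), t_samples.max(), centers)  # RBF centers
    def phi(t):  # design matrix of Gaussian basis functions
        return np.exp(-((t[:, None] - c[None, :]) ** 2) / (2 * width ** 2))
    # Least-squares fit of RBF weights to the noisy samples
    w, *_ = np.linalg.lstsq(phi(t_samples), f_samples, rcond=None)
    return phi(np.asarray(t_query)) @ w

# Example: a noisy, nominally constant 1 N contact force
t = np.linspace(0.0, 1.0, 200)
rng = np.random.default_rng(0)
noisy = 1.0 + 0.05 * rng.standard_normal(t.size)
smooth = rbf_smooth_force(t, noisy, t)
```

    Because only a few smooth basis functions carry the fit, high-frequency sensor noise is filtered out while the overall force profile is preserved.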

  7. Face and construct validity of a computer-based virtual reality simulator for ERCP.

    PubMed

    Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V

    2010-02-01

    Currently, little evidence supports computer-based simulation for ERCP training. To determine face and construct validity of a computer-based simulator for ERCP and assess its perceived utility as a training tool. Novice and expert endoscopists completed 2 simulated ERCP cases by using the GI Mentor II. Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Outcomes included times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices and experts based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) claimed that the simulator has definite training potential or should be required for training. Small sample size, single institution. The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibit face validity. Subjects deemed it a useful training tool. Study repetition involving more participants and cases may help confirm results and establish the simulator's ability to differentiate skill levels based on ERCP-specific metrics.

  8. Haptic fMRI: using classification to quantify task-correlated noise during goal-directed reaching motions.

    PubMed

    Menon, Samir; Quigley, Paul; Yu, Michelle; Khatib, Oussama

    2014-01-01

    Neuroimaging artifacts in haptic functional magnetic resonance imaging (Haptic fMRI) experiments have the potential to induce spurious fMRI activation where there is none, or to make neural activation measurements appear correlated across brain regions when they are actually not. Here, we demonstrate that performing three-dimensional goal-directed reaching motions while operating the Haptic fMRI Interface (HFI) does not create confounding motion artifacts. To test for artifacts, we simultaneously scanned a subject's brain with a customized soft phantom placed a few centimeters away from the subject's left motor cortex. The phantom captured task-related motion and haptic noise, but contained no associated neural activation. We quantified the task-related information present in fMRI measurements taken from the brain and the phantom by using a linear max-margin classifier to predict whether raw time series data could differentiate between motion planning and reaching. fMRI measurements in the phantom were uninformative (2σ, 45-73%; chance = 50%), while those in primary motor, visual, and somatosensory cortex accurately classified task conditions (2σ, 90-96%). We also localized artifacts due to the haptic interface alone by scanning a stand-alone fBIRN phantom while an operator performed haptic tasks outside the scanner's bore with the interface at the same location. The stand-alone phantom had lower temporal noise and similar mean classification accuracy, but a tighter distribution (bootstrap Gaussian fit), than the brain phantom. Our results suggest that any fMRI measurement artifacts in Haptic fMRI reaching experiments are dominated by actual neural responses.
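    The classification analysis can be illustrated in miniature. Below, a plain perceptron stands in for the paper's linear max-margin classifier, and synthetic Gaussian windows stand in for raw fMRI time series: "brain" windows carry a task-correlated mean shift while "phantom" windows are pure noise, reproducing the qualitative result (informative brain, uninformative phantom). All data and parameters are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_windows(informative, n=80, dim=20):
    """Synthetic stand-ins for raw BOLD time-series windows."""
    labels = rng.integers(0, 2, n)              # 0 = planning, 1 = reaching
    X = rng.standard_normal((n, dim))
    if informative:
        X += labels[:, None] * 1.5              # task-correlated signal shift
    return X, labels

def train_linear(X, y, epochs=50, lr=0.1):
    """Plain perceptron as a stand-in for a max-margin linear classifier."""
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, 2 * y - 1):        # labels mapped to {-1, +1}
            if yi * (xi @ w + b) <= 0:          # misclassified: update
                w += lr * yi * xi
                b += lr * yi
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0).astype(int) == y)

Xb_tr, yb_tr = make_windows(True);  Xb_te, yb_te = make_windows(True)
Xp_tr, yp_tr = make_windows(False); Xp_te, yp_te = make_windows(False)
wb, bb = train_linear(Xb_tr, yb_tr)
wp, bp = train_linear(Xp_tr, yp_tr)
acc_brain = accuracy(wb, bb, Xb_te, yb_te)      # near-perfect
acc_phantom = accuracy(wp, bp, Xp_te, yp_te)    # near chance (50%)
```

    Held-out accuracy near chance for the phantom is exactly the negative control the paper relies on: task information that survives in the phantom would indicate a motion artifact rather than neural signal.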

  9. A haptics-assisted cranio-maxillofacial surgery planning system for restoring skeletal anatomy in complex trauma cases.

    PubMed

    Olsson, Pontus; Nysjö, Fredrik; Hirsch, Jan-Michaél; Carlbom, Ingrid B

    2013-11-01

    Cranio-maxillofacial (CMF) surgery to restore normal skeletal anatomy in patients with serious trauma to the face can be both complex and time-consuming. However, it is generally accepted that careful preoperative planning leads to a better outcome, with a higher degree of function and reduced morbidity, in addition to reduced time in the operating room. Today's surgery planning systems remain primitive, relying mostly on the user's ability to plan complex tasks with a two-dimensional graphical interface. We present a system for planning the restoration of skeletal anatomy in facial trauma patients using a virtual model derived from patient-specific CT data. The system combines stereo visualization with six-degree-of-freedom, high-fidelity haptic feedback that enables analysis, planning, and preoperative testing of alternative solutions for restoring bone fragments to their proper positions. The stereo display provides accurate visual spatial perception, and the haptics system provides intuitive haptic feedback when bone fragments are in contact, as well as six-degree-of-freedom attraction forces for precise bone fragment alignment. A senior surgeon without prior experience of the system received 45 min of training. Following the training session, he completed a virtual reconstruction of a complex mandibular fracture in 22 min, with an adequately reduced result. Preliminary testing with one surgeon indicates that our surgery planning system, which combines stereo visualization with sophisticated haptics, has the potential to become a powerful tool for CMF surgery planning. With little training, it allows a surgeon to complete a complex plan in a short amount of time.

  10. H-Man: a planar, H-shape cabled differential robotic manipulandum for experiments on human motor control.

    PubMed

    Campolo, Domenico; Tommasino, Paolo; Gamage, Kumudu; Klein, Julius; Hughes, Charmayne M L; Masia, Lorenzo

    2014-09-30

    In recent decades, robotic manipulanda have increasingly been employed to investigate the effect of haptic environments on motor learning and rehabilitation. However, implementing complex haptic renderings can be challenging from technological and control perspectives. We propose a novel robot (H-Man) characterized by a mechanical design based on a cabled differential transmission, providing advantages over current robotic technology. The H-Man transmission translates to extremely simplified kinematics and homogeneous dynamic properties, offering the possibility to generate haptic channels by passively blocking the mechanics and eliminating stability concerns. We report the results of experiments characterizing the performance of the device (haptic bandwidth, Z-width, and perceived impedance). We also present the results of a study investigating the influence of haptic channel compliance on motor learning in healthy individuals, which highlights the effects of channel compliance in enhancing proprioceptive information. Generating haptic channels to study motor redundancy is not easy for current robots because of the need for powerful actuation and complex real-time control implementation. The mechanical design of H-Man affords the possibility to promptly create haptic channels by mechanical stoppers (on one of the motors) without compromising its superior backdrivability and high isotropic manipulability. This paper presents a novel robotic device for motor control studies and robotic rehabilitation. The hardware was designed with specific emphasis on the mechanics, resulting in a system that is easy to control, homogeneous, and intrinsically safe to use. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Effects of kinesthetic haptic feedback on standing stability of young healthy subjects and stroke patients.

    PubMed

    Afzal, Muhammad Raheel; Byun, Ha-Young; Oh, Min-Kyun; Yoon, Jungwon

    2015-03-13

    Haptic control is a useful therapeutic option in rehabilitation featuring virtual reality interaction. As with visual and vibrotactile biofeedback, kinesthetic haptic feedback may assist in postural control and help achieve balance stability. Kinesthetic haptic feedback based on body sway can be delivered via a commercially available haptic device and can enhance the balance stability of both young healthy subjects and stroke patients. Our system features a waist-attached smartphone, software running on a computer (PC), and a dedicated Phantom Omni® device. Young healthy participants performed balance tasks in each of four distinct postures for 30 s (one foot on the ground; the Tandem Romberg stance; one foot on foam; and the Tandem Romberg stance on foam) with eyes closed. Stroke patients kept their eyes open and were tested only in the Romberg stance, during a balance task 25 s in duration. An Android application running continuously on the smartphone sent mediolateral (ML) and anteroposterior (AP) tilt angles to the PC, which generated kinesthetic haptic feedback via the Phantom Omni®. A total of 16 subjects participated in the study: 8 young healthy subjects and 8 who had suffered stroke. Post-experiment data analysis was performed using MATLAB®. Mean Velocity Displacement (MVD), Planar Deviation (PD), Mediolateral Trajectory (MLT), and Anteroposterior Trajectory (APT) parameters were analyzed to measure the reduction in body sway. Our kinesthetic haptic feedback system effectively reduced postural sway in young healthy subjects regardless of posture and substrate condition, and improved MVD and PD in stroke patients in the Romberg stance. Analysis of Variance (ANOVA) revealed that kinesthetic haptic feedback significantly reduced body sway in both categories of subjects. Kinesthetic haptic feedback can be implemented using a commercial haptic device and a smartphone. Intuitive balance cues were created using the handle of the haptic device, rendering the approach simple yet efficient in practice. This novel form of biofeedback should be a useful rehabilitation tool for improving the balance of stroke patients.
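    The feedback law itself is not specified in the record. A plausible minimal sketch maps the smartphone's ML/AP tilt angles to a corrective force at the haptic handle; the gain and saturation values are chosen purely for illustration, not taken from the study:

```python
import math

def balance_feedback_force(ml_deg, ap_deg, gain=0.5, f_max=2.0):
    """Map body-sway tilt angles (degrees) to a corrective handle force (N).

    Proportional mapping with saturation: the handle pushes opposite to the
    sway direction, clamped to a safe maximum force magnitude.
    """
    fx = -gain * ml_deg           # oppose mediolateral sway
    fy = -gain * ap_deg           # oppose anteroposterior sway
    mag = math.hypot(fx, fy)
    if mag > f_max:               # clamp to the device's safe force limit
        fx, fy = fx * f_max / mag, fy * f_max / mag
    return fx, fy

fx, fy = balance_feedback_force(2.0, -1.0)   # small sway: unclamped response
```

    Saturation matters in practice: commercial devices like the Phantom Omni® have limited continuous force output, and an unbounded proportional cue could itself destabilize the subject.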

  12. Understanding Graphics on a Scalable Latching Assistive Haptic Display Using a Shape Memory Polymer Membrane.

    PubMed

    Besse, Nadine; Rosset, Samuel; Zarate, Juan Jose; Ferrari, Elisabetta; Brayda, Luca; Shea, Herbert

    2018-01-01

    We present a fully latching and scalable 4 × 4 haptic display with 4 mm pitch, 5 s refresh time, 400 mN holding force, and 650 μm displacement per taxel. The display serves to convey dynamic graphical information to blind and visually impaired users. Combining significant holding force with high taxel density and large-amplitude motion in a very compact overall form factor was made possible by exploiting the reversible, fast, hundred-fold change in the stiffness of a thin shape memory polymer (SMP) membrane when heated above its glass transition temperature. Local heating is produced using an addressable array of 3 mm diameter stretchable microheaters patterned on the SMP. Each taxel is selectively and independently actuated by synchronizing the local Joule heating with a single pressure supply. Switching off the heating locks each taxel into its position (up or down), enabling any array configuration to be held with zero power consumption. A 3D-printed pin array is mounted over the SMP membrane, providing the user with a smooth, room-temperature array of movable pins to explore by touch. Perception tests carried out with 24 blind users resulted in 70 percent correct pattern recognition over a 12-word tactile dictionary.
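    The addressing scheme, as described, can be sketched as a two-phase refresh: heat exactly the taxels that must move under the current pressure polarity, let the single pressure supply move all softened taxels together, then cool so the SMP stiffens and latches. The code below is a logical model of that scheme only, not the device firmware:

```python
def refresh_display(current, target):
    """Latch a new pattern onto the taxel array, one pressure polarity at a time.

    current, target: lists of 0 (down) / 1 (up), one entry per taxel.
    A taxel moves only while its heater softens the SMP membrane; taxels
    going up and taxels going down are addressed in separate phases because
    there is only one shared pressure supply.
    """
    state = list(current)
    for phase_value in (1, 0):                  # raise phase, then lower phase
        heaters = [i for i, (s, t) in enumerate(zip(state, target))
                   if t == phase_value and s != t]
        for i in heaters:                       # heated taxels follow pressure
            state[i] = phase_value
        # heaters off: SMP cools below Tg and latches; zero holding power
    return state

new = refresh_display([0] * 16, [1, 0] * 8)     # write a checkerboard pattern
```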

  13. Tele-rehabilitation using in-house wearable ankle rehabilitation robot.

    PubMed

    Jamwal, Prashant K; Hussain, Shahid; Mir-Nasiri, Nazim; Ghayesh, Mergen H; Xie, Sheng Q

    2018-01-01

    This article explores the wide-ranging potential of a wearable ankle robot for in-house rehabilitation. The presented robot was conceptualized following a brief analysis of existing technologies, systems, and solutions for in-house physical ankle rehabilitation. Configuration design analysis and component selection for the ankle robot are discussed as part of the conceptual design. The complexities of human-robot interaction are closely encountered while maneuvering a rehabilitation robot; we present a fuzzy logic-based controller to perform the required robot-assisted ankle rehabilitation treatment. Designs of visual haptic interfaces are also discussed, which make the treatment engaging and motivate the subject to exert more effort and regain lost functions rapidly. The complex nature of web-based communication between the user and remotely located physiotherapy staff is also discussed. A high-level software architecture appended to the robot ensures user-friendly operation. This software is made up of three important components: a patient-related database, a graphical user interface (GUI), and a library of virtual reality exercises specifically developed for ankle rehabilitation.

  14. Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.

    PubMed

    Li, Min; Sareh, Sina; Xu, Guanghua; Ridzuan, Maisarah Binti; Luo, Shan; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar

    2016-01-01

    This paper proposes a pseudo-haptic feedback method conveying simulated soft-surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft surface deformation and control of the indenter avatar speed, to convey stiffness information about the simulated surface of a soft object in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended to create a multi-point pseudo-haptic feedback system. In hard-nodule detection experiments, performance was compared against a tablet computer incorporating vibration feedback, using (a) nodule detection sensitivity and (b) elapsed time as performance indicators. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction, and multi-point pseudo-haptic feedback performs similarly well to the vibration-based feedback method on both performance measures. This shows that the proposed method can be used to convey detailed haptic information, even subtle cues, in virtual environment tasks using either a computer mouse or a pressure-sensitive device as the input device. This pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as occur, for example, in ever more realistic video games with increasing emphasis on interaction with the physical environment, and in minimally invasive surgery in the form of soft-tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgical training or video games, on either desktop computers or portable devices, showing reasonably high fidelity in conveying stiffness perception to the user.
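    One common way to realize pseudo-haptic stiffness, and plausibly the "avatar speed control" described above, is to modulate the control/display (C/D) ratio so the on-screen indenter moves less per unit input on stiffer surfaces. The mapping below is an illustrative assumption, not the authors' formula:

```python
def avatar_displacement(input_disp_mm, stiffness, k_ref=1.0):
    """Pseudo-haptic C/D ratio: slow the indenter avatar on stiffer surfaces.

    input_disp_mm: displacement commanded by the mouse or stylus.
    stiffness: simulated surface stiffness (same units as k_ref).
    The stiffer the surface, the smaller the on-screen motion, which users
    perceive as resistance without any force actuator.
    """
    cd_ratio = k_ref / (k_ref + stiffness)      # in (0, 1]
    return input_disp_mm * cd_ratio

soft = avatar_displacement(10.0, stiffness=0.5)   # avatar moves farther
hard = avatar_displacement(10.0, stiffness=4.0)   # avatar moves less: feels stiffer
```

    Over a hard nodule embedded in soft tissue, the local stiffness (and hence the C/D drop) is larger, which is what lets users detect nodules visually.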

  15. SURGICAL OUTCOME OF SIMULTANEOUS INTRAOCULAR LENS RESCUE AND SUTURELESS INTRASCLERAL TUNNEL FIXATION OF DISLOCATED INTRAOCULAR LENSES.

    PubMed

    Kim, Min; Lee, Dong H; Koh, Hyoung J; Lee, Sung C; Kim, Sung S

    2015-07-01

    To report short-term surgical outcomes of single-stage simultaneous rescue and sutureless intrascleral fixation of dislocated intraocular lens (IOLs). Sixteen eyes of 16 patients who underwent simultaneous rescue and intrascleral fixation of dislocated 3-piece IOLs were retrospectively evaluated. Partial thickness limbal-based scleral flaps (2.0 × 2.0 mm) were created, and a 22-gauge round needle was used to create a sclerotomy at 1.5 mm from the limbus under the previously created scleral flap, and a 23-gauge trans pars plana vitrectomy was performed. Bimanual maneuvers using two 23-gauge end-grasping forceps under chandelier illumination and a wide-angle viewing system enabled 1 step rescue of IOLs from the posterior vitreous cavity with 1 hand and simultaneous haptic externalization through sclerotomy with the other hand. An externalized haptic was placed into the 3-mm intrascleral tunnel created using a bent 26-gauge needle. Fibrin glue was used to fixate haptics and close the scleral flaps. Intraocular lenses were successfully rescued and sclera-fixated through intrascleral tunnels in all 16 eyes (mean age, 56.56 ± 19.89 years). The mean preoperative logarithm of the minimum angle of resolution best-corrected visual acuity was 0.92 ± 0.68, and this significantly improved at 6 months to 0.289 ± 0.36 (P = 0.003). During the follow-up period (10.1 ± 3.21 months), no significant change of endothelial cell count or central foveal thickness was noted postoperatively (P = 0.203 and P = 0.979, respectively). There were no significant postoperative complications such as IOL dislocation, IOL decentration, retinal detachment, endophthalmitis, or postoperative hypotony. Simultaneous rescue and sutureless intrascleral haptic fixation of dislocated 3-piece IOLs using bimanual maneuvers is an effective, safe, and minimally invasive surgical method to rescue and fixate the dislocated IOL without further explant.

  16. SeaTouch: A Haptic and Auditory Maritime Environment for Non Visual Cognitive Mapping of Blind Sailors

    NASA Astrophysics Data System (ADS)

    Simonnet, Mathieu; Jacobson, Dan; Vieilledent, Stephane; Tisseau, Jacques

    Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have provided researchers in the spatial community with tools to investigate the learning of space. The issue of transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating the transfer of spatial knowledge to external surroundings, as is the effect of the different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks we measured systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through it without vision. Subjects appeared to take advantage of getting lost in an egocentric "haptic" view in the virtual environment to improve performance in the real environment.

  17. Importance of Matching Physical Friction, Hardness, and Texture in Creating Realistic Haptic Virtual Surfaces.

    PubMed

    Culbertson, Heather; Kuchenbecker, Katherine J

    2017-01-01

    Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.
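    The three-component model can be sketched per haptic timestep: a penalty-based normal force, a decaying tapping transient at impact, a velocity-scaled texture vibration, and Coulomb friction on the tangential axis. All constants below are illustrative placeholders, not values from the recorded surface data:

```python
import math

def render_surface_force(pen_depth_mm, v_tan, t, just_contacted,
                         k=0.5, mu=0.3, tex_amp=0.05, tex_hz=120.0,
                         tap_amp=0.8, tap_decay=60.0):
    """One timestep of a data-driven-style surface rendering (illustrative).

    Returns (normal_force, tangential_force). The paper's three model
    components appear as: friction (mu), a decaying tapping transient
    (tap_amp, tap_decay), and a speed-scaled texture vibration
    (tex_amp, tex_hz).
    """
    if pen_depth_mm <= 0:
        return 0.0, 0.0                               # no contact, no force
    fn = k * pen_depth_mm                             # penalty-based normal force
    if just_contacted:                                # tapping transient at impact
        fn += tap_amp * math.exp(-tap_decay * t)
    fn += tex_amp * math.sin(2 * math.pi * tex_hz * t) * abs(v_tan)  # texture
    sign = 1 if v_tan > 0 else -1 if v_tan < 0 else 0
    ft = -mu * fn * sign                              # Coulomb friction opposes motion
    return fn, ft
```

    In the actual system the texture and tapping components are played through a dedicated voice-coil actuator (the Haptuator) rather than the Omni's motors, precisely because the study found the device's own dynamics corrupt high-frequency cues.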

  18. Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.

    PubMed

    Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J

    2011-11-01

    To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Haptic force-feedback devices for the office computer: performance and musculoskeletal loading issues.

    PubMed

    Dennerlein, J T; Yang, M C

    2001-01-01

    Pointing devices, essential input tools for the graphical user interface (GUI) of desktop computers, require precise motor control and dexterity to use. Haptic force-feedback devices provide the human operator with tactile cues, adding the sense of touch to existing visual and auditory interfaces. However, the performance enhancements, comfort, and possible musculoskeletal loading of using a force-feedback device in an office environment are unknown. Hypothesizing that the time to perform a task and the self-reported pain and discomfort of the task improve with the addition of force feedback, 26 people ranging in age from 22 to 44 years performed a point-and-click task 540 times with and without an attractive force field surrounding the desired target. The point-and-click movements were approximately 25% faster with the addition of force feedback (paired t-tests, p < 0.001). Perceived user discomfort and pain, as measured through a questionnaire, were also smaller with the addition of force feedback (p < 0.001). However, this difference decreased as additional distracting force fields were added to the task environment, simulating a more realistic work situation. These results suggest that for a given task, use of a force-feedback device improves performance, and potentially reduces musculoskeletal loading during mouse use. Actual or potential applications of this research include human-computer interface design, specifically that of the pointing device extensively used for the graphical user interface.

  20. Design of high-fidelity haptic display for one-dimensional force reflection applications

    NASA Astrophysics Data System (ADS)

    Gillespie, Brent; Rosenberg, Louis B.

    1995-12-01

    This paper discusses the development of a virtual reality platform for the simulation of medical procedures which involve needle insertion into human tissue. The paper's focus is the hardware and software requirements for haptic display of a particular medical procedure known as epidural analgesia. To perform this delicate manual procedure, an anesthesiologist must carefully guide a needle through various layers of tissue using only haptic cues for guidance. As a simplifying aspect for the simulator design, all motions and forces involved in the task occur along a fixed line once insertion begins. To create a haptic representation of this procedure, we have explored both physical modeling and perceptual modeling techniques. A preliminary physical model was built based on CT-scan data of the operative site. A preliminary perceptual model was built based on current training techniques for the procedure provided by a skilled instructor. We compare and contrast these two modeling methods and discuss the implications of each. We select and defend the perceptual model as a superior approach for the epidural analgesia simulator.
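    A physical model of the kind described reduces, along the fixed insertion line, to a force-versus-depth profile over tissue layers, with the hallmark "loss of resistance" on entering the epidural space. The layer boundaries and force levels below are invented for illustration, not taken from the paper's CT-based model:

```python
# Hypothetical layer boundaries (mm) and resistance force levels (N).
LAYERS = [
    ("skin",                0.0,  4.0, 1.2),
    ("subcutaneous fat",    4.0, 12.0, 0.4),
    ("supraspinous lig.",  12.0, 17.0, 2.0),
    ("interspinous lig.",  17.0, 30.0, 1.5),
    ("ligamentum flavum",  30.0, 35.0, 3.5),
    ("epidural space",     35.0, 40.0, 0.1),  # sudden loss of resistance
]

def needle_force(depth_mm):
    """Piecewise-constant resistance force along the 1-D insertion axis."""
    for _name, lo, hi, force in LAYERS:
        if lo <= depth_mm < hi:
            return force
    return 0.0
```

    A perceptual model, by contrast, would tune these levels to match what instructors say the layers should feel like, rather than to measured tissue properties; the lookup structure stays the same.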

  1. Virtual reality haptic dissection.

    PubMed

    Erolin, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-12-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist, and investigate cross-discipline collaborations in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills, before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  2. Development of a virtual reality haptic Veress needle insertion simulator for surgical skills training.

    PubMed

    Okrainec, A; Farcas, M; Henao, O; Choy, I; Green, J; Fotoohi, M; Leslie, R; Wight, D; Karam, P; Gonzalez, N; Apkarian, J

    2009-01-01

    The Veress needle is the most commonly used technique for creating the pneumoperitoneum at the start of a laparoscopic surgical procedure. Inserting the Veress needle correctly is crucial since errors can cause significant harm to patients. Unfortunately, this technique can be difficult to teach since surgeons rely heavily on tactile feedback while advancing the needle through the various layers of the abdominal wall. This critical step in laparoscopy, therefore, can be challenging for novice trainees to learn without adequate opportunities to practice in a safe environment with no risk of injury to patients. To address this issue, we have successfully developed a prototype of a virtual reality haptic needle insertion simulator using the tactile feedback of 22 surgeons to set realistic haptic parameters. A survey of these surgeons concluded that our device appeared and felt realistic, and could potentially be a useful tool for teaching the proper technique of Veress needle insertion.

  3. Virtual reality simulation in neurosurgery: technologies and evolution.

    PubMed

    Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H

    2013-01-01

    Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.

  4. Virtual reality haptic human dissection.

    PubMed

    Needham, Caroline; Wilkinson, Caroline; Soames, Roger

    2011-01-01

    This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist and investigate the cross-discipline collaborations required in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively as well as qualitatively.

  5. Stereoscopic visualization and haptic technology used to create a virtual environment for remote surgery - biomed 2011.

    PubMed

    Bornhoft, J M; Strabala, K W; Wortman, T D; Lehman, A C; Oleynikov, D; Farritor, S M

    2011-01-01

    The objective of this research is to study the effectiveness of using a stereoscopic visualization system for performing remote surgery. The use of stereoscopic vision has become common with the advent of the da Vinci® system (Intuitive, Sunnyvale CA). This system creates a virtual environment that consists of a 3-D display for visual feedback and haptic tactile feedback, together providing an intuitive environment for remote surgical applications. This study will use simple in vivo robotic surgical devices and compare the performance of surgeons using the stereoscopic interfacing system to the performance of surgeons using one dimensional monitors. The stereoscopic viewing system consists of two cameras, two monitors, and four mirrors. The cameras are mounted to a multi-functional miniature in vivo robot; and mimic the depth perception of the actual human eyes. This is done by placing the cameras at a calculated angle and distance apart. Live video streams from the left and right cameras are displayed on the left and right monitors, respectively. A system of angled mirrors allows the left and right eyes to see the video stream from the left and right monitor, respectively, creating the illusion of depth. The haptic interface consists of two PHANTOM Omni® (SensAble, Woburn Ma) controllers. These controllers measure the position and orientation of a pen-like end effector with three degrees of freedom. As the surgeon uses this interface, they see a 3-D image and feel force feedback for collision and workspace limits. The stereoscopic viewing system has been used in several surgical training tests and shows a potential improvement in depth perception and 3-D vision. The haptic system accurately gives force feedback that aids in surgery. Both have been used in non-survival animal surgeries, and have successfully been used in suturing and gallbladder removal. Bench top experiments using the interfacing system have also been conducted. 
A group of participants completed two different surgical training tasks using both a two-dimensional visual system and the stereoscopic visual system. Results suggest that the stereoscopic visual system decreased the time needed to complete the tasks. All participants also reported that the stereoscopic system was easier to use than the two-dimensional system. Haptic controllers combined with stereoscopic vision provide a more intuitive virtual environment. This system gives the surgeon 3-D vision, depth perception, and the ability to receive feedback through forces applied by the haptic controller while performing surgery. These capabilities potentially enable more complex surgeries to be performed with a higher level of precision.
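The camera geometry described in this record (two cameras placed at a calculated angle and distance apart to mimic human depth perception) can be sketched with half-baseline trigonometry. This is a minimal illustration, not the authors' method; the 65 mm baseline and 150 mm working distance are assumed example values.

```python
import math

def toe_in_angle_deg(baseline_mm: float, working_distance_mm: float) -> float:
    """Angle each camera must be rotated inward so that both optical
    axes converge at the working distance (half-baseline trigonometry)."""
    return math.degrees(math.atan2(baseline_mm / 2.0, working_distance_mm))

# Human-like geometry: ~65 mm interpupillary baseline, target ~150 mm away.
angle = toe_in_angle_deg(65.0, 150.0)  # roughly 12 degrees per camera
```

Pointing each camera inward by this angle makes the two video streams converge on the target, which is what produces the illusion of depth once each eye sees only its own monitor.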

  6. Lack of transfer of skills after virtual reality simulator training with haptic feedback.

    PubMed

    Våpenstad, Cecilie; Hofstad, Erlend Fagertun; Bø, Lars Eirik; Kuhry, Esther; Johnsen, Gjermund; Mårvik, Ronald; Langø, Thomas; Hernes, Toril Nagelhus

    2017-12-01

    Virtual reality (VR) simulators enrich surgical training and offer training possibilities outside of the operating room (OR). In this study, we created a criterion-based training program on a VR simulator with haptic feedback and tested it by comparing the performances of a simulator group against a control group. Medical students with no experience in laparoscopy were randomly assigned to a simulator group or a control group. In the simulator group, the candidates trained until they reached predefined criteria on the LapSim ® VR simulator (Surgical Science AB, Göteborg, Sweden) with haptic feedback (Xitact TM IHP, Mentice AB, Göteborg, Sweden). All candidates performed a cholecystectomy on a porcine organ model in a box trainer (the clinical setting). The performances were video rated by two surgeons blinded to subject training status. In total, 30 students performed the cholecystectomy and had their videos rated (N = 16 simulator group, N = 14 control group). The control group achieved better video rating scores than the simulator group (p < .05). The criterion-based training program did not transfer skills to the clinical setting. Poor mechanical performance of the simulated haptic feedback is believed to have resulted in a negative training effect.

  7. Integration of soft tissue model and open haptic device for medical training simulator

    NASA Astrophysics Data System (ADS)

    Akasum, G. F.; Ramdhania, L. N.; Suprijanto; Widyotriatmo, A.

    2016-03-01

Minimally invasive surgery (MIS) is now widely used for many surgical procedures, and has been applied to a number of cases in Indonesia. Needle insertion is a simple MIS procedure that serves several purposes. Before the needle insertion technique is used in a real situation, it is essential to train medical students in this skill. This research developed an open platform needle insertion simulator with haptic feedback that gives medical students a realistic feel of the forces encountered during actual procedures. Building the training simulator involved three main steps: configuring the hardware system, developing a program to create the soft tissue model, and integrating the hardware and software. To evaluate its performance, the haptic simulator was tested by 24 volunteers on a soft tissue model scenario. Each volunteer inserted the needle on the simulator until reaching a target point, guided by visual feedback on the monitor. From the results it can be concluded that the soft tissue model conveys the sensation of touch through the force feedback perceived on the haptic actuator, as shown by the different forces corresponding to the different stiffnesses of each layer.
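A per-layer stiffness model of the kind this record describes (different forces for different tissue stiffnesses) can be sketched as a piecewise spring sum. The layer names, thicknesses, and stiffness values below are hypothetical illustrations, not values from the paper, and friction and cutting forces are ignored.

```python
# Hypothetical layer table: (name, thickness in mm, stiffness in N/mm).
LAYERS = [("skin", 2.0, 0.9), ("fat", 8.0, 0.3), ("muscle", 15.0, 0.6)]

def insertion_force(depth_mm: float) -> float:
    """Sum spring-like resistance from each layer the needle tip has
    fully or partially penetrated (simplified: no friction, no cutting)."""
    force, top = 0.0, 0.0
    for _name, thickness, k in LAYERS:
        penetrated = max(0.0, min(depth_mm - top, thickness))
        force += k * penetrated
        top += thickness
    return force
```

At 5 mm depth the needle has passed 2 mm of "skin" and 3 mm of "fat", so the rendered force is 0.9·2 + 0.3·3 = 2.7 N; the stiffness change at each layer boundary is what the trainee feels through the haptic actuator.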

  8. Haptics in forensics: the possibilities and advantages in using the haptic device for reconstruction approaches in forensic science.

    PubMed

    Buck, Ursula; Naether, Silvio; Braun, Marcel; Thali, Michael

    2008-09-18

Non-invasive documentation methods such as surface scanning and radiological imaging are gaining in importance in the forensic field. These three-dimensional technologies provide digital 3D data, which are processed and handled in the computer. However, the sense of touch is lost in the virtual approach. A haptic device enables the sense of touch to be used to handle and feel digital 3D data. The multifunctional application of a haptic device for forensic approaches is evaluated and illustrated in three different cases: the non-invasive representation of bone fractures of the lower extremities caused by traffic accidents; the comparison of bone injuries with the presumed injury-inflicting instrument; and, in a gunshot case, the identification of the gun by its muzzle imprint and the reconstruction of the holding position of the gun. The 3D models of the bones are generated from Computed Tomography (CT) images. The 3D models of the exterior injuries, the injury-inflicting tools, and the bone injuries, where a higher resolution is necessary, are created by optical surface scanning. The haptic device is used in combination with the software FreeForm Modelling Plus to touch the surface of the 3D models, to feel minute injuries and the surfaces of tools, to reposition displaced bone parts, and to compare an injury-causing instrument with an injury. Repositioning 3D models in a reconstruction is easier, faster, and more precise when the sense of touch and user-friendly movement in 3D space are available. For representation purposes, the fracture lines of bones are coloured. This work demonstrates that the haptic device is a suitable and efficient tool in forensic science, offering a new way of handling digital data in virtual 3D space.

  9. Virtual reality cerebral aneurysm clipping simulation with real-time haptic feedback.

    PubMed

    Alaraj, Ali; Luciano, Cristian J; Bailey, Daniel P; Elsenousi, Abdussalam; Roitberg, Ben Z; Bernardo, Antonio; Banerjee, P Pat; Charbel, Fady T

    2015-03-01

    With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. To develop and evaluate the usefulness of a new haptic-based virtual reality simulator in the training of neurosurgical residents. A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the ImmersiveTouch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomographic angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-dimensional immersive virtual reality environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from 3 residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Residents thought that the simulation would be useful in preparing for real-life surgery. About two-thirds of the residents thought that the 3-dimensional immersive anatomic details provided a close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They thought the simulation was useful for preoperative surgical rehearsal and neurosurgical training. A third of the residents thought that the technology in its current form provided realistic haptic feedback for aneurysm surgery. Neurosurgical residents thought that the novel immersive VR simulator is helpful in their training, especially because they do not get a chance to perform aneurysm clippings until late in their residency programs.

  10. Soft Somatosensitive Actuators via Embedded 3D Printing.

    PubMed

    Truby, Ryan L; Wehner, Michael; Grosskopf, Abigail K; Vogt, Daniel M; Uzel, Sebastien G M; Wood, Robert J; Lewis, Jennifer A

    2018-04-01

    Humans possess manual dexterity, motor skills, and other physical abilities that rely on feedback provided by the somatosensory system. Herein, a method is reported for creating soft somatosensitive actuators (SSAs) via embedded 3D printing, which are innervated with multiple conductive features that simultaneously enable haptic, proprioceptive, and thermoceptive sensing. This novel manufacturing approach enables the seamless integration of multiple ionically conductive and fluidic features within elastomeric matrices to produce SSAs with the desired bioinspired sensing and actuation capabilities. Each printed sensor is composed of an ionically conductive gel that exhibits both long-term stability and hysteresis-free performance. As an exemplar, multiple SSAs are combined into a soft robotic gripper that provides proprioceptive and haptic feedback via embedded curvature, inflation, and contact sensors, including deep and fine touch contact sensors. The multimaterial manufacturing platform enables complex sensing motifs to be easily integrated into soft actuating systems, which is a necessary step toward closed-loop feedback control of soft robots, machines, and haptic devices. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. A user's guide for DTIZE an interactive digitizing and graphical editing computer program

    NASA Technical Reports Server (NTRS)

    Thomas, C. C.

    1981-01-01

A guide for DTIZE, a two-dimensional digitizing program with graphical editing capability, is presented. DTIZE provides the capability to simultaneously create and display a picture on the display screen. Data descriptions may be permanently saved in three different formats. DTIZE creates the picture graphics in locator mode, inputting one coordinate each time the terminator button is pushed. Graphic input (GIN) devices are also used to select commands from the function menu. These menu commands and the program's interactive prompting sequences provide a complete capability for creating, editing, and permanently recording a graphical picture file. DTIZE is written in FORTRAN IV for the Tektronix 4081 graphic system, utilizing the Plot 80 Distributed Graphics Library (DGL) subroutines. The Tektronix 4953/3954 Graphic Tablet, with mouse, pen, or joystick, is used as the graphics input device to create picture graphics.
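The locator-mode behaviour this record describes (one coordinate captured per terminator-button press) can be sketched as a filter over an input event stream. This is an illustrative model only; the event representation is an assumption, not DTIZE's actual FORTRAN interface.

```python
def digitize(samples):
    """Record one coordinate per terminator press, as in locator-mode
    digitizing: samples are (x, y, button_pressed) tuples, and only
    samples where the terminator button was pressed are kept."""
    return [(x, y) for x, y, pressed in samples if pressed]

# Cursor moves through three positions; the button is pressed at two of them.
points = digitize([(1, 2, True), (3, 4, False), (5, 6, True)])
```

The resulting point list is what would then be assembled into the picture file by the menu-driven editing commands.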

  12. Evaluation of stiffness feedback for hard nodule identification on a phantom silicone model

    PubMed Central

    Konstantinova, Jelizaveta; Xu, Guanghua; He, Bo; Aminzadeh, Vahid; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar

    2017-01-01

    Haptic information in robotic surgery can significantly improve clinical outcomes and help detect hard soft-tissue inclusions that indicate potential abnormalities. Visual representation of tissue stiffness information is a cost-effective technique. Meanwhile, direct force feedback, although considerably more expensive than visual representation, is an intuitive method of conveying information regarding tissue stiffness to surgeons. In this study, real-time visual stiffness feedback by sliding indentation palpation is proposed, validated, and compared with force feedback involving human subjects. In an experimental tele-manipulation environment, a dynamically updated color map depicting the stiffness of probed soft tissue is presented via a graphical interface. The force feedback is provided, aided by a master haptic device. The haptic device uses data acquired from an F/T sensor attached to the end-effector of a tele-manipulated robot. Hard nodule detection performance is evaluated for 2 modes (force feedback and visual stiffness feedback) of stiffness feedback on an artificial organ containing buried stiff nodules. From this artificial organ, a virtual-environment tissue model is generated based on sliding indentation measurements. Employing this virtual-environment tissue model, we compare the performance of human participants in distinguishing differently sized hard nodules by force feedback and visual stiffness feedback. Results indicate that the proposed distributed visual representation of tissue stiffness can be used effectively for hard nodule identification. The representation can also be used as a sufficient substitute for force feedback in tissue palpation. PMID:28248996
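The "dynamically updated color map depicting the stiffness of probed soft tissue" described in this record can be sketched as a linear mapping from a stiffness estimate to a color ramp. This is a hedged illustration: the blue-to-red ramp and the stiffness range are assumptions, not the authors' calibration.

```python
def stiffness_to_rgb(k, k_min=200.0, k_max=2000.0):
    """Linearly map a stiffness estimate k (N/m) onto a blue-to-red
    ramp: soft tissue renders blue, stiff inclusions render red.
    The range [k_min, k_max] is an assumed display calibration."""
    t = min(1.0, max(0.0, (k - k_min) / (k_max - k_min)))
    return (int(255 * t), 0, int(255 * (1 - t)))
```

Updating one such color per probed surface patch yields the kind of stiffness map a participant could scan visually for the red spots that indicate buried hard nodules.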

  13. Evaluation of stiffness feedback for hard nodule identification on a phantom silicone model.

    PubMed

    Li, Min; Konstantinova, Jelizaveta; Xu, Guanghua; He, Bo; Aminzadeh, Vahid; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar

    2017-01-01

    Haptic information in robotic surgery can significantly improve clinical outcomes and help detect hard soft-tissue inclusions that indicate potential abnormalities. Visual representation of tissue stiffness information is a cost-effective technique. Meanwhile, direct force feedback, although considerably more expensive than visual representation, is an intuitive method of conveying information regarding tissue stiffness to surgeons. In this study, real-time visual stiffness feedback by sliding indentation palpation is proposed, validated, and compared with force feedback involving human subjects. In an experimental tele-manipulation environment, a dynamically updated color map depicting the stiffness of probed soft tissue is presented via a graphical interface. The force feedback is provided, aided by a master haptic device. The haptic device uses data acquired from an F/T sensor attached to the end-effector of a tele-manipulated robot. Hard nodule detection performance is evaluated for 2 modes (force feedback and visual stiffness feedback) of stiffness feedback on an artificial organ containing buried stiff nodules. From this artificial organ, a virtual-environment tissue model is generated based on sliding indentation measurements. Employing this virtual-environment tissue model, we compare the performance of human participants in distinguishing differently sized hard nodules by force feedback and visual stiffness feedback. Results indicate that the proposed distributed visual representation of tissue stiffness can be used effectively for hard nodule identification. The representation can also be used as a sufficient substitute for force feedback in tissue palpation.

  14. Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback

    PubMed Central

    Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.

    2014-01-01

    Background With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200

  15. Functionalization of Tactile Sensation for Robot Based on Haptograph and Modal Decomposition

    NASA Astrophysics Data System (ADS)

    Yokokura, Yuki; Katsura, Seiichiro; Ohishi, Kiyoshi

In the real world, robots should be able to recognize the environment in order to be of help to humans. Video cameras and laser range finders can help robots recognize the environment, but these devices cannot obtain tactile information from it. Future human-assisting robots should be able to recognize haptic signals, and a disturbance observer can provide a robot with this ability. In this study, a disturbance observer is employed in a mobile robot to functionalize the tactile sensation. This paper proposes a method that uses the haptograph and modal decomposition for haptic recognition of road environments. The haptograph presents a graphical view of the tactile information, making it possible to classify road conditions intuitively. The robot controller is designed in the decoupled modal coordinate system, which consists of translational and rotational modes; modal decomposition is performed by using a quarry matrix. Once robots are able to recognize tactile sensations, their usefulness to humans will increase.
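The decoupling into translational and rotational modes mentioned in this record can be sketched for a two-wheel mobile robot as a 2x2 linear transform of the wheel coordinates. This is a generic illustration of modal decomposition, not the paper's specific quarry matrix.

```python
def modal_decompose(x_right: float, x_left: float):
    """Map right/left wheel coordinates to decoupled modes via the
    transform [[1/2, 1/2], [1/2, -1/2]]: the sum mode is translation,
    the difference mode is rotation."""
    translational = 0.5 * (x_right + x_left)
    rotational = 0.5 * (x_right - x_left)
    return translational, rotational
```

Equal wheel motion excites only the translational mode and opposite wheel motion only the rotational mode, so a controller (or a haptograph) can treat the two independently.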

  16. Reviewing the technological challenges associated with the development of a laparoscopic palpation device.

    PubMed

    Culmer, Peter; Barrie, Jenifer; Hewson, Rob; Levesley, Martin; Mon-Williams, Mark; Jayne, David; Neville, Anne

    2012-06-01

    Minimally invasive surgery (MIS) has heralded a revolution in surgical practice, with numerous advantages over open surgery. Nevertheless, it prevents the surgeon from directly touching and manipulating tissue and therefore severely restricts the use of valuable techniques such as palpation. Accordingly a key challenge in MIS is to restore haptic feedback to the surgeon. This paper reviews the state-of-the-art in laparoscopic palpation devices (LPDs) with particular focus on device mechanisms, sensors and data analysis. It concludes by examining the challenges that must be overcome to create effective LPD systems that measure and display haptic information to the surgeon for improved intraoperative assessment. Copyright © 2012 John Wiley & Sons, Ltd.

  17. Modeling and test of a kinaesthetic actuator based on MR fluid for haptic applications.

    PubMed

    Yang, Tae-Heon; Koo, Jeong-Hoi; Kim, Sang-Youn; Kwon, Dong-Soo

    2017-03-01

Haptic display units have been widely used for conveying button sensations to users, primarily employing vibrotactile actuators. However, the human feeling of pressing a button relies mainly on kinaesthetic sensations (rather than vibrotactile sensations), and few studies exist on small-scale kinaesthetic haptic units. Thus, the primary goals of this paper are to design a miniature kinaesthetic actuator based on magneto-rheological (MR) fluid that can convey various button-clicking sensations, and to experimentally evaluate its haptic performance. The design aims for the proposed actuator were to produce sufficiently large actuation (resistive) forces for human users within a given size constraint and to offer a wide range of actuation forces for conveying vivid haptic sensations. To this end, this study first performed a series of parametric studies using mathematical force models for multiple operating modes of MR fluid, in conjunction with finite element electromagnetism analysis. After selecting design parameters based on these studies, a prototype actuator was constructed and its performance evaluated using a dynamic mechanical analyzer, which measured the actuator's resistive force over a stroke (pressed depth) of up to 1 mm and input currents from 0 A to 200 mA. The results show that the proposed actuator creates a wide range of resistive forces, from around 2 N (off-state) to over 9.5 N at 200 mA. To assess the prototype's performance from a haptic-application perspective, a maximum force rate was calculated to determine the just noticeable difference in force changes over the 1 mm stroke of the actuator. The results show that the force rate is sufficient to mimic various levels of button sensation, indicating that the proposed kinaesthetic actuator can offer a wide range of resistive force changes that can be conveyed to human operators.
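One way to appreciate the 2 N to 9.5 N force range this record reports is to count how many just-noticeably-different force levels fit into it. The sketch below assumes a constant Weber fraction (0.1 is an assumed textbook-like value, not taken from the paper).

```python
import math

def distinguishable_levels(f_min: float, f_max: float, weber: float) -> int:
    """Count just-noticeably-different force levels between f_min and
    f_max when each step must exceed weber * (current force), i.e. the
    levels form a geometric sequence with ratio (1 + weber)."""
    return int(math.log(f_max / f_min) / math.log(1.0 + weber))

levels = distinguishable_levels(2.0, 9.5, 0.1)
```

Under that assumption the actuator's range spans on the order of sixteen perceptually distinct resistive-force levels, which is consistent with the claim that it can mimic various levels of button sensation.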

  18. GRAPHICS MANAGER (GFXMGR): An interactive graphics software program for the Advanced Electronics Design (AED) graphics controller, Model 767

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faculjak, D.A.

    1988-03-01

    Graphics Manager (GFXMGR) is menu-driven, user-friendly software designed to interactively create, edit, and delete graphics displays on the Advanced Electronics Design (AED) graphics controller, Model 767. The software runs on the VAX family of computers and has been used successfully in security applications to create and change site layouts (maps) of specific facilities. GFXMGR greatly benefits graphics development by minimizing display-development time, reducing tedium on the part of the user, and improving system performance. It is anticipated that GFXMGR can be used to create graphics displays for many types of applications. 8 figs., 2 tabs.

  19. Neodymium:YAG laser cutting of intraocular lens haptics in vitro and in vivo.

    PubMed

    Feder, J M; Rosenberg, M A; Farber, M D

    1989-09-01

    Various complications following intraocular lens (IOL) surgery result in explantation of the lenses. Haptic fibrosis may necessitate cutting the IOL haptics prior to removal. In this study we used the neodymium: YAG (Nd:YAG) laser to cut polypropylene and poly(methyl methacrylate) (PMMA) haptics in vitro and in rabbit eyes. In vitro we were able to cut 100% of both haptic types successfully (28 PMMA and 30 polypropylene haptics). In rabbit eyes we were able to cut 50% of the PMMA haptics and 43% of the polypropylene haptics. Poly(methyl methacrylate) haptics were easier to cut in vitro and in vivo than polypropylene haptics, requiring fewer shots for transection. Complications of Nd:YAG laser use frequently interfered with haptic transections in rabbit eyes. Haptic transection may be more easily accomplished in human eyes.

  20. Designing Media for Visually-Impaired Users of Refreshable Touch Displays: Possibilities and Pitfalls.

    PubMed

    O'Modhrain, Sile; Giudice, Nicholas A; Gardner, John A; Legge, Gordon E

    2015-01-01

    This paper discusses issues of importance to designers of media for visually impaired users. The paper considers the influence of human factors on the effectiveness of presentation as well as the strengths and weaknesses of tactile, vibrotactile, haptic, and multimodal methods of rendering maps, graphs, and models. The authors, all of whom are visually impaired researchers in this domain, present findings from their own work and work of many others who have contributed to the current understanding of how to prepare and render images for both hard-copy and technology-mediated presentation of Braille and tangible graphics.

  1. Graphic and haptic simulation for transvaginal cholecystectomy training in NOTES.

    PubMed

    Pan, Jun J; Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Li, Bai C; Sankaranarayanan, Ganesh; Roberts, Kurt; Schwaitzberg, Steven; De, Suvranu

    2016-04-01

Natural Orifice Transluminal Endoscopic Surgery (NOTES) is an emerging surgical technique that typically requires a long learning curve for surgeons. Virtual reality (VR) medical simulators with visual and haptic feedback can offer an efficient and cost-effective alternative to traditional training approaches, without risk to patients. With this motivation, we developed the first virtual reality simulator for transvaginal cholecystectomy in NOTES (VTEST™). This VR-based surgical simulator aims to simulate the hybrid NOTES cholecystectomy. We use a 6-DOF haptic device and a tracking sensor as the core hardware components of the simulator. For the software, an innovative approach based on inner spheres is presented to deform the organs in real time. To handle the frequent collisions between soft tissue and surgical instruments, an adaptive GPU-based collision detection method is designed and implemented. To give a realistic visual rendering of fat tissue removal from the gallbladder by cautery hook, a multi-layer hexahedral model is presented to simulate the electric dissection of fat tissue. Experimental results show that trainees can operate in real time with a high degree of stability and fidelity. A preliminary study was also performed to evaluate the realism and usefulness of this hybrid NOTES simulator. The prototype simulation system has been verified by surgeons through a pilot study, in which several aspects of its visual performance and utility were rated fairly high by the participants. It exhibits the potential to improve the surgical skills of trainees and effectively shorten their learning curve. Copyright © 2016 Elsevier Inc. All rights reserved.
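The collision test between a surgical instrument and a sphere-approximated organ that this record alludes to can be sketched with point-versus-sphere distance checks. This is a simplified CPU illustration of the general idea, not the paper's adaptive GPU method.

```python
def colliding(tool_tip, spheres):
    """Test a tool-tip point against inner spheres approximating the
    organ volume; each sphere is (cx, cy, cz, r). Squared distances
    avoid the square root in the inner loop."""
    tx, ty, tz = tool_tip
    for (cx, cy, cz, r) in spheres:
        if (tx - cx) ** 2 + (ty - cy) ** 2 + (tz - cz) ** 2 <= r * r:
            return True
    return False
```

In a real-time simulator this per-sphere test is the inner kernel; running it over many spheres in parallel is what makes a GPU formulation attractive at haptic update rates.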

  2. The visible ear simulator: a public PC application for GPU-accelerated haptic 3D simulation of ear surgery based on the visible ear data.

    PubMed

    Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter

    2009-06-01

Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and real-time volume rendered as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, in 2D or optional anaglyph stereoscopic 3D, was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomographic-/magnetic resonance imaging-based systems.
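The voxel densities quoted in this record (50, 125, and 8,000 voxels/mm3) are easier to compare as linear resolutions along one axis, which for isotropic voxels is just the cube root of the volumetric density:

```python
def linear_resolution(voxels_per_mm3: float) -> float:
    """Isotropic voxels per millimetre along each axis implied by a
    volumetric voxel density (cube root of voxels/mm^3)."""
    return voxels_per_mm3 ** (1.0 / 3.0)

# 125 voxels/mm^3 -> 5 voxels/mm, i.e. 0.2 mm voxel edge length;
# 8,000 voxels/mm^3 -> 20 voxels/mm, i.e. 0.05 mm edges.
```

So the simulator's approximately 125 voxels/mm3 corresponds to 0.2 mm voxels, versus roughly 0.27 mm for the 50 voxels/mm3 imaging-based models it is compared against.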

  3. Sensing and Force-Feedback Exoskeleton (SAFE) Robotic Glove.

    PubMed

    Ben-Tzvi, Pinhas; Ma, Zhou

    2015-11-01

    This paper presents the design, implementation and experimental validation of a novel robotic haptic exoskeleton device to measure the user's hand motion and assist hand motion while remaining portable and lightweight. The device consists of a five-finger mechanism actuated with miniature DC motors through antagonistically routed cables at each finger, which act as both active and passive force actuators. The SAFE Glove is a wireless and self-contained mechatronic system that mounts over the dorsum of a bare hand and provides haptic force feedback to each finger. The glove is adaptable to a wide variety of finger sizes without constraining the range of motion. This makes it possible to accurately and comfortably track the complex motion of the finger and thumb joints associated with common movements of hand functions, including grip and release patterns. The glove can be wirelessly linked to a computer for displaying and recording the hand status through 3D Graphical User Interface (GUI) in real-time. The experimental results demonstrate that the SAFE Glove is capable of reliably modeling hand kinematics, measuring finger motion and assisting hand grasping motion. Simulation and experimental results show the potential of the proposed system in rehabilitation therapy and virtual reality applications.

  4. Soft tissue deformation modelling through neural dynamics-based reaction-diffusion mechanics.

    PubMed

    Zhang, Jinao; Zhong, Yongmin; Gu, Chengfan

    2018-05-30

    Soft tissue deformation modelling forms the basis of development of surgical simulation, surgical planning and robotic-assisted minimally invasive surgery. This paper presents a new methodology for modelling of soft tissue deformation based on reaction-diffusion mechanics via neural dynamics. The potential energy stored in soft tissues due to a mechanical load to deform tissues away from their rest state is treated as the equivalent transmembrane potential energy, and it is distributed in the tissue masses in the manner of reaction-diffusion propagation of nonlinear electrical waves. The reaction-diffusion propagation of mechanical potential energy and nonrigid mechanics of motion are combined to model soft tissue deformation and its dynamics, both of which are further formulated as the dynamics of cellular neural networks to achieve real-time computational performance. The proposed methodology is implemented with a haptic device for interactive soft tissue deformation with force feedback. Experimental results demonstrate that the proposed methodology exhibits nonlinear force-displacement relationship for nonlinear soft tissue deformation. Homogeneous, anisotropic and heterogeneous soft tissue material properties can be modelled through the inherent physical properties of mass points. Graphical abstract Soft tissue deformation modelling with haptic feedback via neural dynamics-based reaction-diffusion mechanics.
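The reaction-diffusion propagation of mechanical potential energy through tissue masses that this record describes can be caricatured by an explicit diffusion step over a chain of mass points. This is a one-dimensional toy sketch of diffusion only (the reaction term and neural-dynamics formulation are omitted), with an assumed diffusion coefficient.

```python
def diffuse(potential, alpha=0.2):
    """One explicit diffusion step over a 1-D chain of mass points:
    each interior point relaxes toward the mean of its neighbours.
    alpha is an assumed diffusion coefficient; endpoints are fixed."""
    n = len(potential)
    out = potential[:]
    for i in range(1, n - 1):
        out[i] = potential[i] + alpha * (
            potential[i - 1] - 2.0 * potential[i] + potential[i + 1])
    return out
```

Repeatedly applying such a step spreads a locally applied load's potential energy to neighbouring mass points, which is the qualitative behaviour the deformation model exploits.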

  5. Characteristic analysis and simulation for polysilicon comb micro-accelerometer

    NASA Astrophysics Data System (ADS)

    Liu, Fengli; Hao, Yongping

    2008-10-01

A high force update rate is a key factor for achieving high-performance haptic rendering, which imposes a stringent real-time requirement on the execution environment of the haptic system. This requirement confines the haptic system to simplified environments in order to reduce the computation cost of haptic rendering algorithms. In this paper, we present a novel "hyper-threading" architecture consisting of several threads for haptic rendering. The high force update rate is achieved with a relatively large computation time interval for each haptic loop. The proposed method was tested on a virtual-wall prototype haptic system via the Delta Haptic Device and proved to be effective.
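The split this record describes, a fast force-update loop decoupled from slower computation, can be sketched deterministically with a shared tick counter rather than real threads. The 1 kHz and 100 Hz rates are conventional example values for haptic versus physics loops, not figures from the paper.

```python
def run_loops(duration_steps: int, haptic_hz: int = 1000, physics_hz: int = 100):
    """Simulate a fast force-update loop and a slower physics loop on a
    shared tick: the force loop runs every tick, the physics loop only
    on every (haptic_hz // physics_hz)-th tick."""
    force_updates = physics_updates = 0
    for tick in range(duration_steps):
        force_updates += 1                          # fast haptic loop
        if tick % (haptic_hz // physics_hz) == 0:
            physics_updates += 1                    # slow physics loop
    return force_updates, physics_updates
```

Over one simulated second (1000 ticks) the force loop fires 1000 times while the physics loop fires 100 times, showing how the force update rate can stay high even when each full simulation step is expensive.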

  6. Perception of force and stiffness in the presence of low-frequency haptic noise

    PubMed Central

    Gurari, Netta; Okamura, Allison M.; Kuchenbecker, Katherine J.

    2017-01-01

    Objective: This work lays the foundation for future research on quantitative modeling of human stiffness perception. Our goal was to develop a method by which a human’s ability to perceive suprathreshold haptic force stimuli and haptic stiffness stimuli can be affected by adding haptic noise. Methods: Five human participants performed a same-different task with a one-degree-of-freedom force-feedback device. Participants used the right index finger to actively interact with variations of force (∼5 and ∼8 N) and stiffness (∼290 N/m) stimuli that included one of four scaled amounts of haptically rendered noise (None, Low, Medium, High). The haptic noise was zero-mean Gaussian white noise that was low-pass filtered with a 2 Hz cut-off frequency; the resulting low-frequency signal was added to the force rendered while the participant interacted with the force and stiffness stimuli. Results: We found that the precision with which participants could identify the magnitude of both the force and stiffness stimuli was affected by the magnitude of the low-frequency haptically rendered noise added to the haptic stimulus, as well as by the magnitude of the haptic stimulus itself. The Weber fraction correlated strongly with the standard deviation of the low-frequency haptic noise, with a Pearson product-moment correlation coefficient of ρ > 0.83. The mean standard deviation of the low-frequency haptic noise in the haptic stimuli ranged from 0.184 N to 1.111 N across the four haptically rendered noise levels, and the corresponding mean Weber fractions spanned 0.042 to 0.101. Conclusions: The human ability to perceive both suprathreshold haptic force and stiffness stimuli degrades in the presence of added low-frequency haptic noise. Future work can use the reported methods to investigate how force perception and stiffness perception may relate, with possible applications in haptic watermarking and in the assessment of the functionality of peripheral pathways in individuals with haptic impairments. PMID:28575068
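The noise signal described above (zero-mean Gaussian white noise, low-pass filtered at 2 Hz, added to the rendered force) can be sketched as follows. The 1 kHz servo rate, unit noise scale, and one-pole filter choice are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def lowpass_noise(n_samples, fs=1000.0, cutoff_hz=2.0, sigma=1.0, seed=0):
    """Zero-mean Gaussian white noise passed through a low-pass filter with
    a 2 Hz cutoff, as in the abstract. The servo rate, noise scale, and
    first-order (one-pole) IIR filter design are assumptions."""
    rng = random.Random(seed)
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / fs)
    y, out = 0.0, []
    for _ in range(n_samples):
        # One-pole low-pass update: y += alpha * (x - y)
        y += alpha * (sigma * rng.gauss(0.0, 1.0) - y)
        out.append(y)
    return out

def rendered_force(base_force_n, noise):
    # The low-frequency noise is simply added to the commanded force at
    # every servo tick while the participant interacts with the stimulus.
    return [base_force_n + n for n in noise]
```

The filtered signal varies slowly relative to the servo rate, which is what makes the perturbation feel like a drifting force offset rather than vibration.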

  7. The effects of perceptual priming on 4-year-olds' haptic-to-visual cross-modal transfer.

    PubMed

    Kalagher, Hilary

    2013-01-01

    Four-year-old children often have difficulty visually recognizing objects that were previously experienced only haptically. This experiment attempts to improve their performance in these haptic-to-visual transfer tasks. Sixty-two 4-year-old children participated in priming trials in which they explored eight unfamiliar objects visually, haptically, or visually and haptically together. Subsequently, all children participated in the same haptic-to-visual cross-modal transfer task. In this task, children haptically explored the objects presented in the priming phase and then visually identified a match from among three test objects, each matching the explored object on only one dimension (shape, texture, or color). Children in all priming conditions predominantly made shape-based matches; however, the most shape-based matches were made in the Visual and Haptic condition. All kinds of priming provided the memory traces upon which haptic exploration could build a representation strong enough to enable subsequent visual recognition. Haptic exploration patterns during the cross-modal transfer task are discussed, and the detailed analyses provide a unique contribution to our understanding of the development of haptic exploratory procedures.

  8. Clinical and optical intraocular performance of rotationally asymmetric multifocal IOL plate-haptic design versus C-loop haptic design.

    PubMed

    Alió, Jorge L; Plaza-Puche, Ana B; Javaloy, Jaime; Ayala, María José; Vega-Estrada, Alfredo

    2013-04-01

    To compare the visual and intraocular optical quality outcomes with different designs of the refractive rotationally asymmetric multifocal intraocular lens (MFIOL) (Lentis Mplus; Oculentis GmbH, Berlin, Germany) with or without capsular tension ring (CTR) implantation. One hundred thirty-five consecutive eyes of 78 patients with cataract (ages 36 to 82 years) were divided into three groups: 43 eyes implanted with the C-Loop haptic design without CTR (C-Loop haptic only group); 47 eyes implanted with the C-Loop haptic design with CTR (C-Loop haptic with CTR group); and 45 eyes implanted with the plate-haptic design (plate-haptic group). Visual acuity, contrast sensitivity, defocus curve, and ocular and intraocular optical quality were evaluated at 3 months postoperatively. Significant differences in the postoperative sphere were found (P = .01), with a more myopic postoperative refraction for the C-Loop haptic only group. No significant differences were detected in photopic and scotopic contrast sensitivity among groups (P ⩾ .05). Significantly better visual acuities were present in the C-Loop haptic with CTR group for the defocus levels of -2.0, -1.5, -1.0, and -0.50 D (P ⩽ .03). Statistically significant differences among groups were found in total intraocular root mean square (RMS), high-order intraocular RMS, and intraocular coma-like RMS aberrations (P ⩽ .04), with lower values for the plate-haptic group. The plate-haptic design and the C-Loop haptic design with CTR implantation both allow good visual rehabilitation. However, better refractive predictability and intraocular optical quality were obtained with the plate-haptic design without CTR implantation. The plate-haptic design seems better suited to support rotationally asymmetric MFIOL optics. Copyright 2013, SLACK Incorporated.

  9. Limited value of haptics in virtual reality laparoscopic cholecystectomy training.

    PubMed

    Thompson, Jonathan R; Leonard, Anthony C; Doarn, Charles R; Roesch, Matt J; Broderick, Timothy J

    2011-04-01

    Haptics is an expensive addition to virtual reality (VR) simulators, and its added value to training has not been proven. This study evaluated the benefit of haptics in VR laparoscopic surgery training for novices. The Simbionix LapMentor II haptic VR simulator was used in the study. Thirty-three laparoscopic novice students were randomly assigned to one of three groups: control, haptics-trained, or nonhaptics-trained. The control group performed nine basic laparoscopy tasks and four cholecystectomy procedural tasks one time with haptics engaged at the default setting. The haptics group was trained to proficiency in the basic tasks and then performed each of the procedural tasks one time with haptics engaged. The nonhaptics group used the same training protocol except that haptics was disengaged. The proficiency values used were previously published expert values. Each group was assessed in the performance of 10 laparoscopic cholecystectomies (alternating with and without haptics). Performance was measured via automatically collected simulator data. The three groups exhibited no differences in terms of sex, education level, hand dominance, video game experience, surgical experience, and nonsurgical simulator experience. The number of attempts required to reach proficiency did not differ between the haptics- and nonhaptics-training groups. The haptics and nonhaptics groups exhibited no difference in performance. Both training groups outperformed the control group in number of movements as well as path length of the left instrument. In addition, the nonhaptics group outperformed the control group in total time. Haptics does not improve the efficiency or effectiveness of LapMentor II VR laparoscopic surgery training. The limited benefit and the significant cost of haptics suggest that haptics should not be included routinely in VR laparoscopic surgery training.

  10. Using an embedded reality approach to improve test reliability for NHPT tasks.

    PubMed

    Bowler, M; Amirabdollahian, F; Dautenhahn, K

    2011-01-01

    Research into the use of haptic and virtual reality technologies has increased greatly over the past decade, in both quality and quantity. Efforts to combine haptic and virtual technologies with existing techniques for assessing impairment are underway and, owing to commercially available equipment, have found some success for individuals with upper limb impairment. This paper takes a clinically validated assessment of motor impairment, the Nine Hole Peg Test, and creates three tasks with different levels of realism. The efficacy of these tasks is discussed, with particular attention paid to removing factors that limit a virtual environment's use in a clinical setting, such as inter-subject variation. © 2011 IEEE

  11. Self-Control of Haptic Assistance for Motor Learning: Influences of Frequency and Opinion of Utility

    PubMed Central

    Williams, Camille K.; Tseung, Victrine; Carnahan, Heather

    2017-01-01

    Studies of self-controlled practice have shown benefits when learners controlled the feedback schedule, the use of assistive devices, and task difficulty, with benefits attributed to the information-processing and motivational advantages of self-control. Although haptic assistance serves as feedback, aids task performance, and modifies task difficulty, researchers have yet to explore whether self-control over haptic assistance could be beneficial for learning. We explored whether self-control of haptic assistance would be beneficial for learning a tracing task. Self-controlled participants selected the practice blocks on which they would receive haptic assistance, while participants in a yoked group received haptic assistance on blocks determined by a matched self-controlled participant. We inferred learning from performance on retention tests without haptic assistance. From qualitative analysis of open-ended questions about the rationales for and experiences of the haptic assistance that was chosen or provided, themes emerged regarding participants’ views of the utility of haptic assistance for performance and learning. Results showed that, for self-controlled participants only, learning was directly impacted by the frequency of haptic assistance and by their view of its utility. Furthermore, self-controlled participants’ views were significantly associated with their requested haptic assistance frequency. We discuss these findings as further support for the beneficial role of self-controlled practice in motor learning. PMID:29255438

  12. [Postoperative ultrasound biomicroscopic evaluation of the haptic position of black diaphragm posterior chamber lenses in congenital and traumatic aniridia in comparison with gonioscopy].

    PubMed

    Schweykart, N; Reinhard, T; Engelhardt, S; Sundmacher, R

    1999-06-01

    Ultrasound biomicroscopy (UBM) makes it possible to determine the haptic position of posterior chamber lenses (PCLs) in relation to adjacent structures. In transsclerally sutured PCLs, comparison between haptic positions localized intraoperatively by endoscopy and postoperatively by UBM showed a correspondence of only 81%. The discrepant localization of 19% of the examined haptic positions was attributed to postoperative dislocation, although without direct proof of this assumption. The purpose of this study was therefore to correlate UBM results with haptic positions determined simultaneously by gonioscopy in aniridia after black diaphragm PCL implantation. The haptic positions of black diaphragm PCL implants in 20 patients with congenital and 13 patients with traumatic aniridia were determined by UBM (50 MHz probe) and gonioscopy 44.4 (6-75) months postoperatively. 39/66 haptic positions could be localized by gonioscopy as well as by UBM; 38 haptics (97.4%) showed the same position with both examination techniques. Determination of the haptic position by one of the two techniques was impossible in 27/66 haptics (11 in gonioscopy, 16 in UBM), primarily because of haptic position behind iris remnants and corneal opacities in gonioscopy and scarring of the ciliary body in UBM. The validity of UBM in localizing PCLs was confirmed gonioscopically, which also confirms our prior assumption of postoperative displacement of IOL haptics after transscleral suturing in about 20% of cases. Scarring of the ciliary body was the most important obstacle to determining PCL haptic positions in relation to adjacent structures.

  13. [Visual cuing effect for haptic angle judgment].

    PubMed

    Era, Ataru; Yokosawa, Kazuhiko

    2009-08-01

    We investigated whether visual cues are useful for judging haptic angles. Participants explored three-dimensional angles with a virtual haptic feedback device. For visual cues, we used a location cue, which synchronizes with haptic exploration, and a space cue, which specifies the haptic space. In Experiment 1, angles were judged more accurately with both cues, but were overestimated with a location cue only. In Experiment 2, the visual cues emphasized depth; overestimation with location cues again occurred, but space cues had no influence. The results showed that (a) when both cues are presented, haptic angles are judged more accurately; (b) location cues convey only motion information, not depth information; and (c) haptic angles tend to be overestimated when both haptic and visual information are present.

  14. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach

    PubMed Central

    Erdogan, Goker; Yildirim, Ilker; Jacobs, Robert A.

    2015-01-01

    People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models—that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model’s percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects’ ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception. PMID:26554704

  15. Perception of synchronization errors in haptic and visual communications

    NASA Astrophysics Data System (ADS)

    Kameyama, Seiji; Ishibashi, Yutaka

    2006-10-01

    This paper deals with a system that conveys the haptic sensation experienced by a user to a remote user. In the system, the user controls a haptic interface device with another, remote haptic interface device while watching video. Haptic media and video of a real object that the user is touching are transmitted to the other user. Through subjective assessment, we investigate the allowable and imperceptible ranges of synchronization error between the haptic media and video. We employ four real objects and ask each subject whether the synchronization error is perceived for each object. The assessment results show that synchronization error is more easily perceived when the haptic media are ahead of the video than when they are behind it.

  16. Active skin as new haptic interface

    NASA Astrophysics Data System (ADS)

    Vuong, Nguyen Huu Lam; Kwon, Hyeok Yong; Chuc, Nguyen Huu; Kim, Duksang; An, Kuangjun; Phuc, Vuong Hong; Moon, Hyungpil; Koo, Jachoon; Lee, Youngkwan; Nam, Jae-Do; Choi, Hyouk Ryeol

    2010-04-01

    In this paper, we present a new haptic interface, called "active skin", which combines a tactile sensor and a tactile stimulator in a single haptic cell; multiple haptic cells are embedded in a dielectric elastomer. The active skin generates a wide variety of haptic sensations in response to touch by synchronizing the sensor and the stimulator. The design of the haptic cell is derived via iterative analysis and design procedures. A fabrication method dedicated to the proposed device is investigated, and a controller to drive multiple haptic cells is developed. In addition, several experiments are performed to evaluate the performance of the active skin.

  17. A comparison of haptic material perception in blind and sighted individuals.

    PubMed

    Baumgartner, Elisabeth; Wiebel, Christiane B; Gegenfurtner, Karl R

    2015-10-01

    We investigated material perception in blind participants to explore the influence of visual experience on material representations and the relationship between visual and haptic material perception. In a previous study with sighted participants, we had found participants' visual and haptic judgments of material properties to be very similar (Baumgartner, Wiebel, & Gegenfurtner, 2013). In a categorization task, however, visual exploration had led to higher categorization accuracy than haptic exploration. Here, we asked congenitally blind participants to explore different materials haptically and rate several material properties in order to assess the role of the visual sense for the emergence of haptic material perception. Principal components analyses combined with a procrustes superimposition showed that the material representations of blind and blindfolded sighted participants were highly similar. We also measured haptic categorization performance, which was equal for the two groups. We conclude that haptic material representations can emerge independently of visual experience, and that there are no advantages for either group of observers in haptic categorization. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Haptic fMRI: combining functional neuroimaging with haptics for studying the brain's motor control representation.

    PubMed

    Menon, Samir; Brantner, Gerald; Aholt, Chris; Kay, Kendrick; Khatib, Oussama

    2013-01-01

    A challenging problem in motor control neuroimaging studies is the inability to perform complex human motor tasks given the Magnetic Resonance Imaging (MRI) scanner's disruptive magnetic fields and confined workspace. In this paper, we propose a novel experimental platform that combines Functional MRI (fMRI) neuroimaging, haptic virtual simulation environments, and an fMRI-compatible haptic device for real-time haptic interaction across the scanner workspace (above the torso, ∼0.65×0.40×0.20 m³). We implement this Haptic fMRI platform with a novel haptic device, the Haptic fMRI Interface (HFI), and demonstrate its suitability for motor neuroimaging studies. HFI has three degrees of freedom (DOF), uses electromagnetic motors to enable high-fidelity haptic rendering (>350 Hz), integrates radio frequency (RF) shields to prevent electromagnetic interference with fMRI (temporal SNR >100), and is kinematically designed to minimize currents induced by the MRI scanner's magnetic field during motor displacement (<2 cm). HFI possesses uniform inertial and force transmission properties across the workspace and has low friction (0.05-0.30 N). HFI's RF noise levels, in addition, are within a 3 Tesla fMRI scanner's baseline noise variation (∼0.85±0.1%). Finally, HFI is haptically transparent and does not interfere with human motor tasks (tested for 0.4 m reaches). By allowing fMRI experiments involving complex three-dimensional manipulation with haptic interaction, Haptic fMRI enables, for the first time, non-invasive neuroscience experiments involving interactive motor tasks, object manipulation, tactile perception, and visuo-motor integration.

  19. Scanning electron microscopic characteristics of commercially available 1- and 3-piece intraocular lenses.

    PubMed

    Brockmann, Tobias; Brockmann, Claudia; Nietzsche, Sandor; Bertelmann, Eckart; Strobel, Juergen; Dawczynski, Jens

    2013-12-01

    To evaluate commercially available 1- and 3-piece intraocular lenses (IOLs) with scanning electron microscopy (SEM). Department of Ophthalmology and Electron Microscopy Center, University Hospital Jena, Jena, Germany. Experimental study. Seven +23.0 diopter IOLs of different design and material and from different manufacturers were chosen for a detailed assessment. Scanning electron microscopy was used at standardized magnifications to assess typical IOL characteristics. The particular focus was the optic edge, the optic surface, the haptic–optic junction, and the haptic. All square-edged IOLs had a curvature radius of less than 10 μm, while the mean optic edge thickness ranged between 216 μm and 382 μm. A 360-degree square-edged boundary was present in all 3-piece IOLs and in a single 1-piece model. Relevant production remnants on the optic edge were observed in 1 case. Regarding the haptic, 3-piece IOLs had uniformly shaped fibers with a mean thickness of 177 μm ± 51 (SD) (range 116 to 220 μm). Chemical adhesives were used to attach the haptic in 1 case, where alterations of the IOL material were observed. In another case, the haptic fiber was press-fitted into the optic, which resulted in bulging of the optic profile. Inspection of surface characteristics showed wavelike patterns in 2 IOLs. Taking clinical relevance into account, all IOLs were of high manufacturing quality. Particular attention had been paid to creating a sharp optic edge. Surface irregularities of 2 IOLs were attributed to the manufacturing technique. Methods for implementing the haptic–optic junction were diverse.

  20. Neurosurgery simulation in residency training: feasibility, cost, and educational benefit.

    PubMed

    Gasco, Jaime; Holbrook, Thomas J; Patel, Achal; Smith, Adrian; Paulson, David; Muns, Alan; Desai, Sohum; Moisi, Marc; Kuo, Yong-Fan; Macdonald, Bart; Ortega-Barnett, Juan; Patterson, Joel T

    2013-10-01

    The effort required to introduce simulation in neurosurgery academic programs and the benefits perceived by residents have not been systematically assessed. To create a neurosurgery simulation curriculum encompassing basic and advanced skills, cadaveric dissection, cranial and spine surgery simulation, and endovascular and computerized haptic training. A curriculum with 68 core exercises per academic year was distributed in individualized sets of 30 simulations to 6 neurosurgery residents. The total number of procedures completed during the academic year was set to 180. The curriculum includes 79 simulations with physical models, 57 cadaver dissections, and 44 haptic/computerized sessions. Likert-type evaluations regarding self-perceived performance were completed after each exercise. Subject identification was blinded to junior (postgraduate years 1-3) or senior resident (postgraduate years 4-6). Wilcoxon rank testing was used to detect differences within and between groups. One hundred eighty procedures and surveys were analyzed. Junior residents reported proficiency improvements in 82% of simulations performed (P < .001). Senior residents reported improvement in 42.5% of simulations (P < .001). Cadaver simulations accrued the highest reported benefit (71.5%; P < .001), followed by physical simulators (63.8%; P < .001) and haptic/computerized (59.1; P < .001). Initial cost is $341,978.00, with $27,876.36 for annual operational expenses. The systematic implementation of a simulation curriculum in a neurosurgery training program is feasible, is favorably regarded, and has a positive impact on trainees of all levels, particularly in junior years. All simulation forms, cadaver, physical, and haptic/computerized, have a role in different stages of learning and should be considered in the development of an educational simulation program.

  1. Saving and Reproduction of Human Motion Data by Using Haptic Devices with Different Configurations

    NASA Astrophysics Data System (ADS)

    Tsunashima, Noboru; Yokokura, Yuki; Katsura, Seiichiro

    Recently, there has been increased focus on “haptic recording,” the saving and reproduction of human motion data on the basis of haptic information; development of a motion-copying system is an efficient method for realizing it. To increase the number of applications of the motion-copying system in various fields, it is necessary to reproduce human motion data using haptic devices with different configurations. In this study, a method for such haptic recording is developed, in which human motion data are saved and reproduced on the basis of work-space information obtained by coordinate transformation of motor-space information. The validity of the proposed method is demonstrated by experiments: saving and reproduction of human motion data using various devices is achieved, making it possible to use haptic recording in various fields.
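The motor-space-to-work-space transformation at the heart of this approach can be sketched with planar 2-link forward kinematics. The 2-link geometry and link lengths are illustrative assumptions; the paper's devices and their kinematics are not specified here:

```python
import math

def motor_to_workspace(q1, q2, l1=0.25, l2=0.25):
    """Map motor-space joint angles (rad) of a hypothetical planar 2-link
    device to a work-space position via forward kinematics."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

def record_trajectory(joint_samples, l1=0.25, l2=0.25):
    # Saving motion in work-space coordinates makes the recording
    # device-independent: any device that can reach (x, y) can replay it,
    # regardless of its own joint configuration.
    return [motor_to_workspace(q1, q2, l1, l2) for q1, q2 in joint_samples]
```

Replaying on a device with a different configuration would apply that device's inverse kinematics to the same work-space trajectory.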

  2. Training haptic stiffness discrimination: time course of learning with or without visual information and knowledge of results.

    PubMed

    Teodorescu, Kinneret; Bouchigny, Sylvain; Korman, Maria

    2013-08-01

    In this study, we explored the time course of haptic stiffness discrimination learning and how it was affected by two experimental factors: the addition of visual information and/or knowledge of results (KR) during training. Stiffness perception may integrate both haptic and visual modalities. However, in many tasks the visual field is typically occluded, forcing stiffness perception to depend exclusively on haptic information. No studies to date have addressed the time course of haptic stiffness perceptual learning. Using a virtual environment (VE) haptic interface and a two-alternative forced-choice discrimination task, the haptic stiffness discrimination ability of 48 participants was tested across 2 days. Each day included two haptic test blocks separated by a training block. Additional visual information and/or KR were manipulated between participants during training blocks. Practice repetitions alone induced significant improvement in haptic stiffness discrimination. Between days, accuracy improved slightly, but decision-time performance deteriorated. The addition of visual information and/or KR had only temporary effects on decision time, without affecting the time course of haptic discrimination learning. Learning in haptic stiffness discrimination appears to evolve through at least two distinctive phases: a single training session resulted in both immediate and latent learning. This learning was not affected by the training manipulations inspected. Training skills in VE in spaced sessions can be beneficial for tasks in which haptic perception is critical, such as surgical procedures when the visual field is occluded. However, training protocols for such tasks should account for the low impact of multisensory information and KR.

  3. Structural impact detection with vibro-haptic interfaces

    NASA Astrophysics Data System (ADS)

    Jung, Hwee-Kwon; Park, Gyuhae; Todd, Michael D.

    2016-07-01

    This paper presents a new sensing paradigm for structural impact detection using vibro-haptic interfaces. The goal of this study is to allow humans to ‘feel’ structural responses (impact, shape changes, and damage) and eventually determine the health condition of a structure. The target applications for this study are aerospace structures, in particular airplane wings. Both hardware and software components are developed to realize the vibro-haptic-based impact detection system. First, L-shape piezoelectric sensor arrays are deployed to measure the acoustic emission data generated by impacts on a wing. Unique haptic signals are then generated by processing the measured acoustic emission data. These haptic signals are wirelessly transmitted to human arms, and with the vibro-haptic interface, human pilots could identify impact location, intensity, and the possibility of subsequent damage initiation. The experimental results demonstrate that, with the haptic interface, humans could correctly identify such events while reducing false indications of structural conditions by capitalizing on the human capacity for classification. Several important aspects of this study, including development of haptic interfaces, design of optimal human training strategies, and extension of the haptic capability to structural impact detection, are summarized in this paper.

  4. Haptic wearables as sensory replacement, sensory augmentation and trainer - a review.

    PubMed

    Shull, Peter B; Damian, Dana D

    2015-07-20

    Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed to improve function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded body worn devices that interact with skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications including: rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.

  5. End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change

    NASA Astrophysics Data System (ADS)

    Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro

    This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.
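The rate-allocation idea (probe traffic conditions with the small real-time haptic packets, then throttle the large visual stream) can be sketched as follows. This is an illustrative heuristic under assumed parameters; the paper's actual estimator and controller are not specified here:

```python
def estimate_bandwidth(haptic_rtts_ms, packet_bits, interval_ms):
    """Crude throughput estimate from haptic-packet round-trip times
    (illustrative heuristic, not the authors' estimator). RTTs growing
    past the baseline indicate queue build-up, i.e. the current sending
    rate is at or above the available bandwidth."""
    baseline = min(haptic_rtts_ms)
    queueing = haptic_rtts_ms[-1] - baseline
    nominal_bps = packet_bits / (interval_ms / 1000.0)
    # Back off in proportion to the observed queueing delay.
    return nominal_bps / (1.0 + queueing / baseline)

def visual_rate(available_bps, haptic_bps, floor_bps=64_000.0):
    # Real-time haptic packets get strict priority; the large non-real-time
    # visual stream receives whatever remains, never below a minimum rate.
    return max(available_bps - haptic_bps, floor_bps)
```

Keeping the visual rate below the estimated bandwidth is what prevents the buffer overflow, and hence the instability, described in the abstract.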

  6. Preserved Haptic Shape Processing after Bilateral LOC Lesions.

    PubMed

    Snow, Jacqueline C; Goodale, Melvyn A; Culham, Jody C

    2015-10-07

    The visual and haptic perceptual systems are understood to share a common neural representation of object shape. A region thought to be critical for recognizing visual and haptic shape information is the lateral occipital complex (LOC). We investigated whether LOC is essential for haptic shape recognition in humans by studying behavioral responses and brain activation for haptically explored objects in a patient (M.C.) with bilateral lesions of the occipitotemporal cortex, including LOC. Despite severe deficits in recognizing objects using vision, M.C. was able to accurately recognize objects via touch. M.C.'s psychophysical response profile to haptically explored shapes was also indistinguishable from controls. Using fMRI, M.C. showed no object-selective visual or haptic responses in LOC, but her pattern of haptic activation in other brain regions was remarkably similar to healthy controls. Although LOC is routinely active during visual and haptic shape recognition tasks, it is not essential for haptic recognition of object shape. The lateral occipital complex (LOC) is a brain region regarded to be critical for recognizing object shape, both in vision and in touch. However, causal evidence linking LOC with haptic shape processing is lacking. We studied recognition performance, psychophysical sensitivity, and brain response to touched objects, in a patient (M.C.) with extensive lesions involving LOC bilaterally. Despite being severely impaired in visual shape recognition, M.C. was able to identify objects via touch and she showed normal sensitivity to a haptic shape illusion. M.C.'s brain response to touched objects in areas of undamaged cortex was also very similar to that observed in neurologically healthy controls. These results demonstrate that LOC is not necessary for recognizing objects via touch. Copyright © 2015 the authors.

  7. Augmented kinematic feedback from haptic virtual reality for dental skill acquisition.

    PubMed

    Suebnukarn, Siriwan; Haddawy, Peter; Rhienmora, Phattanapon; Jittimanee, Pannapa; Viratket, Piyanuch

    2010-12-01

We have developed a haptic virtual reality system for dental skill training. In this study, we examined whether several kinds of kinematic information about the movement, provided by the system to supplement knowledge of results (KR), aid dental skill acquisition. The kinematic variables examined were force utilization (F) and mirror view (M). This created three experimental conditions that received augmented kinematic feedback (F, M, FM) and one control condition that did not (KR-only). Thirty-two dental students were randomly assigned to the four groups. Their task was to perform access opening on the upper first molar with the haptic virtual reality system. The acquisition session consisted of two days of ten practice trials each, in which augmented kinematic feedback was provided to the appropriate experimental conditions after each trial. One week later, a retention test consisting of two trials without augmented feedback was completed. The results showed that the augmented kinematic feedback groups had higher mean performance scores than the KR-only group on Day 1 of the acquisition session and in the retention session (ANOVA, p<0.05). The apparent differences among feedback groups were not significant on Day 2 of the acquisition session (ANOVA, p>0.05). The trends across the acquisition and retention sessions suggest that augmented kinematic feedback enhances performance early in skill acquisition and aids retention.

  8. Haptic augmentation of science instruction: Does touch matter?

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Minogue, James; Tretter, Thomas R.; Negishi, Atsuko; Taylor, Russell

    2006-01-01

This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback), combined with computer visualizations, influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated desktop haptic device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. Levels of engagement in the instruction and students' attitudes toward the instructional program were assessed using a combination of constructed-response and Likert-scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also have influenced the way in which students construct their understanding of abstract science concepts.

  9. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face.

    PubMed

    Matsumiya, Kazumichi

    2013-10-01

    Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imaging a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on shared representation underlying cross-modal interactions.

  10. A magnetorheological haptic cue accelerator for manual transmission vehicles

    NASA Astrophysics Data System (ADS)

    Han, Young-Min; Noh, Kyung-Wook; Lee, Yang-Sub; Choi, Seung-Bok

    2010-07-01

This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. The function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed device can convey the optimal moment of gear shifting to the driver without requiring the driver's visual attention. As a first step toward this goal, an MR fluid-based haptic device is devised to accommodate the rotary motion of the accelerator pedal. Taking spatial limitations into account, the design parameters are optimized using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured, and its field-dependent torque and time response are experimentally evaluated. The manufactured MR haptic cue device is next integrated with the accelerator pedal, and a simple virtual vehicle emulating the operation of a passenger-vehicle engine is constructed and put into communication with it. A feed-forward torque control algorithm for the haptic cue is formulated, and control performance is experimentally evaluated and presented in the time domain.
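
    The feed-forward haptic cue described above can be reduced, in its simplest form, to a mapping from engine speed to commanded MR-brake torque. The sketch below is an illustrative placeholder, not the paper's controller: the shift point, gain, and torque cap are invented values.

```python
def cue_torque(rpm, shift_rpm=2500.0, gain=0.004, t_max=4.0):
    """Feed-forward haptic cue sketch: command an MR-brake torque (N*m)
    that grows linearly as engine speed exceeds the optimal shift point,
    saturating at the brake's maximum torque."""
    excess = max(0.0, rpm - shift_rpm)      # rpm above the shift point
    return min(t_max, gain * excess)        # proportional, then capped
```

    Below the shift point the pedal feels normal (zero cue torque); past it, resistance builds until the driver shifts gear, which is the "no visual attention" cue the paper describes.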

  11. Design of a haptic device with grasp and push-pull force feedback for a master-slave surgical robot.

    PubMed

    Hu, Zhenkai; Yoon, Chae-Hyun; Park, Samuel Byeongjun; Jo, Yung-Ho

    2016-07-01

We propose a portable haptic device providing grasp (kinesthetic) and push-pull (cutaneous) sensations for optical-motion-capture master interfaces. Although optical-motion-capture master interfaces for surgical robot systems can overcome the stiffness, friction, and coupling problems of mechanical master interfaces, it is difficult to add haptic feedback to them without constraining the free motion of the operator's hands. We therefore utilized a Bowden-cable-driven mechanism that provides the grasp and push-pull sensations while retaining the free hand motion of the optical-motion-capture master interface. To evaluate the haptic device, we constructed a 2-DOF force-sensing/force-feedback system and compared the forces sensed by the slave with those reproduced by the haptic device. Finally, a needle insertion test was performed to evaluate the performance of the haptic interface in the master-slave system. The results demonstrate that both the grasp force feedback and the push-pull force feedback provided by the haptic interface closely matched the forces sensed by the slave robot. We successfully applied our haptic interface in the optical-motion-capture master-slave system, and the needle insertion test showed that our haptic feedback provides greater safety than visual observation alone. In summary, we developed a haptic device that produces both kinesthetic grasp force feedback and cutaneous push-pull force feedback. Our future research will include further objective performance evaluations of the optical-motion-capture master-slave robot system with our haptic interface in surgical scenarios.

  12. The End of the Rainbow? Color Schemes for Improved Data Graphics

    NASA Astrophysics Data System (ADS)

    Light, Adam; Bartlein, Patrick J.

    2004-10-01

    Modern computer displays and printers enable the widespread use of color in scientific communication, but the expertise for designing effective graphics has not kept pace with the technology for producing them. Historically, even the most prestigious publications have tolerated high defect rates in figures and illustrations, and technological advances that make creating and reproducing graphics easier do not appear to have decreased the frequency of errors. Flawed graphics consequently beget more flawed graphics as authors emulate published examples. Color has the potential to enhance communication, but design mistakes can result in color figures that are less effective than gray scale displays of the same data. Empirical research on human subjects can build a fundamental understanding of visual perception and scientific methods can be used to evaluate existing designs, but creating effective data graphics is a design task and not fundamentally a scientific pursuit. Like writing well, creating good data graphics requires a combination of formal knowledge and artistic sensibility tempered by experience: a combination of ``substance, statistics, and design''.

  13. The Efficacy of Surface Haptics and Force Feedback in Education

    ERIC Educational Resources Information Center

    Gorlewicz, Jenna Lynn

    2013-01-01

    This dissertation bridges the fields of haptics, engineering, and education to realize some of the potential benefits haptic devices may have in Science, Technology, Engineering, and Math (STEM) education. Specifically, this dissertation demonstrates the development, implementation, and assessment of two haptic devices in engineering and math…

  14. Incorporating Haptic Feedback in Simulation for Learning Physics

    ERIC Educational Resources Information Center

    Han, Insook; Black, John B.

    2011-01-01

    The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…

  15. Haptic Distal Spatial Perception Mediated by Strings: Haptic "Looming"

    ERIC Educational Resources Information Center

    Cabe, Patrick A.

    2011-01-01

    Five experiments tested a haptic analog of optical looming, demonstrating string-mediated haptic distal spatial perception. Horizontally collinear hooks supported a weighted string held taut by a blindfolded participant's finger midway between the hooks. At the finger, the angle between string segments increased as the finger approached…

  16. Haptic Classification of Common Objects: Knowledge-Driven Exploration.

    ERIC Educational Resources Information Center

    Lederman, Susan J.; Klatzky, Roberta L.

    1990-01-01

    Theoretical and empirical issues relating to haptic exploration and the representation of common objects during haptic classification were investigated in 3 experiments involving a total of 112 college students. Results are discussed in terms of a computational model of human haptic object classification with implications for dextrous robot…

  17. A modern approach to storing of 3D geometry of objects in machine engineering industry

    NASA Astrophysics Data System (ADS)

    Sokolova, E. A.; Aslanov, G. A.; Sokolov, A. A.

    2017-02-01

3D graphics is a branch of computer graphics that has absorbed much from vector and raster computer graphics. It is used in interior design projects, architectural projects, advertising, educational computer programs, movies, and for visualizing parts and products in engineering. 3D computer graphics allows one to create 3D scenes, simulate lighting conditions, and set up viewpoints.

  18. Perceptualization of geometry using intelligent haptic and visual sensing

    NASA Astrophysics Data System (ADS)

    Weng, Jianguang; Zhang, Hui

    2013-01-01

We present a set of paradigms for investigating geometric structures using haptic and visual sensing. Our principal test cases include smoothly embedded geometric shapes such as knotted curves in 3D and knotted surfaces in 4D, which produce numerous self-intersections when projected one dimension lower. One can exploit a touch-responsive 3D interactive probe to haptically override this conflicting evidence in the rendered images, enforcing continuity in the haptic representation to emphasize the true topology. In our work, we exploited predictive haptic guidance, a "computer-simulated hand" with supplementary force suggestions, to support intelligent exploration of geometric shapes, smoothing exploration and maximizing the probability of recognition. The cognitive load can be reduced further by enabling attention-driven visual sensing during haptic exploration. Together, these methods reveal the full richness of haptic exploration of geometric structures and overcome the limitations of traditional 4D visualization.
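
    One common way to realize a supplementary force suggestion of this kind is a virtual spring that pulls the probe toward the explored curve, so the hand stays on the true topology even where the rendered projection self-intersects. The sketch below is a hedged illustration, not the authors' controller: it assumes the curve is sampled as a polyline, and the stiffness value is invented.

```python
import math

def guidance_force(probe, curve_pts, k=150.0):
    """Virtual-spring force suggestion: pull the haptic probe toward the
    nearest sampled point of the explored curve (polyline samples)."""
    nearest = min(curve_pts, key=lambda p: math.dist(p, probe))
    # Hooke's law toward the nearest curve sample.
    return tuple(k * (n - q) for n, q in zip(nearest, probe))
```

    A real device would evaluate this at the servo rate and add damping, but the spring term alone already conveys the continuity cue.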

  19. The Importance of Visual Experience, Gender, and Emotion in the Assessment of an Assistive Tactile Mouse.

    PubMed

    Brayda, Luca; Campus, Claudio; Memeo, Mariacarla; Lucagrossi, Laura

    2015-01-01

Tactile maps are efficient tools for improving the spatial understanding and mobility skills of visually impaired people. Their limited adaptability can be compensated for by haptic devices that display graphical information, but such devices are frequently assessed with performance-based metrics alone, which can hide potential spatial abilities relevant to orientation and mobility (O&M) protocols. We assessed a low-tech tactile mouse able to deliver three-dimensional content, considering how performance, mental workload, behavior, and anxiety status vary with task difficulty and gender in congenitally blind, late blind, and sighted subjects. Results show that task difficulty coherently modulates the efficiency and difficulty of building mental maps, regardless of visual experience. Although exhibiting similar, gender-independent attitudes, the female participants had lower performance and higher cognitive load, especially when congenitally blind. All groups showed a significant decrease in anxiety after using the device. Tactile graphics delivered with our device therefore seem applicable across different levels of visual experience, with no negative emotional consequences from mentally demanding spatial tasks. By going beyond performance-based assessment, our methodology can help to better target technological solutions in orientation and mobility protocols.

  20. Exploring Relationships between Students' Interaction and Learning with a Haptic Virtual Biomolecular Model

    ERIC Educational Resources Information Center

    Schonborn, Konrad J.; Bivall, Petter; Tibell, Lena A. E.

    2011-01-01

    This study explores tertiary students' interaction with a haptic virtual model representing the specific binding of two biomolecules, a core concept in molecular life science education. Twenty students assigned to a "haptics" (experimental) or "no-haptics" (control) condition performed a "docking" task where users sought the most favourable…

  1. Differential effects of non-informative vision and visual interference on haptic spatial processing

    PubMed Central

    van Rheede, Joram J.; Postma, Albert; Kappers, Astrid M. L.

    2008-01-01

    The primary purpose of this study was to examine the effects of non-informative vision and visual interference upon haptic spatial processing, which supposedly derives from an interaction between an allocentric and egocentric reference frame. To this end, a haptic parallelity task served as baseline to determine the participant-dependent biasing influence of the egocentric reference frame. As expected, large systematic participant-dependent deviations from veridicality were observed. In the second experiment we probed the effect of non-informative vision on the egocentric bias. Moreover, orienting mechanisms (gazing directions) were studied with respect to the presentation of haptic information in a specific hemispace. Non-informative vision proved to have a beneficial effect on haptic spatial processing. No effect of gazing direction or hemispace was observed. In the third experiment we investigated the effect of simultaneously presented interfering visual information on the haptic bias. Interfering visual information parametrically influenced haptic performance. The interplay of reference frames that subserves haptic spatial processing was found to be related to both the effects of non-informative vision and visual interference. These results suggest that spatial representations are influenced by direct cross-modal interactions; inter-participant differences in the haptic modality resulted in differential effects of the visual modality. PMID:18553074

  2. Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.

    PubMed

    Park, Chung Hyuk; Ryu, Eun-Seok; Howard, Ayanna M

    2015-01-01

This paper presents a haptic telepresence system that enables visually impaired users to explore visually rich locations such as art galleries and museums by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of these data as a tangible haptic experience has received little attention, especially in the context of telepresence for individuals with visual impairments. The proposed system therefore addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two haptic telepresence scenarios: mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments, offering an enhanced interactive experience in which they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.
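
    In its simplest form, haptic rendering from an RGB-D stream reduces to a penetration-depth force along the viewing axis: when the haptic cursor moves "deeper" than the depth value under it, the device pushes back. The sketch below is a hedged illustration with invented parameter values, not the paper's rendering pipeline.

```python
def surface_force(depth_map, cursor_xy, cursor_z, stiffness=0.8):
    """Penetration-based haptic rendering sketch for a depth image.

    depth_map: 2D list of depths (m) from the RGB-D sensor.
    cursor_xy: (x, y) pixel under the haptic cursor.
    cursor_z:  cursor depth along the viewing axis (m).
    Returns a restoring force magnitude (N) pushing the cursor back
    toward the sensor when it penetrates the remote surface.
    """
    x, y = cursor_xy
    penetration = cursor_z - depth_map[y][x]   # positive = inside surface
    return stiffness * penetration if penetration > 0 else 0.0
```

    Running this per haptic frame over a streamed depth map gives the "tangible point cloud" effect; a full system would also smooth the depth data and add friction and shading cues.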

  3. Size-Sensitive Perceptual Representations Underlie Visual and Haptic Object Recognition

    PubMed Central

    Craddock, Matt; Lawson, Rebecca

    2009-01-01

    A variety of similarities between visual and haptic object recognition suggests that the two modalities may share common representations. However, it is unclear whether such common representations preserve low-level perceptual features or whether transfer between vision and haptics is mediated by high-level, abstract representations. Two experiments used a sequential shape-matching task to examine the effects of size changes on unimodal and crossmodal visual and haptic object recognition. Participants felt or saw 3D plastic models of familiar objects. The two objects presented on a trial were either the same size or different sizes and were the same shape or different but similar shapes. Participants were told to ignore size changes and to match on shape alone. In Experiment 1, size changes on same-shape trials impaired performance similarly for both visual-to-visual and haptic-to-haptic shape matching. In Experiment 2, size changes impaired performance on both visual-to-haptic and haptic-to-visual shape matching and there was no interaction between the cost of size changes and direction of transfer. Together the unimodal and crossmodal matching results suggest that the same, size-specific perceptual representations underlie both visual and haptic object recognition, and indicate that crossmodal memory for objects must be at least partly based on common perceptual representations. PMID:19956685

  4. Learning of Temporal and Spatial Movement Aspects: A Comparison of Four Types of Haptic Control and Concurrent Visual Feedback.

    PubMed

    Rauter, Georg; Sigrist, Roland; Riener, Robert; Wolf, Peter

    2015-01-01

In the literature, the effectiveness of haptics for motor learning remains controversial. Haptics is believed to be effective for motor learning in general; however, different types of haptic control enhance different movement aspects. Thus, depending on the movement aspects of interest, one type of haptic control may be effective while another is not. In the current work, we therefore investigated whether and how different types of haptic controllers affect the learning of spatial and temporal movement aspects. In particular, haptic controllers that enforce active participation were expected to improve spatial aspects, while only haptic controllers that provide feedback about the task's velocity profile were expected to improve temporal aspects. In a study on learning a complex trunk-arm rowing task, the effect of training with four different types of haptic control was investigated: position control, path control, adaptive path control, and reactive path control. A fifth (control) group trained with concurrent visual augmented feedback. As hypothesized, the position controller was most effective for learning temporal movement aspects, while the path controller was most effective in teaching spatial movement aspects of the rowing task. Visual feedback was also effective for learning both temporal and spatial movement aspects.

  5. Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.

    PubMed

    Pinzon, David; Byrns, Simon; Zheng, Bin

    2016-08-01

Background: The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from attenuated in laparoscopy to completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the research questions that remain unanswered. Search strategy: An in-depth review of the literature was performed using PubMed. Results: A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified: advancements in one of the four components of haptic systems, evaluation of the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions: Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back, and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can increase performance and safety and improve training efficiency. © The Author(s) 2016.

  6. Haptic Foot Pedal: Influence of Shoe Type, Age, and Gender on Subjective Pulse Perception.

    PubMed

    Geitner, Claudia; Birrell, Stewart; Krehl, Claudia; Jennings, Paul

    2018-06-01

This study investigates the influence of shoe type (sneakers and safety boots), age, and gender on the perception of haptic pulse feedback provided by a prototype accelerator pedal in a running stationary vehicle. Haptic feedback can be a less distracting alternative to traditional visual and auditory in-vehicle feedback. However, to be effective, the device delivering the haptic feedback needs to be in contact with the person. Factors such as shoe type vary naturally over the seasons and could render feedback that is perceived well in one situation unnoticeable in another. In this study, we evaluate factors that can influence the subjective perception of haptic feedback in a stationary but running car: shoe type, age, and gender. Thirty-six drivers in three age groups (≤39, 40-59, and ≥60) took part. For each haptic pulse, participants rated intensity, urgency, and comfort via a questionnaire. The perception of the haptic feedback is significantly influenced by the interaction between the pulse's duration and force amplitude and the participant's age and gender, but not by shoe type. The results indicate that it is important to consider different age groups and genders in the evaluation of haptic feedback. Future research might also look into approaches for adapting haptic feedback to the individual driver's preferences. Findings from this study can be applied to the design of an accelerator pedal in a car, for example for a nonvisual in-vehicle warning, but also to the planning of user studies with a haptic pedal in general.

  7. KinoHaptics: An Automated, Wearable, Haptic Assisted, Physio-therapeutic System for Post-surgery Rehabilitation and Self-care.

    PubMed

    Rajanna, Vijay; Vo, Patrick; Barth, Jerry; Mjelde, Matthew; Grey, Trevor; Oduola, Cassandra; Hammond, Tracy

    2016-03-01

A carefully planned, structured, and supervised physiotherapy program following surgery is crucial for the successful diagnosis of physical injuries. Nearly 50% of surgeries fail due to unsupervised and erroneous physiotherapy. Retaining a physiotherapist for an extended period is expensive and sometimes inaccessible. Researchers have tried to leverage advancements in wearable sensors and motion tracking by building affordable, automated physio-therapeutic systems that direct a physiotherapy session by providing audio-visual feedback on the patient's performance. Many aspects of an automated physiotherapy program are yet to be addressed by existing systems: a wide classification of patients' physiological conditions to be diagnosed, multiple patient demographics (blind, deaf, etc.), and the need to persuade patients to adopt the system for an extended period for self-care. In our research, we have tried to address these aspects by building a health behavior change support system called KinoHaptics for post-surgery rehabilitation. KinoHaptics is an automated, wearable, haptic-assisted, physio-therapeutic system that can be used by a wide variety of demographics and for various physiological conditions. The system provides rich and accurate vibro-haptic feedback that can be felt by the user irrespective of physiological limitations, and it is built to ensure that no injuries are induced during the rehabilitation period. The persuasive nature of the system allows for personal goal-setting, progress tracking, and, most importantly, life-style compatibility. The system was evaluated under laboratory conditions with 14 users. Results show that KinoHaptics is highly convenient to use, and that the vibro-haptic feedback is intuitive, accurate, and has been shown to prevent accidental injuries. Results also show that KinoHaptics is persuasive in nature, as it supports behavior change and habit building. The successful acceptance of KinoHaptics, an automated, wearable, haptic-assisted, physio-therapeutic system, demonstrates the need for, and future scope of, automated physio-therapeutic systems for self-care and behavior change. It also shows that such systems, when incorporating vibro-haptic feedback, encourage strong adherence to the physiotherapy program and can have a profound impact on the physiotherapy experience, resulting in a higher acceptance rate.
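
    The injury-prevention behavior described above can be illustrated with a minimal sketch: vibration amplitude grows as a tracked joint angle leaves a configured safe exercise range. The safe-range bounds, scaling, and function name below are hypothetical, not KinoHaptics internals.

```python
def vibro_alert(angle_deg, safe_min=10.0, safe_max=120.0, max_amp=1.0):
    """Vibro-haptic cue sketch: amplitude (0..max_amp) proportional to
    how far a tracked joint angle has left a safe exercise range."""
    if angle_deg < safe_min:
        overshoot = safe_min - angle_deg
    elif angle_deg > safe_max:
        overshoot = angle_deg - safe_max
    else:
        return 0.0                       # inside the safe range: silent
    return min(max_amp, overshoot / 20.0)  # ramp up, then saturate
```

    Driving a wearable vibration motor with this amplitude gives the patient an immediate, non-visual warning before a movement becomes injurious.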

  8. A one degree of freedom haptic system to investigate issues in human perception with particular application to probing tissue.

    PubMed

    Dibble, Edward; Zivanovic, Aleksandar; Davies, Brian

    2004-01-01

This paper presents the results of several early studies on human haptic perception sensitivity when probing a virtual object. A 1-degree-of-freedom (DoF) rotary haptic system, designed and built for this purpose, is also presented. The experiments assessed the maximum forces applied in a minimally invasive surgery (MIS) procedure, quantified the compliance sensitivity threshold when probing virtual tissue, and identified the haptic system loop rate necessary for haptic feedback to feel realistic.

  9. fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages

    PubMed Central

    Hoffmann, Thomas J.; Laird, Nan M.

    2009-01-01

The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command-line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further assists the developer by loading the help files from the command-line functions to provide context-sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility. PMID:21625291
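
    fgui itself is an R package built on R's Tcl/Tk bindings, but the core idea, deriving a GUI automatically from a function's signature, can be sketched in Python via introspection. The example below is an analogue for illustration only (the `fit_model` function and its parameters are invented); it emits a widget specification rather than drawing actual Tcl/Tk widgets.

```python
import inspect

def gui_spec(func):
    """fgui-style introspection sketch: derive one labeled entry widget
    per parameter, pre-filled with the default value when one exists."""
    spec = []
    for name, p in inspect.signature(func).parameters.items():
        default = None if p.default is inspect.Parameter.empty else p.default
        spec.append({"label": name, "default": default})
    return spec

def fit_model(data, alpha=0.05, robust=False):
    """Hypothetical command-line function to wrap in a GUI."""
    pass
```

    Feeding each spec entry to a toolkit's label-plus-entry constructor reproduces fgui's "pass a function, get an interface" workflow.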

  10. Create a Logo.

    ERIC Educational Resources Information Center

    Duchen, Gail

    2002-01-01

    Presents an art lesson that introduced students to graphic art as a career path. Explains that the students met a graphic artist and created a logo for a pretend client. Explains that the students researched logos. (CMK)

  11. A new visual feedback-based magnetorheological haptic master for robot-assisted minimally invasive surgery

    NASA Astrophysics Data System (ADS)

    Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok

    2015-06-01

In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface that provides the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integral-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then rendered through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish the tumor from the organ more clearly and provide better visibility to the operator. The hue-saturation-value color space is adopted for the image processing since it is often more intuitive than other color spaces. The effects of the image processing and haptic feedback on surgery performance are then investigated through tumor-cutting experiments conducted under four operating conditions: haptic feedback on, haptic feedback off, image processing on, and image processing off. The experiments show that the performance index, which is a function of pixels, differs across the four operating conditions.
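
    Hue-saturation-value segmentation of the kind mentioned above can be sketched as a per-pixel threshold on hue and saturation. The "tumor-colored" band below (reddish hues, moderate saturation) and the function name are invented for illustration; the paper's actual thresholds are not given.

```python
import colorsys

def tumor_mask(rgb_pixels, s_min=0.5):
    """HSV thresholding sketch: flag pixels whose hue falls in a
    hypothetical reddish 'tumor' band with sufficient saturation."""
    mask = []
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        reddish = h <= 0.05 or h >= 0.95   # hue wraps around at 1.0
        mask.append(reddish and s >= s_min)
    return mask
```

    Working in HSV keeps the hue test stable under lighting changes that mostly affect the value channel, which is why HSV is often preferred over raw RGB thresholds for this kind of overlay.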

  12. Aging and solid shape recognition: Vision and haptics.

    PubMed

    Norman, J Farley; Cheeseman, Jacob R; Adkins, Olivia C; Cox, Andrea G; Rogers, Connor E; Dowell, Catherine J; Baxter, Michael W; Norman, Hideko F; Reyes, Cecia M

    2015-10-01

    The ability of 114 younger and older adults to recognize naturally-shaped objects was evaluated in three experiments. The participants viewed or haptically explored six randomly-chosen bell peppers (Capsicum annuum) in a study session and were later required to judge whether each of twelve bell peppers was "old" (previously presented during the study session) or "new" (not presented during the study session). When recognition memory was tested immediately after study, the younger adults' (Experiment 1) performance for vision and haptics was identical when the individual study objects were presented once. Vision became superior to haptics, however, when the individual study objects were presented multiple times. When 10- and 20-min delays (Experiment 2) were inserted in between study and test sessions, no significant differences occurred between vision and haptics: recognition performance in both modalities was comparable. When the recognition performance of older adults was evaluated (Experiment 3), a negative effect of age was found for visual shape recognition (younger adults' overall recognition performance was 60% higher). There was no age effect, however, for haptic shape recognition. The results of the present experiments indicate that the visual recognition of natural object shape is different from haptic recognition in multiple ways: visual shape recognition can be superior to that of haptics and is affected by aging, while haptic shape recognition is less accurate and unaffected by aging. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Haptic Paddle Enhancements and a Formal Assessment of Student Learning in System Dynamics

    ERIC Educational Resources Information Center

    Gorlewicz, Jenna L.; Kratchman, Louis B.; Webster, Robert J., III

    2014-01-01

    The haptic paddle is a force-feedback joystick used at several universities in teaching System Dynamics, a core mechanical engineering undergraduate course where students learn to model dynamic systems in several domains. A second goal of the haptic paddle is to increase the accessibility of robotics and haptics by providing a low-cost device for…

  14. Identification of walked-upon materials in auditory, kinesthetic, haptic, and audio-haptic conditions.

    PubMed

    Giordano, Bruno L; Visell, Yon; Yao, Hsin-Yun; Hayward, Vincent; Cooperstock, Jeremy R; McAdams, Stephen

    2012-05-01

    Locomotion generates multisensory information about walked-upon objects. How perceptual systems use such information to get to know the environment remains unexplored. The ability to identify solid (e.g., marble) and aggregate (e.g., gravel) walked-upon materials was investigated in auditory, haptic or audio-haptic conditions, and in a kinesthetic condition where tactile information was perturbed with a vibromechanical noise. Overall, identification performance was better than chance in all experimental conditions, for both solids and the better-identified aggregates. Despite large mechanical differences between the response of solids and aggregates to locomotion, for both material categories discrimination was at its worst in the auditory and kinesthetic conditions and at its best in the haptic and audio-haptic conditions. An analysis of the dominance of sensory information in the audio-haptic context supported a focus on the most accurate modality, haptics, but only for the identification of solid materials. When identifying aggregates, response biases appeared to produce a focus on the least accurate modality: kinesthesia. When walking on loose materials such as gravel, individuals do not perceive surfaces by focusing on the most accurate modality, but by focusing on the modality that would most promptly signal postural instabilities.

  15. Review of Designs for Haptic Data Visualization.

    PubMed

    Panëels, Sabrina; Roberts, Jonathan C

    2010-01-01

    There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the amount of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.

  16. Role of combined tactile and kinesthetic feedback in minimally invasive surgery.

    PubMed

    Lim, Soo-Chul; Lee, Hyung-Kew; Park, Joonah

    2014-10-18

    Haptic feedback is of critical importance in surgical tasks. However, conventional surgical robots do not provide haptic feedback to surgeons during surgery. Thus, in this study, a combined tactile and kinesthetic feedback system was developed to provide haptic feedback to surgeons during robotic surgery. To assess haptic feasibility, the effects of two types of haptic feedback were examined empirically - kinesthetic and tactile feedback - to measure object-pulling force with a telesurgery robotics system at two desired pulling forces (1 N and 2 N). Participants answered a set of questionnaires after experiments. The experimental results reveal reductions in force error (39.1% and 40.9%) when using haptic feedback during 1 N and 2 N pulling tasks. Moreover, survey analyses show the effectiveness of the haptic feedback during teleoperation. The combined tactile and kinesthetic feedback of the master device in robotic surgery improves the surgeon's ability to control the interaction force applied to the tissue. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Study on development of active-passive rehabilitation system for upper limbs: Hybrid-PLEMO

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Jin, Y.; Fukushima, K.; Akai, H.; Furusho, J.

    2009-02-01

    In recent years, many researchers have studied the potential of robotics technology to assist and quantify motor function in neuro-rehabilitation. Several kinds of haptic devices have been developed, and their efficacy has been evaluated in clinical tests, for example, upper-limb training for patients with spasticity after stroke. Active-type (motor-driven) haptic devices can realize a wide variety of haptic interactions, but they generally require a high-cost safety system. On the other hand, passive-type (brake-based) haptic devices are inherently safe, but the variety of haptics a passive system can render is strongly limited. There is not yet sufficient evidence to clarify how passive or active haptics affect the rehabilitation of motor skills. To address these problems, we developed an active-passive-switchable rehabilitation system with an ER clutch/brake device, named "Hybrid-PLEMO". In this paper, the basic structure and haptic control methods of the Hybrid-PLEMO are described.

  18. Mixed-reality simulation for neurosurgical procedures.

    PubMed

    Bova, Frank J; Rajon, Didier A; Friedman, William A; Murad, Gregory J; Hoh, Daniel J; Jacob, R Patrick; Lampotang, Samsun; Lizdas, David E; Lombard, Gwen; Lister, J Richard

    2013-10-01

    Surgical education is moving rapidly to the use of simulation for technical training of residents and maintenance or upgrading of surgical skills in clinical practice. To optimize the learning exercise, it is essential that both visual and haptic cues are presented to best present a real-world experience. Many systems attempt to achieve this goal through a total virtual interface. To demonstrate that the most critical aspect in optimizing a simulation experience is to provide the visual and haptic cues, allowing the training to fully mimic the real-world environment. Our approach has been to create a mixed-reality system consisting of a physical and a virtual component. A physical model of the head or spine is created with a 3-dimensional printer using deidentified patient data. The model is linked to a virtual radiographic system or an image guidance platform. A variety of surgical challenges can be presented in which the trainee must use the same anatomic and radiographic references required during actual surgical procedures. Using the aforementioned techniques, we have created simulators for ventriculostomy, percutaneous stereotactic lesion procedure for trigeminal neuralgia, and spinal instrumentation. The design and implementation of these platforms are presented. The system has provided the residents an opportunity to understand and appreciate the complex 3-dimensional anatomy of the 3 neurosurgical procedures simulated. The systems have also provided an opportunity to break procedures down into critical segments, allowing the user to concentrate on specific areas of deficiency.

  19. Grounded Learning Experience: Helping Students Learn Physics through Visuo-Haptic Priming and Instruction

    NASA Astrophysics Data System (ADS)

    Huang, Shih-Chieh Douglas

    In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation (visual modality and gestures) and visuo-haptic simulation (visual modality, gestures, and somatosensory information). A pilot study involving N = 23 college students examined how using different types of visuo-haptic representation in instruction affected people's mental model construction for physics systems. Participants' abilities to construct mental models were operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Findings from this pilot study revealed that, while both simulations significantly improved participants' mental model construction for physics systems, visuo-haptic simulation was significantly better than visuo-gestural simulation. In addition, clinical interviews suggested that participants' mental model construction for physics systems benefited from receiving visuo-haptic simulation in a tutorial prior to the instruction stage. A dissertation study involving N = 96 college students examined how types of visuo-haptic representation in different applications support participants' mental model construction for physics systems. Participants' abilities to construct mental models were again operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Participants' physics misconceptions were also measured before and after the grounded learning experience. Findings from this dissertation study not only revealed that visuo-haptic simulation was significantly more effective in promoting mental model construction and remedying participants' physics misconceptions than visuo-gestural simulation, they also revealed that visuo-haptic simulation was more effective during the priming stage than during the instruction stage. Interestingly, the effects of visuo-haptic simulation in priming and visuo-haptic simulation in instruction on participants' pretest-to-posttest gain scores for a basic physics system appeared additive. These results suggested that visuo-haptic simulation is effective in physics learning, especially when it is used during the priming stage.

  20. Haptic feedback for virtual assembly

    NASA Astrophysics Data System (ADS)

    Luecke, Greg R.; Zafer, Naci

    1998-12-01

    Assembly operations require high speed and precision at low cost. The manufacturing industry has recently turned its attention to investigating assembly procedures using graphical displays of CAD parts. For these tasks, some form of feedback to the person is invaluable in providing a real sense of interaction with virtual parts. This research develops the use of a commercial assembly robot as the haptic display in such tasks. For demonstration, a peg-hole insertion task is studied. Kane's method is employed to derive the dynamics of the peg and the contact motions between the peg and the hole. A handle modeled as a cylindrical peg is attached to the end effector of a PUMA 560 robotic arm, which is equipped with a six-axis force/torque transducer. The user grabs the handle and the user-applied forces are recorded. A 300 MHz Pentium computer is used to simulate the dynamics of the virtual peg and its interactions as it is inserted into the virtual hole. Computed torque control is then employed to convey the full dynamics of the task to the user's hand. Visual feedback is also incorporated to help the user insert the peg into the hole. Experimental results are presented for several contact configurations of this virtually simulated task.
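
The computed-torque law mentioned above can be illustrated for a single joint; the gains and scalar dynamics terms below are generic assumptions, not the PUMA 560 parameters used in the work.

```python
def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, G, kp=100.0, kd=20.0):
    """Single-joint computed-torque law: feedback-linearize, then PD on error.

    M, C, G are the (assumed known) inertia, damping/Coriolis, and gravity
    terms of the simulated peg dynamics; the gains kp and kd are illustrative.
    """
    e = q_des - q          # position error
    ed = qd_des - qd       # velocity error
    return M * (qdd_des + kd * ed + kp * e) + C * qd + G
```

Because the model terms cancel the plant dynamics, the closed-loop error behaves like a linear second-order system tuned by kp and kd, which is what lets the simulated peg dynamics be rendered to the user's hand.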

  1. The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review.

    PubMed

    van der Meijden, O A J; Schijven, M P

    2009-06-01

    Virtual reality (VR) as a surgical training tool has become a state-of-the-art technique in training and teaching skills for minimally invasive surgery (MIS). Although intuitively appealing, the true benefits of haptic (VR training) platforms are unknown. Many questions about haptic feedback in the different areas of surgical skills (training) need to be answered before adding costly haptic feedback in VR simulation for MIS training. This study was designed to review the current status and value of haptic feedback in conventional and robot-assisted MIS and training by using virtual reality simulation. A systematic review of the literature was undertaken using PubMed and MEDLINE. The following search terms were used: Haptic feedback OR Haptics OR Force feedback AND/OR Minimal Invasive Surgery AND/OR Minimal Access Surgery AND/OR Robotics AND/OR Robotic Surgery AND/OR Endoscopic Surgery AND/OR Virtual Reality AND/OR Simulation OR Surgical Training/Education. The results were assessed according to level of evidence as reflected by the Oxford Centre for Evidence-Based Medicine Levels of Evidence. In the current literature, no firm consensus exists on the importance of haptic feedback in performing minimally invasive surgery. Although the majority of the results show positive assessment of the benefits of force feedback, results are ambivalent and not unanimous on the subject. Benefits are least disputed when related to surgery using robotics, because there is no haptic feedback in currently used robotics. The addition of haptics is believed to reduce surgical errors resulting from a lack of it, especially in knot tying. Little research has been performed in the area of robot-assisted endoscopic surgical training, but results seem promising. Concerning VR training, results indicate that haptic feedback is important during the early phase of psychomotor skill acquisition.

  2. Different haptic tools reduce trunk velocity in the frontal plane during walking, but haptic anchors have advantages over lightly touching a railing.

    PubMed

    Hedayat, Isabel; Moraes, Renato; Lanovaz, Joel L; Oates, Alison R

    2017-06-01

    There are different ways to add haptic input during walking which may affect walking balance. This study compared the use of two different haptic tools (rigid railing and haptic anchors) and investigated whether any effects on walking were the result of the added sensory input and/or the posture generated when using those tools. Data from 28 young healthy adults were collected using the Mobility Lab inertial sensor system (APDM, Oregon, USA). Participants walked with and without both haptic tools and while pretending to use both haptic tools (placebo trials), with eyes open and eyes closed. Using the tools or pretending to use both tools decreased normalized stride velocity (p < .001-0.008) and peak medial-lateral (ML) trunk velocity (p < .001-0.001). Normalized stride velocity was slower when actually using the railing compared to placebo railing trials (p = .006). Using the anchors resulted in lower peak ML trunk velocity than the railing (p = .002). The anchors had lower peak ML trunk velocity than placebo anchors (p < .001), but there was no difference between railing and placebo railing (p > .999). These findings highlight a difference in the type of tool used to add haptic input and suggest that changes in balance control strategy resulting from using the railing are based on arm placement, whereas with the haptic anchors it is the posture combined with the added sensory input that affects balance control strategies. These findings provide a strong framework for additional research on the effects of haptic input on walking in populations known to have decreased walking balance.

  3. Haptic device development based on electro static force of cellulose electro active paper

    NASA Astrophysics Data System (ADS)

    Yun, Gyu-young; Kim, Sang-Youn; Jang, Sang-Dong; Kim, Dong-Gu; Kim, Jaehwan

    2011-04-01

    Haptic devices are well suited to demanding virtual-reality applications such as medical equipment, mobile devices, online marketing, and so on. Many concepts for haptic devices have been proposed to meet industrial demand. Cellulose has received much attention as an emerging smart material known as electro-active paper (EAPap). EAPap is attractive for mobile haptic devices due to its unique characteristics: low actuation power, suitability for thin devices, and transparency. In this paper, we suggest a new haptic actuator concept using cellulose EAPap and evaluate its performance under various actuation conditions. The cellulose electrostatic-force actuator shows a large output displacement and fast response, making it suitable for mobile haptic devices.

  4. Detection thresholds for small haptic effects

    NASA Astrophysics Data System (ADS)

    Dosher, Jesse A.; Hannaford, Blake

    2002-02-01

    We are interested in finding out whether or not haptic interfaces will be useful in portable and handheld devices. Such systems will have severe constraints on force output. Our first step is to investigate the lower limits at which haptic effects can be perceived. In this paper we report on experiments studying the effects of varying the amplitude, size, shape, and pulse duration of a haptic feature. Using a specific haptic device, we measured the smallest detectable haptic effects, with active exploration of saw-tooth shaped icons sized 3, 4 and 5 mm, a sine-shaped icon 5 mm wide, and static pulses 50, 100, and 150 ms in width. Smooth-shaped icons resulted in a detection threshold of approximately 55 mN, almost twice that of saw-tooth shaped icons, which had a threshold of 31 mN.

  5. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    PubMed

    Danieau, F; Lécuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  6. Haptic interface of web-based training system for interventional radiology procedures

    NASA Astrophysics Data System (ADS)

    Ma, Xin; Lu, Yiping; Loe, KiaFock; Nowinski, Wieslaw L.

    2004-05-01

    The existing web-based medical training systems and surgical simulators can provide an affordable and accessible medical training curriculum, but they seldom offer the trainee realistic and affordable haptic feedback. Therefore, they cannot offer the trainee a suitable practicing environment. In this paper, a haptic solution for interventional radiology (IR) procedures is proposed. The system architecture of a web-based training system for IR procedures is briefly presented first. Then, the mechanical structure, the working principle, and the application of a haptic device are discussed in detail. The haptic device works as an interface between the training environment and the trainees and is placed at the end-user side. With the system, the user can be trained on interventional radiology procedures - navigating catheters, inflating balloons, deploying coils, and placing stents - over the web while receiving surgical haptic feedback in real time.

  7. Investigating Students' Ideas About Buoyancy and the Influence of Haptic Feedback

    NASA Astrophysics Data System (ADS)

    Minogue, James; Borland, David

    2016-04-01

    While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of everyday experiences, a scientifically sound explanation of buoyancy remains difficult for many to construct. It requires the integration of domain-specific knowledge regarding density, fluid, force, gravity, mass, weight, and buoyancy. Prior studies suggest that novices often focus on only one dimension of the sinking and floating phenomenon. Our HES was designed to promote the integration of the subconcepts of density and buoyant forces and stresses the relationship between the object itself and the surrounding fluid. The study employed a randomized pretest-posttest control group research design and a suite of measures including an open-ended prompt and objective content questions to provide insights into the influence of haptic feedback on undergraduate students' thinking about buoyancy. A convenience sample (n = 40) was drawn from a university's population of undergraduate elementary education majors. Two groups were formed: haptic feedback (n = 22) and no haptic feedback (n = 18). Through content analysis, discernible differences were seen in the posttest explanations of sinking and floating across treatment groups. Learners who experienced the haptic feedback made more frequent use of "haptically grounded" terms (e.g., mass, gravity, buoyant force, pushing), leading us to begin to build a local theory of language-mediated haptic cognition.

  8. Development of visuo-haptic transfer for object recognition in typical preschool and school-aged children.

    PubMed

    Purpura, Giulia; Cioni, Giovanni; Tinelli, Francesca

    2018-07-01

    Object recognition is a long and complex adaptive process, and its full maturation requires the combination of many different sensory experiences as well as the cognitive ability to manipulate previous experiences in order to develop new percepts and subsequently to learn from the environment. It is well recognized that the transfer of visual and haptic information facilitates object recognition in adults, but less is known about the development of this ability. In this study, we explored the developmental course of object recognition capacity in children using unimodal visual information, unimodal haptic information, and visuo-haptic information transfer in children from 4 years to 10 years and 11 months of age. Participants were tested through a clinical protocol, involving visual exploration of black-and-white photographs of common objects, haptic exploration of real objects, and visuo-haptic transfer of these two types of information. Results show an age-dependent development of object recognition abilities for visual, haptic, and visuo-haptic modalities. A significant effect of time on the development of unimodal and crossmodal recognition skills was found. Moreover, our data suggest that multisensory processes for common object recognition are active at 4 years of age. They facilitate recognition of common objects, and, although not fully mature, are significant in adaptive behavior from the first years of life. The study of the typical development of visuo-haptic processes in childhood is a starting point for future studies regarding object recognition in impaired populations.

  9. Visual and Haptic Shape Processing in the Human Brain: Unisensory Processing, Multisensory Convergence, and Top-Down Influences.

    PubMed

    Lee Masson, Haemy; Bulthé, Jessica; Op de Beeck, Hans P; Wallraven, Christian

    2016-08-01

    Humans are highly adept at multisensory processing of object shape in both vision and touch. Previous studies have mostly focused on where visually perceived object-shape information can be decoded, with haptic shape processing receiving less attention. Here, we investigate visuo-haptic shape processing in the human brain using multivoxel correlation analyses. Importantly, we use tangible, parametrically defined novel objects as stimuli. Two groups of participants first performed either a visual or haptic similarity-judgment task. The resulting perceptual object-shape spaces were highly similar and matched the physical parameter space. In a subsequent fMRI experiment, objects were first compared within the learned modality and then in the other modality in a one-back task. When correlating neural similarity spaces with perceptual spaces, visually perceived shape was decoded well in the occipital lobe along with the ventral pathway, whereas haptically perceived shape information was mainly found in the parietal lobe, including frontal cortex. Interestingly, ventrolateral occipito-temporal cortex decoded shape in both modalities, highlighting this as an area capable of detailed visuo-haptic shape processing. Finally, we found haptic shape representations in early visual cortex (in the absence of visual input), when participants switched from visual to haptic exploration, suggesting top-down involvement of visual imagery on haptic shape processing. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.

  11. Haptic Exploration in Humans and Machines: Attribute Integration and Machine Recognition/Implementation.

    DTIC Science & Technology

    1988-04-30

    Keywords: haptic, hand, touch, vision, robot, object recognition, categorization. Abstract (excerpt): ...established that the haptic system has remarkable capabilities for object recognition. We define haptics as purposive touch. The basic tactual system... gathered ratings of the importance of dimensions for categorizing common objects by touch. Texture and hardness ratings strongly co-vary, which is...

  12. Shifty: A Weight-Shifting Dynamic Passive Haptic Proxy to Enhance Object Perception in Virtual Reality.

    PubMed

    Zenner, André; Krüger, Antonio

    2017-04-01

    We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing the weight-shifting physical DPHF proxy object Shifty. This concept combines actuators known from active haptics and physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. We then investigate how Shifty can, by automatically changing its internal weight distribution, enhance the user's perception of virtual objects interacted with in two experiments. In a first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness. Here, Shifty was shown to increase the user's fun and perceived realism significantly, compared to an equivalent passive haptic proxy. In a second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight and thus the perceived realism by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for visual-haptic mismatch perceived during the shifting process.

  13. Virtual wall-based haptic-guided teleoperated surgical robotic system for single-port brain tumor removal surgery.

    PubMed

    Seung, Sungmin; Choi, Hongseok; Jang, Jongseong; Kim, Young Soo; Park, Jong-Oh; Park, Sukho; Ko, Seong Young

    2017-01-01

    This article presents haptic-guided teleoperation for a tumor removal surgical robotic system, called the SIROMAN system. The system was developed in our previous work to make it possible to access tumor tissue, even tissue seated deep inside the brain, and to remove it with full maneuverability. For a safe and accurate operation that removes only tumor tissue completely while minimizing damage to normal tissue, virtual wall-based haptic guidance together with medical image-guided control is proposed and developed. The virtual wall is extracted from preoperative medical images, and the robot is controlled to restrict its motion within the virtual wall using haptic feedback. Coordinate transformation between sub-systems, a collision detection algorithm, and haptic-guided teleoperation using a virtual wall are described in the context of SIROMAN. A series of experiments using a simplified virtual wall are performed to evaluate the performance of virtual wall-based haptic-guided teleoperation. With haptic guidance, the accuracy of the robotic manipulator's trajectory is improved by 57% compared to teleoperation without it. The tissue removal performance is also improved by 21% (p < 0.05). The experiments show that virtual wall-based haptic guidance provides safer and more accurate tissue removal for single-port brain surgery.
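
A penalty-based virtual wall of the kind described above can be sketched as follows; the planar wall representation and the stiffness value are assumptions for illustration, not details of the SIROMAN implementation.

```python
import numpy as np

def virtual_wall_force(tool_pos, wall_point, wall_normal, stiffness=500.0):
    """Penalty-based virtual wall: push the tool back when it penetrates.

    The wall is a plane through wall_point with wall_normal pointing toward
    the allowed side; the stiffness (N/m) is an illustrative value.
    """
    normal = np.asarray(wall_normal, dtype=float)
    # Signed distance of the tool tip from the wall plane (< 0: penetration).
    d = float(np.dot(np.asarray(tool_pos, dtype=float)
                     - np.asarray(wall_point, dtype=float), normal))
    if d >= 0.0:
        return np.zeros(3)          # free motion: no guidance force
    return -stiffness * d * normal  # spring force along the wall normal
```

In a full system the penetration depth would come from a collision check against the wall mesh extracted from preoperative images, and the resulting force would be rendered on the haptic master.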

  14. Haptograph Representation of Real-World Haptic Information by Wideband Force Control

    NASA Astrophysics Data System (ADS)

    Katsura, Seiichiro; Irie, Kouhei; Ohishi, Kiyoshi

    Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means; likewise, a video camera and a television make it possible to transmit visual sensation by broadcasting. By contrast, since tactile or haptic information is subject to Newton's "law of action and reaction" in the real world, a device that acquires, transmits, and reproduces this information has not been established. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel acquisition method for haptic information, named the "haptograph", which visualizes haptic information much as a photograph visualizes a scene. The proposed haptograph is applied to haptic recognition of the contact environment. A linear motor contacts the surface of the environment, and its reaction force is used to make a haptograph. Robust contact motion and sensorless sensing of the reaction force are attained by using a disturbance observer. As a result, an encyclopedia of contact environments is obtained. Since temporal and spatial analyses are conducted to represent the haptic information as a haptograph, it can be recognized and evaluated intuitively.
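
The sensorless force estimation mentioned above rests on a disturbance observer; a first-order observer for a one-degree-of-freedom motor can be sketched as below. The inertia, cutoff, and update form are generic assumptions, not the authors' implementation.

```python
def make_disturbance_observer(J, g, dt):
    """First-order disturbance observer for a 1-DOF motor: J*dw/dt = tau - tau_dis.

    g is the observer cutoff (rad/s) and dt the sample time; all values are
    generic assumptions. Returns a stateful estimator: feed it (tau_ref, omega)
    each sample and it returns the estimated disturbance/reaction torque.
    """
    state = {"z": 0.0}  # low-pass filter state

    def update(tau_ref, omega):
        # Standard DOB form: z' = -g*z + g*(tau_ref + g*J*omega), with
        # tau_dis_hat = z - g*J*omega, which avoids differentiating omega.
        z = state["z"] + dt * (-g * state["z"] + g * (tau_ref + g * J * omega))
        state["z"] = z
        return z - g * J * omega

    return update
```

The estimate converges to the true disturbance with time constant 1/g, so the observer bandwidth directly limits how wideband the acquired haptic signal can be.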

  15. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    PubMed

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
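
    Of the four steps, the color-based hand segmentation is the easiest to sketch. The rule below is a generic fixed-threshold skin classifier, offered only as an assumed baseline; the paper's actual segmentation and thresholds may differ.

```python
import numpy as np

def skin_mask(rgb):
    """Crude color-based skin segmentation for hand compositing.

    Applies a fixed RGB rule (reddish, moderately bright, with enough
    red-blue spread) per pixel. `rgb` is an (..., 3) uint8-like array;
    the result is a boolean mask of candidate hand pixels.
    """
    rgb = np.asarray(rgb, dtype=np.int16)      # avoid uint8 underflow
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (g > b) & (r - b > 15)
```

    In the full pipeline, a mask like this would be cleaned up morphologically and combined with the tracking-based device mask before misalignment-free compositing.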

  16. Development of a StandAlone Surgical Haptic Arm.

    PubMed

    Jones, Daniel; Lewis, Andrew; Fischer, Gregory S

    2011-01-01

    When performing telesurgery with current commercially available Minimally Invasive Robotic Surgery (MIRS) systems, a surgeon cannot feel the tool interactions that are inherent in traditional laparoscopy. It is proposed that haptic feedback in the control of MIRS systems could improve the speed, safety, and learning curve of robotic surgery. To test this hypothesis, a standalone surgical haptic arm (SASHA) capable of manipulating da Vinci tools, and of providing the information needed for haptic feedback, has been designed and fabricated. The arm was developed as a research platform for developing and evaluating approaches to telesurgery, including various haptic mappings between master and slave and the effects of latency.

  17. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction, in which people in different places interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where the object's behavior is determined by the combination of those inputs. The issues addressed in this research are: 1) the effects of haptics on collaborative interaction, and 2) the possibilities of collaboration between users in different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are two forms of interaction: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe physical laws while constructing a dollhouse from existing building blocks under gravity. In Virtual Stretcher, multiple users can collaborate on carrying a stretcher together while feeling each other's haptic motions.

  18. Palpation simulator with stable haptic feedback.

    PubMed

    Kim, Sang-Youn; Ryu, Jee-Hwan; Lee, WooJeong

    2015-01-01

    The main difficulty in constructing palpation simulators is computing and generating stable, realistic haptic feedback without vibration. When a user haptically interacts with highly non-homogeneous soft tissues through a palpation simulator, a sudden change of stiffness in the target tissues causes unstable interaction with the object. We propose a model consisting of a virtual adjustable damper and an energy measuring element. The energy measuring element gauges the energy stored in the palpation simulator, and the virtual adjustable damper dissipates that energy to achieve stable haptic interaction. To investigate the haptic behavior of the proposed method, impulse and continuous inputs are applied to the target tissues. When the haptic interface point meets the hardest portion of target tissues modeled with a conventional method, we observe unstable motion and feedback force. However, when the target tissues are modeled with the proposed method, the palpation simulator provides stable interaction without vibration. The proposed method overcomes a problem of conventional haptic palpation simulators, in which unstable force or vibration can be generated when there is a large discrepancy in material properties between an element and its neighboring elements in the target tissues.
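
    The energy-measuring element plus adjustable damper can be sketched along the lines of time-domain passivity control: track the energy exchanged at the haptic port, and whenever it goes negative (the simulator is generating energy, the precursor of vibration), inject just enough damping to dissipate the excess. This is a generic passivity-style sketch, not the paper's exact element:

```python
class PassivityController:
    """Energy observer with a virtual adjustable damper (sketch).

    Accumulates the energy flowing through the haptic port each sample.
    If the running total becomes negative, the port is producing energy,
    so a damping term sized to cancel exactly that excess is added to
    the output force, restoring the energy balance to zero.
    """
    def __init__(self, dt):
        self.dt = dt
        self.energy = 0.0

    def damp(self, force, velocity):
        self.energy += force * velocity * self.dt
        if self.energy >= 0.0 or abs(velocity) < 1e-9:
            return force                        # port is passive: no change
        b = -self.energy / (velocity**2 * self.dt)   # adjustable damping
        self.energy = 0.0                       # the damper absorbs the excess
        return force + b * velocity
```

    Because the damping is recomputed every sample, the correction vanishes while the rendered tissue behaves passively and switches on only at abrupt stiffness transitions, which is what suppresses the vibration described above.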

  19. Functional specialization and convergence in the occipito-temporal cortex supporting haptic and visual identification of human faces and body parts: an fMRI study.

    PubMed

    Kitada, Ryo; Johnsrude, Ingrid S; Kochiyama, Takanori; Lederman, Susan J

    2009-10-01

    Humans can recognize common objects by touch extremely well whenever vision is unavailable. Despite its importance to a thorough understanding of human object recognition, the neuroscientific study of this topic has been relatively neglected. To date, the few published studies have addressed the haptic recognition of nonbiological objects. We now focus on haptic recognition of the human body, a particularly salient object category for touch. Neuroimaging studies demonstrate that regions of the occipito-temporal cortex are specialized for visual perception of faces (fusiform face area, FFA) and other body parts (extrastriate body area, EBA). Are the same category-sensitive regions activated when these components of the body are recognized haptically? Here, we use fMRI to compare brain organization for haptic and visual recognition of human body parts. Sixteen subjects identified exemplars of faces, hands, feet, and nonbiological control objects using vision and haptics separately. We identified two discrete regions within the fusiform gyrus (FFA and the haptic face region) that were each sensitive to both haptically and visually presented faces; however, these two regions differed significantly in their response patterns. Similarly, two regions within the lateral occipito-temporal area (EBA and the haptic body region) were each sensitive to body parts in both modalities, although the response patterns differed. Thus, although the fusiform gyrus and the lateral occipito-temporal cortex appear to exhibit modality-independent, category-sensitive activity, our results also indicate a degree of functional specialization related to sensory modality within these structures.

  20. Solid shape discrimination from vision and haptics: natural objects (Capsicum annuum) and Gibson's "feelies".

    PubMed

    Norman, J Farley; Phillips, Flip; Holmin, Jessica S; Norman, Hideko F; Beers, Amanda M; Boswell, Alexandria M; Cheeseman, Jacob R; Stethen, Angela G; Ronning, Cecilia

    2012-10-01

    A set of three experiments evaluated 96 participants' ability to visually and haptically discriminate solid object shape. In the past, some researchers have found haptic shape discrimination to be substantially inferior to visual shape discrimination, while other researchers have found haptics and vision to be essentially equivalent. A primary goal of the present study was to understand these discrepant past findings and to determine the true capabilities of the haptic system. All experiments used the same task (same vs. different shape discrimination) and stimulus objects (James Gibson's "feelies" and a set of naturally shaped objects--bell peppers). However, the methodology varied across experiments. Experiment 1 used random 3-dimensional (3-D) orientations of the stimulus objects, and the conditions were full-cue (active manipulation of objects and rotation of the visual objects in depth). Experiment 2 restricted the 3-D orientations of the stimulus objects and limited the haptic and visual information available to the participants. Experiment 3 compared restricted and full-cue conditions using random 3-D orientations. We replicated both previous findings in the current study. When we restricted visual and haptic information (and placed the stimulus objects in the same orientation on every trial), the participants' visual performance was superior to that obtained for haptics (replicating the earlier findings of Davidson et al. in Percept Psychophys 15(3):539-543, 1974). When the circumstances resembled those of ordinary life (e.g., participants able to actively manipulate objects and see them from a variety of perspectives), we found no significant difference between visual and haptic solid shape discrimination.

  1. A study on haptic collaborative game in shared virtual environment

    NASA Astrophysics Data System (ADS)

    Lu, Keke; Liu, Guanyang; Liu, Lingzhi

    2013-03-01

    A study of a collaborative game in a shared virtual environment with haptic feedback over computer networks is introduced in this paper. A collaborative task was used in which players located at remote sites played the game together. Unlike traditional networked multiplayer games, players receive both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only, and visual-haptic feedback. The goal of the experiment was to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback enhances performance in collaborative games in shared virtual environments. The outcomes of this research can have a powerful impact on networked computer games.

  2. A recursive Bayesian updating model of haptic stiffness perception.

    PubMed

    Wu, Bing; Klatzky, Roberta L

    2018-06-01

    Stiffness of many materials follows Hooke's Law, but the mechanism underlying the haptic perception of stiffness is not as simple as it seems in the physical definition. The present experiments support a model by which stiffness perception is adaptively updated during dynamic interaction. Participants actively explored virtual springs and estimated their stiffness relative to a reference. The stimuli were simulations of linear springs or nonlinear springs created by modulating a linear counterpart with low-amplitude, half-cycle (Experiment 1) or full-cycle (Experiment 2) sinusoidal force. Experiment 1 showed that subjective stiffness increased (decreased) as a linear spring was positively (negatively) modulated by a half-sinewave force. In Experiment 2, an opposite pattern was observed for full-sinewave modulations. Modeling showed that the results were best described by an adaptive process that sequentially and recursively updated an estimate of stiffness using the force and displacement information sampled over trajectory and time. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
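
    The adaptive process the modeling favors can be sketched as a scalar Kalman-style recursion: treat stiffness as a Gaussian belief and update it with each (displacement, force) sample under f = k·x + noise. The class name, priors, and noise level below are illustrative, not the parameters fitted in the paper:

```python
class StiffnessEstimator:
    """Recursive Bayesian update of perceived stiffness (sketch).

    Maintains a Gaussian belief over stiffness k and refines it after
    each (displacement x, force f) sample, assuming f = k*x plus
    Gaussian noise: the scalar Kalman filter for a static state.
    """
    def __init__(self, k0=100.0, var0=1e4, noise_var=1.0):
        self.k = k0            # current stiffness estimate (N/m)
        self.var = var0        # uncertainty of the estimate
        self.noise_var = noise_var

    def update(self, x, f):
        if abs(x) < 1e-9:
            return self.k                      # no displacement: nothing to learn
        gain = self.var * x / (x**2 * self.var + self.noise_var)
        self.k += gain * (f - self.k * x)      # correct by prediction error
        self.var *= 1.0 - gain * x             # belief sharpens with each sample
        return self.k
```

    Fed a stream of consistent samples, the estimate converges on the true stiffness while its variance shrinks; transient force modulations, like the sinusoidal ones in the experiments, bias the running estimate up or down along the way.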

  3. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Yong-Moo, Kwon

    This paper presents a comparative study of network collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared across several experiments. The user tests on our system cover several cases: 1) comparison between non-haptic and haptic collaborative interaction over LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.

  4. A “virtually minimal” visuo-haptic training of attention in severe traumatic brain injury

    PubMed Central

    2013-01-01

    Background Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. Methods We designed a “virtually minimal” approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. 21 inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. Results The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Conclusions Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing. PMID:23938101

  5. A "virtually minimal" visuo-haptic training of attention in severe traumatic brain injury.

    PubMed

    Dvorkin, Assaf Y; Ramaiya, Milan; Larson, Eric B; Zollman, Felise S; Hsu, Nancy; Pacini, Sonia; Shah, Amit; Patton, James L

    2013-08-09

    Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. We designed a "virtually minimal" approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. 21 inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrant further and more prolonged clinical testing.

  6. Development of Velocity Guidance Assistance System by Haptic Accelerator Pedal Reaction Force Control

    NASA Astrophysics Data System (ADS)

    Yin, Feilong; Hayashi, Ryuzo; Raksincharoensak, Pongsathorn; Nagai, Masao

    This research proposes a haptic velocity guidance assistance system for realizing eco-driving and enhancing traffic capacity in cooperation with ITS (Intelligent Transportation Systems). The proposed guidance system computes the desired accelerator pedal (abbreviated as pedal) stroke from the desired velocity obtained from the ITS, taking vehicle dynamics into account, and conveys the desired pedal stroke to the driver via a haptic pedal with controllable reaction force, guiding the driver to trace the desired velocity in real time. The main purpose of this paper is to discuss the feasibility of haptic velocity guidance. A research prototype is developed on the Driving Simulator of TUAT (DS) by attaching a low-inertia, low-friction motor to the pedal, which preserves the pedal's original characteristics when the motor is not actuated, and by implementing the desired pedal stroke calculation and the reaction force controller. The haptic guidance maneuver is designed based on human pedal-stepping experiments. A simple velocity profile with acceleration, deceleration, and cruising, synthesized from naturalistic driving, is used to test the proposed system. The experimental results from 9 drivers show that the haptic guidance provides high accuracy and quick response in velocity tracking. These results suggest that haptic guidance is a promising velocity guidance method from the viewpoint of the HMI (Human Machine Interface).
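
    The reaction force controller can be sketched as the pedal's ordinary return spring plus a guidance term that stiffens the pedal once the driver presses past the desired stroke. The gains and the one-sided guidance law are illustrative assumptions, not the TUAT controller designed from the pedal-stepping experiments:

```python
def pedal_reaction_force(stroke, stroke_des, k_spring=40.0, k_guide=25.0):
    """Haptic accelerator pedal reaction force (sketch).

    `stroke` and `stroke_des` are pedal strokes in meters. The base
    term is the ordinary return-spring force; the guidance term adds
    extra resistance only when the driver overshoots the desired
    stroke, nudging the pedal back toward it. Returns newtons.
    """
    base = k_spring * stroke
    extra = k_guide * max(0.0, stroke - stroke_des)  # resist overshoot only
    return base + extra
```

    With the defaults, pressing 20 mm past the desired stroke adds 0.5 N on top of the spring force; tracking the desired velocity then amounts to holding the pedal where the extra resistance begins.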

  7. Haptic perception accuracy depending on self-produced movement.

    PubMed

    Park, Chulwook; Kim, Seonjin

    2014-01-01

    This study examined whether self-produced movement influences haptic perception ability (Experiment 1) and which factors are associated with the level of influence (Experiment 2) in racket sports. For Experiment 1, the haptic perception accuracy of five male table tennis experts and five male novices was examined under two conditions (no movement vs. movement). For Experiment 2, the haptic afferent subsystems of five male table tennis experts and five male novices were investigated in only the self-produced, movement-coupled condition. Inferential statistics (ANOVA, t-tests) and custom-made devices (a shock and vibration sensor, Qualisys Track Manager) were used to determine haptic perception accuracy (Experiments 1 and 2) and its association with expertise. The results show that expert-level players achieve higher accuracy with less variability (racket vibration and angle) than novice-level players, especially in self-produced, movement-coupled performance. The important finding is that the skill-associated differences in accuracy were enlarged during self-produced movement. The functional variability of haptic afferent subsystems can serve as a reference in explaining the origin of this difference between experts and novices. These two factors (self-produced accuracy and the variability of haptic features) would be useful criteria for educators in racket sports, and suggest a broader hypothesis for further research into the effects of haptic accuracy in relation to variability.

  8. Mechatronic design of haptic forceps for robotic surgery.

    PubMed

    Rizun, P; Gunn, D; Cox, B; Sutherland, G

    2006-12-01

    Haptic feedback increases operator performance and comfort during telerobotic manipulation. Feedback of grasping pressure is critical in many microsurgical tasks, yet no haptic interface for surgical tools is commercially available. Literature on the psychophysics of touch was reviewed to define the spectrum of human touch perception and the fidelity requirements of an ideal haptic interface. Mechanical design and control literature was reviewed to translate the psychophysical requirements into engineering specifications. High-fidelity haptic forceps were then developed through an iterative process between engineering and surgery. The forceps are a modular device that integrates with a haptic hand controller to add force feedback for tool actuation in telerobotic or virtual surgery. Their overall length is 153 mm and their mass is 125 g. A contact-free voice coil actuator generates force feedback at frequencies up to 800 Hz. Maximum force output is 6 N (2 N continuous) and the force resolution is 4 mN. The forceps employ a contact-free magnetic position sensor as well as micro-machined accelerometers to measure opening/closing acceleration. Position resolution is 0.6 μm with 1.3 μm RMS noise. The forceps can simulate stiffness greater than 20 N/mm or impedances smaller than 15 g with no noticeable haptic artifacts or friction. As telerobotic surgery evolves, haptics will play an increasingly important role. Copyright 2006 John Wiley & Sons, Ltd.

  9. What you can't feel won't hurt you: Evaluating haptic hardware using a haptic contrast sensitivity function.

    PubMed

    Salisbury, C M; Gillespie, R B; Tan, H Z; Barbagli, F; Salisbury, J K

    2011-01-01

    In this paper, we extend the concept of the contrast sensitivity function - used to evaluate video projectors - to the evaluation of haptic devices. We propose using human observers to determine if vibrations rendered using a given haptic device are accompanied by artifacts detectable to humans. This determination produces a performance measure that carries particular relevance to applications involving texture rendering. For cases in which a device produces detectable artifacts, we have developed a protocol that localizes deficiencies in device design and/or hardware implementation. In this paper, we present results from human vibration detection experiments carried out using three commercial haptic devices and one high performance voice coil motor. We found that all three commercial devices produced perceptible artifacts when rendering vibrations near human detection thresholds. Our protocol allowed us to pinpoint the deficiencies, however, and we were able to show that minor modifications to the haptic hardware were sufficient to make these devices well suited for rendering vibrations, and by extension, the vibratory components of textures. We generalize our findings to provide quantitative design guidelines that ensure the ability of haptic devices to proficiently render the vibratory components of textures.

  10. Learning, retention, and generalization of haptic categories

    NASA Astrophysics Data System (ADS)

    Do, Phuong T.

    This dissertation explored how haptic concepts are learned, retained, and generalized to the same or a different modality. Participants learned to classify objects into three categories either visually or haptically via different training procedures, followed by an immediate or delayed transfer test. Experiment I involved visual versus haptic learning and transfer. Intermodal matching between vision and haptics was investigated in Experiment II. Experiments III and IV examined intersensory conflict in within- and between-category bimodal situations to determine the degree of perceptual dominance between sight and touch. Experiment V explored the intramodal relationship between similarity and categorization in a psychological space, as revealed by MDS analysis of similarity judgments. The major findings were: (1) visual examination resulted in relatively higher performance accuracy than haptic learning; (2) systematic training produced better category learning of haptic concepts across all modality conditions; (3) the category prototypes were rated newer than any transfer stimulus following learning, both immediately and after a week's delay; and (4) although they converged at the apex of two transformational trajectories, the category prototypes became more central to their respective categories and increasingly structured as a function of learning. Implications for theories of multimodal similarity and categorization behavior are discussed in terms of discrimination learning, sensory integration, and dominance relations.

  11. A Multi-Finger Interface with MR Actuators for Haptic Applications.

    PubMed

    Qin, Huanhuan; Song, Aiguo; Gao, Zhan; Liu, Yuqing; Jiang, Guohua

    2018-01-01

    Haptic devices with multi-finger input are highly desirable for providing realistic and natural feelings when interacting with remote or virtual environments. Compared with conventional actuators, MR (magneto-rheological) actuators are preferable options in haptics because of their larger passive torque and torque-to-volume ratios. Most existing haptic MR actuators, however, are still bulky and heavy; smaller and lighter actuators would be more suitable for haptics. In this paper, a small-scale yet powerful MR actuator was designed to build a multi-finger interface for a 6-DOF haptic device. A compact structure was achieved by adopting a multi-disc configuration. Based on this configuration, the MR actuator can generate a maximum torque of 480 N·mm within dimensions of only 36 mm diameter and 18 mm height. Performance evaluation showed that it exhibits a relatively high dynamic range and good response characteristics compared with other haptic MR actuators. The multi-finger interface is equipped with three MR actuators and can provide up to 8 N of passive force to each of the thumb, index, and middle fingers. An application example demonstrates the effectiveness and potential of this new MR actuator based interface.
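
    Why the multi-disc configuration packs so much torque into a small volume can be sketched with the standard Bingham-plastic model: each disc face contributes a yield torque from the field-induced shear stress, and the faces add up. The function name, yield stress, and radii below are illustrative, not the actuator's measured values:

```python
import math

def mr_disc_torque(tau_y, r_in, r_out, n_surfaces):
    """Field-induced yield torque of a multi-disc MR actuator (sketch).

    Integrating a uniform yield shear stress tau_y (Pa) over one
    annular shear surface gives T = (2*pi/3) * tau_y * (r_out^3 - r_in^3);
    a multi-disc stack multiplies this by the number of active shear
    surfaces. Viscous and friction torques are omitted.
    """
    per_surface = (2.0 * math.pi / 3.0) * tau_y * (r_out**3 - r_in**3)
    return n_surfaces * per_surface
```

    Doubling the number of discs roughly doubles the torque without enlarging the radius, which is how a 36 mm actuator can reach hundreds of N·mm.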

  12. Roughness based perceptual analysis towards digital skin imaging system with haptic feedback.

    PubMed

    Kim, K

    2016-08-01

    When examining psoriasis or atopic eczema, analyzing skin roughness by palpation is essential for precise diagnosis of skin diseases. However, optical-sensor-based skin imaging systems do not allow dermatologists to touch skin images. To solve this problem, a new haptic rendering technology that can accurately display skin roughness must be developed. In addition, the rendering algorithm must be able to filter the spatial noise created during 2D-to-3D image conversion without losing the original roughness of the skin image. In this study, a perceptual approach to designing a noise filter that removes spatial noise while recovering maximal roughness is introduced, based on an understanding of human sensitivity to surface roughness. A visuohaptic rendering system that lets a user see and touch digital skin surface roughness has been developed, including a geometric roughness estimation method for a meshed surface. A psychophysical experiment was then designed and conducted with 12 human subjects to measure human perception of surface roughness through the developed visual and haptic interfaces. The experiment showed that touch is more sensitive at lower surface roughness, and vice versa; human perception with both senses, vision and touch, becomes less sensitive to surface distortions as roughness increases. When interacting through both channels, the visual and haptic interfaces, the ability to detect roughness abnormalities is greatly improved by sensory integration with the developed visuohaptic rendering system. The result can be used as a guideline for designing a noise filter that perceptually removes spatial noise while recovering maximal roughness values from a digital skin image obtained by optical sensors. The result also confirms that the developed visuohaptic rendering system can help dermatologists and skin care professionals examine skin conditions using vision and touch at the same time.
© 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
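
    A minimal stand-in for the geometric roughness estimation step is the RMS roughness (Rq) of the height field, a generic measure (not necessarily the paper's estimator) that a perceptually tuned noise filter can compare before and after smoothing:

```python
import numpy as np

def rms_roughness(heights):
    """RMS roughness (Rq) of a surface height profile.

    A common geometric roughness measure: the root-mean-square
    deviation of heights from their mean plane. Flat surfaces give 0;
    larger values mean rougher surfaces.
    """
    h = np.asarray(heights, dtype=float)
    return float(np.sqrt(np.mean((h - h.mean())**2)))
```

    If smoothing drops Rq well below its pre-filter value, the filter is erasing roughness the user should feel; the psychophysical results above suggest how much of that loss would go unnoticed at each roughness level.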

  13. Optimizing Cubature for Efficient Integration of Subspace Deformations

    PubMed Central

    An, Steven S.; Kim, Theodore; James, Doug L.

    2009-01-01

    We propose an efficient scheme for evaluating nonlinear subspace forces (and Jacobians) associated with subspace deformations. The core problem we address is efficient integration of the subspace force density over the 3D spatial domain. Similar to Gaussian quadrature schemes that efficiently integrate functions that lie in particular polynomial subspaces, we propose cubature schemes (multi-dimensional quadrature) optimized for efficient integration of force densities associated with particular subspace deformations, particular materials, and particular geometric domains. We support generic subspace deformation kinematics, and nonlinear hyperelastic materials. For an r-dimensional deformation subspace with O(r) cubature points, our method is able to evaluate subspace forces at O(r²) cost. We also describe composite cubature rules for runtime error estimation. Results are provided for various subspace deformation models, several hyperelastic materials (St. Venant-Kirchhoff, Mooney-Rivlin, Arruda-Boyce), and multimodal (graphics, haptics, sound) applications. We show dramatically better efficiency than traditional Monte Carlo integration. CR Categories: I.6.8 [Simulation and Modeling]: Types of Simulation—Animation; I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling—Physically based modeling; G.1.4 [Mathematics of Computing]: Numerical Analysis—Quadrature and Numerical Differentiation. PMID:19956777
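
    At runtime, the reduced force is just a small weighted sum over the trained cubature points. The sketch below shows that evaluation step on a toy one-dimensional density; selecting the points and fitting nonnegative weights (the paper's offline training step) is assumed already done:

```python
import numpy as np

def subspace_force(q, points, weights, force_density):
    """Evaluate a reduced force by cubature (sketch).

    Approximates the integral of the force density over the domain as
        f(q) ~ sum_i w_i * g(q, x_i)
    where `force_density(q, x)` returns the subspace force density at
    sample point x for reduced coordinates q. Cost scales with the
    number of cubature points, not the full mesh.
    """
    f = np.zeros_like(q, dtype=float)
    for x, w in zip(points, weights):
        f += w * force_density(q, x)
    return f
```

    With a classical two-point Gauss rule standing in for trained points and weights, the sum reproduces the integral of q·x² over [0, 1] exactly, mirroring how an optimized cubature reproduces its training forces.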

  14. Haptic augmented skin surface generation toward telepalpation from a mobile skin image.

    PubMed

    Kim, K

    2018-05-01

    Little is known about how to integrate palpation techniques into existing mobile tele-skin-imaging, which delivers only low-quality tactile (roughness) information, and no study has yet been reported on telehaptic palpation using mobile phone images for teledermatology or skincare teleconsultation. This study therefore introduces a new algorithm that accurately reconstructs a haptically augmented skin surface for telehaptic palpation using a low-cost clip-on microscope attached to a mobile phone. Multiple algorithms, including gradient-based image enhancement, roughness-adaptive tactile mask generation, roughness-enhanced 3D tactile map building, and visual and haptic rendering with a three-degrees-of-freedom (DOF) haptic device, were developed and integrated into one system. Evaluation experiments tested the performance of 3D roughness reconstruction with and without the tactile mask. The results confirm that the haptic roughness reconstructed with the tactile mask is superior to that reconstructed without it. Additional experiments demonstrate that the proposed algorithm is robust against varying lighting conditions and blurring. Finally, a user study was designed to assess the effect of adding the haptic modality to the existing visual-only interface; the results attest that haptic skin palpation can significantly improve skin examination performance. Mobile image-based telehaptic palpation technology was proposed and an initial version developed. The technology was tested on several skin images, and the experimental results showed the superiority of the proposed scheme in haptically augmenting real skin images. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. The contributions of vision and haptics to reaching and grasping

    PubMed Central

    Stone, Kayla D.; Gonzalez, Claudia L. R.

    2015-01-01

    This review aims to provide a comprehensive outlook on the sensory (visual and haptic) contributions to reaching and grasping. The focus is on studies in developing children, normal, and neuropsychological populations, and in sensory-deprived individuals. Studies have suggested a right-hand/left-hemisphere specialization for visually guided grasping and a left-hand/right-hemisphere specialization for haptically guided object recognition. This poses the interesting possibility that when vision is not available and grasping relies heavily on the haptic system, there is an advantage to use the left hand. We review the evidence for this possibility and dissect the unique contributions of the visual and haptic systems to grasping. We ultimately discuss how the integration of these two sensory modalities shape hand preference. PMID:26441777

  16. Robotic guidance benefits the learning of dynamic, but not of spatial movement characteristics.

    PubMed

    Lüttgen, Jenna; Heuer, Herbert

    2012-10-01

    Robotic guidance is an engineered form of haptic-guidance training, intended to enhance motor learning in rehabilitation, surgery, and sports. However, its benefits (and pitfalls) are still debated. Here, we investigate the effects of different presentation modes on the reproduction of a spatiotemporal movement pattern. In three different groups of participants, the movement was demonstrated in three different modalities, namely visual, haptic, and visuo-haptic. After demonstration, participants had to reproduce the movement in two alternating recall conditions: haptic and visuo-haptic. Performance of the three groups during recall was compared with regard to spatial and dynamic movement characteristics. After haptic presentation, participants showed superior dynamic accuracy, whereas after visual presentation, participants performed better with regard to spatial accuracy. Added visual feedback during recall always led to enhanced performance, independent of the movement characteristic and the presentation modality. These findings substantiate the different benefits of different presentation modes for different movement characteristics. In particular, robotic guidance is beneficial for the learning of dynamic, but not of spatial, movement characteristics.

  17. Haptic feedback in OP:Sense - augmented reality in telemanipulated robotic surgery.

    PubMed

    Beyl, T; Nicolai, P; Mönnich, H; Raczkowksy, J; Wörn, H

    2012-01-01

    In current research, haptic feedback in robot-assisted interventions plays an important role. However, most approaches to haptic feedback only consider mapping the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated into our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from pre-operatively generated medical images, or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest and helps prevent damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS systems.
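    A forbidden-region virtual fixture of the kind described here can be reduced to a penalty force that pushes the tool out of a protected region. The spherical region and stiffness below are illustrative stand-ins for a structure segmented from imaging data, not the OP:Sense implementation.

```python
import numpy as np

def forbidden_region_force(tool_pos, center, radius, stiffness=500.0):
    """Virtual-fixture force pushing the tool out of a spherical
    forbidden region (e.g., a segmented critical structure).

    Sketch only: real systems derive the region geometry from
    segmented imaging data, not an analytic sphere.
    """
    tool_pos = np.asarray(tool_pos, float)
    offset = tool_pos - np.asarray(center, float)
    dist = np.linalg.norm(offset)
    penetration = radius - dist
    if penetration <= 0 or dist == 0:
        return np.zeros(3)  # outside the region: no fixture force
    # Spring force along the outward normal, proportional to penetration.
    return stiffness * penetration * (offset / dist)

# Tool 2 mm inside a 10 mm forbidden sphere centered at the origin.
f = forbidden_region_force([0.008, 0.0, 0.0], [0, 0, 0], 0.010)
```

    Rendered on the master device, this force is felt as a compliant wall around the critical structure, which is the guidance effect the abstract describes.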

  18. Design of a 7-DOF haptic master using magneto-rheological devices for robot surgery

    NASA Astrophysics Data System (ADS)

    Kang, Seok-Rae; Choi, Seung-Bok; Hwang, Yong-Hoon; Cha, Seung-Woo

    2017-04-01

    This paper presents a 7-degrees-of-freedom (7-DOF) haptic master that is applicable to robot-assisted minimally invasive surgery (RMIS). By utilizing a controllable magneto-rheological (MR) fluid, the haptic master can provide force information to the surgeon during surgery. The proposed haptic master provides three translational motions (X, Y, Z) and four further motions (pitch, yaw, roll, and grasping), all with force-feedback capability. It can generate repulsive forces or torques by activating an MR clutch and an MR brake, both designed and manufactured with consideration of the size and output torque required for robotic surgery. A proportional-integral-derivative (PID) controller is then designed and implemented to achieve torque/force trajectory tracking. It is verified that the proposed haptic master tracks well the desired torque and force arising at the surgical site by controlling the input current applied to the MR clutch and brake.
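    The torque-tracking loop can be pictured as a standard discrete PID controller whose output is the MR device command current. This is a generic sketch under assumed gains and a first-order stand-in for the clutch dynamics, not the paper's tuned controller.

```python
class PID:
    """Discrete PID controller for torque/force trajectory tracking,
    as used to drive MR clutch/brake currents. Gains are illustrative
    placeholders, not the paper's values."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Output is interpreted as the command current to the MR device.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a constant desired torque against a simplified actuator model.
pid = PID(kp=2.0, ki=5.0, kd=0.01, dt=0.001)
torque, desired = 0.0, 1.0
for _ in range(5000):
    current = pid.update(desired, torque)
    torque += (current - torque) * 0.01  # first-order MR clutch stand-in
```

    The integral term removes the steady-state tracking error that a purely proportional current command would leave against the clutch's internal dynamics.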

  19. Haptic fMRI: Reliability and performance of electromagnetic haptic interfaces for motion and force neuroimaging experiments.

    PubMed

    Menon, Samir; Zhu, Jack; Goyal, Deeksha; Khatib, Oussama

    2017-07-01

    Haptic interfaces compatible with functional magnetic resonance imaging (Haptic fMRI) promise to enable rich motor neuroscience experiments that study how humans perform complex manipulation tasks. Here, we present a large-scale study (176 scan runs, 33 scan sessions) that characterizes the reliability and performance of one such electromagnetically actuated device, Haptic fMRI Interface 3 (HFI-3). We outline engineering advances that ensured HFI-3 did not interfere with fMRI measurements. Observed fMRI temporal noise levels with HFI-3 operating were at the fMRI baseline (0.8% noise to signal). We also present results from HFI-3 experiments demonstrating that high resolution fMRI can be used to study spatio-temporal patterns of blood-oxygenation-level-dependent (BOLD) activation. These experiments include motor planning, goal-directed reaching, and visually guided force control. Observed fMRI responses are consistent with existing literature, which supports Haptic fMRI's effectiveness at studying the brain's motor regions.

  20. Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.

    PubMed

    Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico

    2017-01-01

    Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday lives. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved performance in all the considered tasks. Moreover, subjects significantly preferred the conditions providing wearable haptic feedback.

  1. enhancedGraphics: a Cytoscape app for enhanced node graphics

    PubMed Central

    Morris, John H.; Kuchinsky, Allan; Ferrin, Thomas E.; Pico, Alexander R.

    2014-01-01

    enhancedGraphics ( http://apps.cytoscape.org/apps/enhancedGraphics) is a Cytoscape app that implements a series of enhanced charts and graphics that may be added to Cytoscape nodes. It enables users and other app developers to create pie, line, bar, and circle plots that are driven by columns in the Cytoscape Node Table. Charts are drawn using vector graphics to allow full-resolution scaling. PMID:25285206

  2. Effects of Motion and Figural Goodness on Haptic Object Perception in Infancy.

    ERIC Educational Resources Information Center

    Streri, Arlette; Spelke, Elizabeth S.

    1989-01-01

    After haptic habituation to a ring display, infants perceived the rings in two experiments as parts of one connected object. In both haptic and visual modes, infants appeared to perceive object unity by analyzing motion but not by analyzing figural goodness. (RH)

  3. Teaching Classical Mechanics Concepts Using Visuo-Haptic Simulators

    ERIC Educational Resources Information Center

    Neri, Luis; Noguez, Julieta; Robledo-Rella, Victor; Escobar-Castillejos, David; Gonzalez-Nucamendi, Andres

    2018-01-01

    In this work, the design and implementation of several physics scenarios using haptic devices are presented and discussed. Four visuo-haptic applications were developed for an undergraduate engineering physics course. Experiments with experimental and control groups were designed and implemented. Activities and exercises related to classical…

  4. Defining Identities through Multiliteracies: EL Teens Narrate Their Immigration Experiences as Graphic Stories

    ERIC Educational Resources Information Center

    Danzak, Robin L.

    2011-01-01

    Based on a framework of identity-as-narrative and multiliteracies, this article describes "Graphic Journeys," a multimedia literacy project in which English learners (ELs) in middle school created graphic stories that expressed their families' immigration experiences. The process involved reading graphic novels, journaling, interviewing, and…

  5. Writing from behind the Fence: Incarcerated Youths and a Graphic Novel on HIV/AIDS

    ERIC Educational Resources Information Center

    Gavigan, Karen; Albright, Kendra

    2015-01-01

    Graphic novels are an increasingly popular format that educators can use as a tool to teach reading and writing skills across the K-12 curriculum. This article describes a project in which incarcerated youths collaborated with a graphic illustrator to create a graphic novel about teens dealing with issues related to HIV/AIDS. The graphic novel is…

  6. Operator dynamics for stability condition in haptic and teleoperation system: A survey.

    PubMed

    Li, Hongbing; Zhang, Lei; Kawashima, Kenji

    2018-04-01

    Currently, haptic systems ignore the varying impedance of the human hand, with its countless configurations, and thus cannot recreate complex haptic interactions. The literature lacks a comprehensive survey of the methods proposed, and this study attempts to bridge that gap. The paper includes an extensive review of human arm impedance modeling and control deployed to address inherent stability and transparency issues in haptic interaction and teleoperation systems. Detailed classification and comparative study of various contributions in human arm modeling are presented and summarized in tables and diagrams. The main challenges in modeling human arm impedance for haptic robotic applications are identified. Possible future research directions are outlined based on the gaps identified in the survey. Copyright © 2018 John Wiley & Sons, Ltd.

  7. A Haptic-Enhanced System for Molecular Sensing

    NASA Astrophysics Data System (ADS)

    Comai, Sara; Mazza, Davide

    The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically enhanced system for the manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted from either theoretical or empirical data, using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a large amount of numerical data in a more comprehensible way. The developed tool can be used for either teaching or research purposes, given its reliance on both theoretical and experimental data.
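    The force felt between the probe charge and the molecule can be sketched as a Coulomb sum over the atoms' partial charges. This is a minimal electrostatics illustration of the idea, not the system's actual force model; positions, charges, and units are assumptions.

```python
import numpy as np

COULOMB_K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def probe_force(probe_pos, probe_charge, atom_positions, atom_charges):
    """Net Coulomb force on a probe charge held by the haptic device,
    summed over point partial charges of the molecule's atoms."""
    probe_pos = np.asarray(probe_pos, float)
    force = np.zeros(3)
    for pos, q in zip(np.asarray(atom_positions, float), atom_charges):
        r = probe_pos - pos
        dist = np.linalg.norm(r)
        # Repulsive for like charges, attractive for opposite ones.
        force += COULOMB_K * probe_charge * q * r / dist**3
    return force

# Probe midway between a positive and a negative point charge: both
# terms drive the probe toward the negative charge (+x direction).
f = probe_force([0.0, 0.0, 0.0], 1e-9,
                [[-1e-3, 0, 0], [1e-3, 0, 0]], [1e-9, -1e-9])
```

    In a haptic loop, the returned vector (suitably scaled to the device's output range) is what the user feels while dragging the probe charge around the molecule.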

  8. A Haptics Symposium Retrospective: 20 Years

    NASA Technical Reports Server (NTRS)

    Colgate, J. Edward; Adelstein, Bernard

    2012-01-01

    The very first "Haptics Symposium" actually went by the name "Issues in the Development of Kinesthetic Displays of Teleoperation and Virtual environments." The word "Haptic" didn't make it into the name until the next year. Not only was the most important word absent but so were RFPs, journals and commercial markets. And yet, as we prepare for the 2012 symposium, haptics is a thriving and amazingly diverse field of endeavor. In this talk we'll reflect on the origins of this field and on its evolution over the past twenty years, as well as the evolution of the Haptics Symposium itself. We hope to share with you some of the excitement we've felt along the way, and that we continue to feel as we look toward the future of our field.

  9. Haptic interfaces: Hardware, software and human performance

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandayam A.

    1995-01-01

    Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.

  10. Invited Article: A review of haptic optical tweezers for an interactive microworld exploration

    NASA Astrophysics Data System (ADS)

    Pacoret, Cécile; Régnier, Stéphane

    2013-08-01

    This paper is the first review of haptic optical tweezers, a new technique which associates force feedback teleoperation with optical tweezers. This technique allows users to explore the microworld by sensing and exerting picoNewton-scale forces with trapped microspheres. Haptic optical tweezers also allow improved dexterity of micromanipulation and micro-assembly. One of the challenges of this technique is to sense and magnify picoNewton-scale forces by a factor of 10^12 to enable human operators to perceive interactions that they have never experienced before, such as adhesion phenomena, extremely low inertia, and high frequency dynamics of extremely small objects. The design of optical tweezers for high quality haptic feedback is challenging, given the requirements for very high sensitivity and dynamic stability. The concept, design process, and specification of optical tweezers reviewed here are focused on those intended for haptic teleoperation. In this paper, two new specific designs as well as the current state-of-the-art are presented. Moreover, the remaining important issues are identified for further developments. The initial results obtained are promising and demonstrate that optical tweezers have a significant potential for haptic exploration of the microworld. Haptic optical tweezers will become an invaluable tool for force feedback micromanipulation of biological samples and nano- and micro-assembly parts.
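    The 10^12 magnification mentioned above amounts to one scaling step in the teleoperation loop: a trap force measured in picoNewtons is mapped into the Newton range and clamped to the haptic device's output limit. The clamp value below is an assumed device limit, not from the review.

```python
def haptic_display_force(measured_force_pn, scale=1e12, max_force_n=3.0):
    """Magnify a picoNewton-scale trap force into a humanly perceivable
    haptic force (the ~10^12 factor discussed in the review), clamping
    to an assumed device output limit."""
    force_n = measured_force_pn * 1e-12 * scale  # pN -> N, then magnify
    return max(-max_force_n, min(max_force_n, force_n))
```

    The clamp is essential in practice: adhesion snap-off events can spike the measured force well beyond what the master device can safely render.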

  11. Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models.

    PubMed

    Fortmeier, Dirk; Wilms, Matthias; Mastmeyer, Andre; Handels, Heinz

    2015-01-01

    This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed based on patient-specific 4D CT image data sequences. Virtual patient models are visualized in real-time by ray-casting-based rendering of a reference CT image warped by a time-variant displacement field, which is computed using the motion models at run-time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements computed at high rendering rates to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and the haptic simulation of inserting a virtual bendable needle. To this end, different motion models that are applicable in real-time are presented, and the methods are integrated into a needle puncture training simulation framework, which can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. It is shown that the presented methods achieve mean update rates around 2,000 Hz for haptic simulation and interactive frame rates for volume rendering, and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
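    The key haptic step, translating the device position into the static reference CT frame via the current displacement, can be sketched as follows. The sinusoidal craniocaudal displacement is a stand-in for the patient-specific 4D CT motion models; amplitude and period are assumed values.

```python
import math

def reference_position(device_pos, t, amplitude=0.01, period=4.0):
    """Map the haptic device position into the static reference CT
    frame by subtracting the time-variant respiratory displacement.

    Stand-in breathing model: dominant motion along z (craniocaudal),
    cyclic with the respiratory period (here 4 s, 1 cm amplitude).
    """
    dz = amplitude * math.sin(2.0 * math.pi * t / period)
    x, y, z = device_pos
    return (x, y, z - dz)

# At t = 1 s (peak inhale of a 4 s cycle) the device point maps 1 cm
# lower in the reference image than at t = 0.
p = reference_position((0.0, 0.0, 0.10), 1.0)
```

    Because the warp is evaluated per haptic tick (~2,000 Hz in the paper), the needle or palpation forces stay consistent with the moving anatomy while all collision data lives in the fixed reference image.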

  12. Evaluation of Motor Control Using Haptic Device

    NASA Astrophysics Data System (ADS)

    Nuruki, Atsuo; Kawabata, Takuro; Shimozono, Tomoyuki; Yamada, Masafumi; Yunokuchi, Kazutomo

    When kinesthesia and touch act at the same time, the resulting perception is called haptic perception. This sense plays a key role in motor information for force and position control, and is important wherever evaluation of motor control is needed. The purpose of this paper is to evaluate motor control, specifically the perception of heaviness and distance, under normal and fatigued conditions using psychophysical experiments. We used a haptic device in order to generate precise forces and distances; however, few precedents exist for evaluation systems based on haptic devices, so a further purpose is to examine whether the haptic device is useful as an evaluation system for motor control. The psychophysical quantities of force and distance were measured in two kinds of experiments. Eight healthy subjects participated in this study. Stimuli were presented by a haptic device [PHANTOM Omni: SensAble Company]. Subjects compared standard and test stimuli and reported which stimulus felt stronger. For the psychophysical quantity of force, the just noticeable difference (JND) differed significantly between normal and muscle-fatigue conditions, whereas the point of subjective equality (PSE) did not. For the psychophysical quantity of distance, neither JND nor PSE differed between normal and muscle-fatigue conditions. These results show that control of force, but not control of distance, was influenced by muscle fatigue. Moreover, they suggest that the haptic device is useful as an evaluation system for motor control.
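    The two quantities reported here, PSE and JND, are read off the psychometric function relating stimulus level to the proportion of "test felt stronger" responses. The sketch below uses linear interpolation rather than a fitted curve, and the toy data are invented for illustration.

```python
import numpy as np

def pse_and_jnd(levels, p_stronger):
    """Estimate PSE and JND from psychometric data.

    `levels` are test-stimulus intensities; `p_stronger` the proportion
    of trials on which the test was judged stronger than the standard.
    PSE is the 50% point; JND is half the 25%-75% interquartile span.
    """
    levels = np.asarray(levels, float)
    p = np.asarray(p_stronger, float)  # must be monotonically increasing
    q25, pse, q75 = np.interp([0.25, 0.5, 0.75], p, levels)
    return pse, (q75 - q25) / 2.0

# Toy force-discrimination data around a 1.0 N standard stimulus.
levels = [0.7, 0.85, 1.0, 1.15, 1.3]
p = [0.05, 0.25, 0.50, 0.75, 0.95]
pse, jnd = pse_and_jnd(levels, p)
```

    A fatigue effect on the JND, as found for force in this study, would show up as a wider interquartile span (shallower psychometric curve) without the 50% point moving.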

  13. Mental rotation of tactile stimuli: using directional haptic cues in mobile devices.

    PubMed

    Gleeson, Brian T; Provancher, William R

    2013-01-01

    Haptic interfaces have the potential to enrich users' interactions with mobile devices and convey information without burdening the user's visual or auditory attention. Haptic stimuli with directional content, for example, navigational cues, may be difficult to use in handheld applications; the user's hand, where the cues are delivered, may not be aligned with the world, where the cues are to be interpreted. In such a case, the user would be required to mentally transform the stimuli between different reference frames. We examine the mental rotation of directional haptic stimuli in three experiments, investigating: 1) users' intuitive interpretation of rotated stimuli, 2) mental rotation of haptic stimuli about a single axis, and 3) rotation about multiple axes and the effects of specific hand poses and joint rotations. We conclude that directional haptic stimuli are suitable for use in mobile applications, although users do not naturally interpret rotated stimuli in any one universal way. We find evidence of cognitive processes involving the rotation of analog, spatial representations and discuss how our results fit into the larger body of mental rotation research. For small angles (e.g., less than 40 degrees), these mental rotations come at little cost, but rotations with larger misalignment angles impact user performance. When considering the design of a handheld haptic device, our results indicate that hand pose must be carefully considered, as certain poses increase the difficulty of stimulus interpretation. Generally, all tested joint rotations impact task difficulty, but finger flexion and wrist rotation interact to greatly increase the cost of stimulus interpretation; such hand poses should be avoided when designing a haptic interface.

  14. Haptic-2D: A new haptic test battery assessing the tactual abilities of sighted and visually impaired children and adolescents with two-dimensional raised materials.

    PubMed

    Mazella, Anaïs; Albaret, Jean-Michel; Picard, Delphine

    2016-01-01

    To fill an important gap in the psychometric assessment of children and adolescents with impaired vision, we designed a new battery of haptic tests, called Haptic-2D, for visually impaired and sighted individuals aged five to 18 years. Unlike existing batteries, ours uses only two-dimensional raised materials that participants explore using active touch. It is composed of 11 haptic tests, measuring scanning skills, tactile discrimination skills, spatial comprehension skills, short-term tactile memory, and comprehension of tactile pictures. We administered this battery to 138 participants, half of whom were sighted (n=69), and half visually impaired (blind, n=16; low vision, n=53). Results indicated a significant main effect of age on haptic scores, but no main effect of vision or Age × Vision interaction effect. Reliability of test items was satisfactory (Cronbach's alpha, α=0.51-0.84). Convergent validity was good, as shown by a significant correlation (age partialled out) between total haptic scores and scores on the B101 test (rp=0.51, n=47). Discriminant validity was also satisfactory, as attested by a lower but still significant partial correlation between total haptic scores and the raw score on the verbal WISC (rp=0.43, n=62). Finally, test-retest reliability was good (rs=0.93, n=12; interval of one to two months). This new psychometric tool should prove useful to practitioners working with young people with impaired vision. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Performance evaluation of a robot-assisted catheter operating system with haptic feedback.

    PubMed

    Song, Yu; Guo, Shuxiang; Yin, Xuanchun; Zhang, Linshuai; Hirata, Hideyuki; Ishihara, Hidenori; Tamiya, Takashi

    2018-06-20

    In this paper, a novel robot-assisted catheter operating system (RCOS) is proposed as a method to reduce physical stress and X-ray exposure time to physicians during endovascular procedures. The unique design of this system allows the physician to apply conventional bedside catheterization skills (advance, retreat, and rotate) to an input catheter, which is placed at the master side to control a patient catheter placed at the slave side. For this purpose, a magnetorheological (MR) fluid-based master haptic interface has been developed to measure the axial and radial motions of the input catheter, as well as to provide haptic feedback to the physician during the operation. In order to achieve a quick response of the haptic force in the master haptic interface, a Hall sensor-based closed-loop control strategy is employed. On the slave side, a catheter manipulator delivers the patient catheter according to position commands received from the master haptic interface. The contact forces between the patient catheter and the blood vessel system can be measured by the force sensor unit designed into the catheter manipulator. Four levels of haptic force are provided to make the operator aware of the resistance encountered by the patient catheter during the insertion procedure. The catheter manipulator was evaluated for precision positioning, and the time lag from sensed motion to replicated motion was tested. To verify the efficacy of the proposed haptic feedback method, in vitro evaluation experiments were carried out. The results demonstrate that the proposed system can decrease the contact forces between the catheter and the vasculature.
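    The four-level feedback scheme amounts to quantizing the measured contact force into discrete haptic intensities. The thresholds below are illustrative assumptions, not the paper's calibration.

```python
def haptic_force_level(contact_force_n, thresholds=(0.05, 0.15, 0.3)):
    """Quantize the measured catheter-vessel contact force into the
    four haptic feedback levels described in the abstract.

    Returns 0 (free motion) through 3 (high resistance). Threshold
    values in Newtons are illustrative placeholders.
    """
    level = 0
    for t in thresholds:
        if contact_force_n > t:
            level += 1
    return level
```

    Discrete levels trade fidelity for robustness: the operator gets an unambiguous warning as resistance grows, without the master interface having to reproduce a noisy force signal exactly.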

  16. Investigations into haptic space and haptic perception of shape for active touch

    NASA Astrophysics Data System (ADS)

    Sanders, A. F. J.

    2008-12-01

    This thesis presents a number of psychophysical investigations into haptic space and haptic perception of shape. Haptic perception is understood to include the two subsystems of the cutaneous sense and kinesthesis. Chapter 2 provides an extensive quantitative study into haptic perception of curvature. I investigated bimanual curvature discrimination of cylindrically curved, hand-sized surfaces. I found that discrimination thresholds were in the same range as unimanual thresholds reported in previous studies. Moreover, the distance between the surfaces or the position of the setup with respect to the observer had no effect on thresholds. Finally, I found idiosyncratic biases: A number of observers judged two surfaces that had different radii as equally curved. Biases were of the same order of magnitude as thresholds. In Chapter 3, I investigated haptic space. Here, haptic space is understood to be (1) the set of observer’s judgments of spatial relations in physical space, and (2) a set of constraints by which these judgments are internally consistent. I asked blindfolded observers to construct straight lines in a number of different tasks. I show that the shape of the haptically straight line depends on the task used to produce it. I therefore conclude that there is no unique definition of the haptically straight line and that doubts are cast on the usefulness of the concept of haptic space. In Chapter 4, I present a new experiment into haptic length perception. I show that when observers trace curved pathways with their index finger and judge distance traversed, their distance estimates depend on the geometry of the paths: Lengths of convex, cylindrically curved pathways were overestimated and lengths of concave pathways were underestimated. 
In addition, I show that a kinematic mechanism must underlie this interaction: (1) the geometry of the path traced by the finger affects movement speed and consequently movement time, and (2) movement time is taken as a measure of traversed length. The study presented in Chapter 5 addresses the question of how kinematic properties of exploratory movements affect perceived shape. I identify a kinematic invariant for the case of a single finger moving across cylindrically curved strips under conditions of slip. I found that the rotation angle of the finger increased linearly with the curvature of the stimulus. In addition, I show that observers took rotation angle as their primary measure of perceived curvature: Observers rotated their finger less on a concave curvature by a constant amount, and consequently, they overestimated the radius of the concave strips compared to the convex ones. Finally, in Chapter 6, I investigated the haptic filled-space illusion for dynamic touch: Observers move their fingertip across an unfilled extent or an extent filled with intermediate stimulations. Previous researchers have reported lengths of filled extents to be overestimated, but the parameters affecting the strength of the illusion are still largely unknown. Factors investigated in this chapter include end point effects, filler density and overall average movement speed.

  17. Haptic Glove Technology: Skill Development through Video Game Play

    ERIC Educational Resources Information Center

    Bargerhuff, Mary Ellen; Cowan, Heidi; Oliveira, Francisco; Quek, Francis; Fang, Bing

    2010-01-01

    This article introduces a recently developed haptic glove system and describes how the participants used a video game that was purposely designed to train them in skills that are needed for the efficient use of the haptic glove. Assessed skills included speed, efficiency, embodied skill, and engagement. The findings and implications for future…

  18. The Role of Visual Experience on the Representation and Updating of Novel Haptic Scenes

    ERIC Educational Resources Information Center

    Pasqualotto, Achille; Newell, Fiona N.

    2007-01-01

    We investigated the role of visual experience on the spatial representation and updating of haptic scenes by comparing recognition performance across sighted, congenitally and late blind participants. We first established that spatial updating occurs in sighted individuals to haptic scenes of novel objects. All participants were required to…

  19. Effect of Auditory Interference on Memory of Haptic Perceptions.

    ERIC Educational Resources Information Center

    Anater, Paul F.

    1980-01-01

    The effect of auditory interference on the processing of haptic information by 61 visually impaired students (8 to 20 years old) was the focus of the research described in this article. It was assumed that as the auditory interference approximated the verbalized activity of the haptic task, accuracy of recall would decline. (Author)

  20. Investigating Students' Ideas about Buoyancy and the Influence of Haptic Feedback

    ERIC Educational Resources Information Center

    Minogue, James; Borland, David

    2016-01-01

    While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of…

  1. Superior haptic-to-visual shape matching in autism spectrum disorders.

    PubMed

    Nakano, Tamami; Kato, Nobumasa; Kitazawa, Shigeru

    2012-04-01

    A weak central coherence theory in autism spectrum disorder (ASD) proposes that a cognitive bias toward local processing in ASD derives from a weakness in integrating local elements into a coherent whole. Using this theory, we hypothesized that shape perception through active touch, which requires sequential integration of sensorimotor traces of exploratory finger movements into a shape representation, would be impaired in ASD. Contrary to our expectation, adults with ASD showed superior performance in a haptic-to-visual delayed shape-matching task compared to adults without ASD. Accuracy in discriminating haptic lengths or haptic orientations, which lies within the somatosensory modality, did not differ between adults with ASD and adults without ASD. Moreover, this superior ability in inter-modal haptic-to-visual shape matching was not explained by the score in a unimodal visuospatial rotation task. These results suggest that individuals with ASD are not impaired in integrating sensorimotor traces into a global visual shape and that their multimodal shape representations and haptic-to-visual information transfer are more accurate than those of individuals without ASD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Haptic Recreation of Elbow Spasticity

    PubMed Central

    Kim, Jonghyun; Damiano, Diane L.

    2013-01-01

    The aim of this paper is to develop a haptic device capable of presenting a standardized recreation of elbow spasticity. Using the device, clinicians will be able to repeatedly practice the assessment of spasticity without requiring patient involvement, and this practice will help improve the accuracy and reliability of the assessment itself. A haptic elbow spasticity simulator (HESS) was designed and prototyped according to mechanical requirements for recreating the feel of elbow spasticity. Based on data collected from subjects with elbow spasticity, a mathematical model representing elbow spasticity is proposed. To differentiate the feel of each score on the Modified Ashworth Scale (MAS), parameters of the model were identified separately for three MAS scores: 1, 1+, and 2. The implemented haptic recreation was evaluated by experienced clinicians, who were asked to assign MAS scores by manipulating the haptic device. The clinicians who participated in the study were blinded to each other’s scores and to the given models. They distinguished the three models, and the MAS scores given to the recreated models matched the original MAS scores from the patients with 100% agreement. PMID:22275660

  3. Control of a haptic gear shifting assistance device utilizing a magnetorheological clutch

    NASA Astrophysics Data System (ADS)

    Han, Young-Min; Choi, Seung-Bok

    2014-10-01

    This paper proposes a haptic-clutch-driven gear shifting assistance device that helps the driver shift the gears of a transmission system. To achieve this goal, a magnetorheological (MR) fluid-based clutch is devised and integrated into the accelerator pedal so that it can accommodate the pedal's rotary motion. The proposed MR clutch is then manufactured, and its transmission torque is experimentally evaluated as a function of the magnetic field intensity. The manufactured MR clutch is integrated with the accelerator pedal to transmit a haptic cue signal to the driver. The central control issue is to cue the driver to shift gears via the haptic force. Therefore, a gear-shifting decision algorithm is constructed by considering the vehicle engine speed together with engine combustion dynamics, vehicle dynamics and driving resistance. The algorithm is then integrated with a compensation strategy for attaining the desired haptic force. In this work, the compensator is also developed and implemented through a discrete version of the inverse hysteretic model. The control performances, such as the haptic force tracking responses and fuel consumption, are experimentally evaluated.
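
    The gear-shifting decision step above reduces to a threshold rule on engine speed. A minimal sketch, assuming hypothetical upshift and hysteresis values (the paper derives its thresholds from engine combustion dynamics, vehicle dynamics and driving resistance):

```python
def shift_cue(rpm, cue_active, upshift_rpm=2500.0, hysteresis=200.0):
    """Return True while the driver should be haptically cued to upshift.

    A hysteresis band (width assumed here) keeps the cue from chattering
    when the engine speed hovers near the upshift threshold.
    """
    if cue_active:
        # Once the cue is on, keep it on until rpm drops below the band.
        return rpm > upshift_rpm - hysteresis
    return rpm > upshift_rpm
```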

  4. Torque Measurement of 3-DOF Haptic Master Operated by Controllable Electrorheological Fluid

    NASA Astrophysics Data System (ADS)

    Oh, Jong-Seok; Choi, Seung-Bok; Lee, Yang-Sub

    2015-02-01

    This work presents a torque measurement method for a 3-degree-of-freedom (3-DOF) haptic master featuring controllable electrorheological (ER) fluid. To convey the feel of an organ to the surgeon, the ER haptic master, which can generate the repulsive torque of an organ, is utilized as a remote controller for a surgery robot. Since accurate representation of organ feel is essential for the success of robot-assisted surgery, it is indispensable to develop a proper torque measurement method for the 3-DOF ER haptic master. After describing the structural configuration of the haptic master, the torque models of the ER spherical joint are mathematically derived based on the Bingham model of ER fluid. A new type of haptic device with pitching, rolling, and yawing motions is then designed and manufactured using a spherical joint mechanism. Subsequently, the field-dependent parameters of the Bingham model are identified, and the repulsive torque generated under the applied electric field is measured. In addition, to verify the effectiveness of the proposed torque model, simulated and measured torques are compared.
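
    The torque model above follows the Bingham plastic description of ER fluid: a field-dependent yield term plus a viscous term. A minimal sketch for one joint axis; the power-law form of the yield term and all parameter values are assumptions, not the identified values from the paper:

```python
import math

def bingham_torque(omega, E, alpha=0.08, beta=1.5, c_visc=0.02):
    """Repulsive torque of an ER joint under a Bingham-type model.

    tau_y(E) = alpha * E**beta is the field-dependent yield torque
    (alpha, beta are placeholder values), and c_visc * omega is the
    post-yield viscous contribution at angular velocity omega.
    """
    tau_yield = alpha * E ** beta
    # Yield torque opposes motion, so it takes the sign of omega.
    return math.copysign(tau_yield, omega) + c_visc * omega
```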

  5. Dynamics modeling for parallel haptic interfaces with force sensing and control.

    PubMed

    Bernstein, Nicholas; Lawrence, Dale; Pao, Lucy

    2013-01-01

    Closed-loop force control can be used on haptic interfaces (HIs) to mitigate the effects of mechanism dynamics. A single multidimensional force-torque sensor is often employed to measure the interaction force between the haptic device and the user's hand. The parallel haptic interface at the University of Colorado (CU) instead employs smaller 1D force sensors oriented along each of the five actuating rods to build up a 5D force vector. This paper shows that a particular manipulandum/hand partition in the system dynamics is induced by the placement and type of force sensing, and discusses the implications for force and impedance control of parallel haptic interfaces. The details of a "squaring down" process are also discussed, showing how to obtain reduced degree-of-freedom models from the general six degree-of-freedom dynamics formulation.
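
    The idea of building a multidimensional force vector from scalar rod sensors can be sketched as a least-squares combination of the rod readings. The geometry below is a made-up example covering only the translational force part, not the CU device's actual five-rod kinematics:

```python
import numpy as np

def force_from_rods(unit_dirs, readings):
    """Estimate a Cartesian force from scalar forces measured along rods.

    unit_dirs: (n, 3) unit vectors along each actuating rod.
    readings:  (n,) scalar rod forces; each equals u_i . F for the true F.
    Solves the overdetermined system U @ F = f in the least-squares sense.
    """
    U = np.asarray(unit_dirs, dtype=float)
    f = np.asarray(readings, dtype=float)
    F, *_ = np.linalg.lstsq(U, f, rcond=None)
    return F
```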

  6. Development of haptic system for surgical robot

    NASA Astrophysics Data System (ADS)

    Gang, Han Gyeol; Park, Jiong Min; Choi, Seung-Bok; Sohn, Jung Woo

    2017-04-01

    In this paper, a new type of haptic system for surgical robot applications is proposed, and its performance is evaluated experimentally. The proposed haptic system consists of an effective master device and a precision slave robot. The master device has 3-DOF rotational motion matching that of the human wrist. It has a lightweight structure with a gyro sensor for position measurement and three small-sized MR brakes for repulsive torque generation. The slave robot achieves 3-DOF rotational motion using servomotors and a five-bar linkage, and a torque sensor is used to measure resistive torque. It has been experimentally demonstrated that the proposed haptic system performs well in tracking control of desired position and repulsive torque. It can be concluded that the proposed haptic system can be effectively applied to surgical robot systems in practice.

  7. fMRI-Compatible Electromagnetic Haptic Interface.

    PubMed

    Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S

    2005-01-01

    A new haptic interface device is suggested that can be used for functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque without affecting image quality through the currents applied in the device. The haptic device was tested in a 3T MR scanner. With a current of up to 1 A and at a distance of 1 m from the focal point of the MR scanner, it was possible to generate torques of up to 4 Nm. Within these boundaries, image quality was not affected.

  8. Assessment of Haptic Interaction for Home-Based Physical Tele-Therapy using Wearable Devices and Depth Sensors.

    PubMed

    Barmpoutis, Angelos; Alzate, Jose; Beekhuizen, Samantha; Delgado, Horacio; Donaldson, Preston; Hall, Andrew; Lago, Charlie; Vidal, Kevin; Fox, Emily J

    2016-01-01

    In this paper a prototype system is presented for home-based physical tele-therapy using a wearable device for haptic feedback. The haptic feedback is generated as a sequence of vibratory cues from 8 vibrator motors equally spaced along an elastic wearable band. The motors guide the patients' movement as they perform a prescribed exercise routine, replacing the physical therapist's haptic guidance in an unsupervised or remotely supervised home-based therapy session. A pilot study with 25 human subjects focused on: a) testing the capability of the system to guide users along arbitrary motion paths in space and b) comparing the motion of users during typical physical therapy exercises with and without haptic guidance. The results demonstrate the efficacy of the proposed system.
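
    With 8 motors equally spaced along the band, directional guidance amounts to snapping the desired correction direction onto the nearest motor. A minimal sketch; the indexing convention (motor 0 along +x, increasing counter-clockwise) is an assumption:

```python
import math

def motor_for_direction(dx, dy, n_motors=8):
    """Index of the vibrator motor nearest the correction direction (dx, dy)."""
    # Wrap the direction angle into [0, 2*pi) and snap to the motor grid.
    angle = math.atan2(dy, dx) % (2 * math.pi)
    step = 2 * math.pi / n_motors
    return round(angle / step) % n_motors
```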

  9. The effect of haptic guidance and visual feedback on learning a complex tennis task.

    PubMed

    Marchal-Crespo, Laura; van Raai, Mark; Rauter, Georg; Wolf, Peter; Riener, Robert

    2013-11-01

    While haptic guidance can improve ongoing performance of a motor task, several studies have found that it ultimately impairs motor learning. However, some recent studies suggest that the haptic demonstration of optimal timing, rather than movement magnitude, enhances learning in subjects trained with haptic guidance. Timing of an action plays a crucial role in the proper accomplishment of many motor skills, such as hitting a moving object (discrete timing task) or learning a velocity profile (time-critical tracking task). The aim of the present study is to evaluate which feedback conditions (visual or haptic guidance) optimize learning of the discrete and continuous elements of a timing task. The experiment consisted of performing a fast tennis forehand stroke in a virtual environment. A tendon-based parallel robot connected to the end of a racket was used to apply haptic guidance during training. In two different experiments, we evaluated which feedback condition was more adequate for learning: (1) a time-dependent discrete task, learning to start a tennis stroke, and (2) a tracking task, learning to follow a velocity profile. The effect that task difficulty and the subject's initial skill level have on the selection of the optimal training condition was further evaluated. Results showed that the training condition that maximizes learning of the discrete time-dependent motor task depends on the subjects' initial skill level. Haptic guidance was especially suitable for less-skilled subjects and in especially difficult discrete tasks, while visual feedback seemed to benefit more-skilled subjects. Additionally, haptic guidance seemed to promote learning in the time-critical tracking task, while visual feedback tended to deteriorate performance independently of task difficulty and subjects' initial skill level. Haptic guidance outperformed visual feedback, although additional studies are needed to further analyze the effect of other types of feedback visualization on motor learning of time-critical tasks.

  10. Office of Education Guide to Graphic Art Software

    NASA Technical Reports Server (NTRS)

    Davis, Angela M.

    1995-01-01

    During the summer experience in the LARSS program, the author created a performance support system showing the techniques of creating text in Quark XPress, placed the text into Adobe Illustrator along with scanned images, signatures and art work partially created in Adobe Photoshop. The purpose of the project was to familiarize the Office of Education Staff with Graphic Arts and the computer skills utilized to typeset and design certificates, brochures, cover pages, manuals, etc.

  11. Haptic Cues for Balance: Use of a Cane Provides Immediate Body Stabilization

    PubMed Central

    Sozzi, Stefania; Crisafulli, Oscar; Schieppati, Marco

    2017-01-01

    Haptic cues are important for balance. Knowledge of the temporal features of their effect may be crucial for the design of neural prostheses. Touching a stable surface with a fingertip reduces body sway in standing subjects with eyes closed (EC), and removal of the haptic cue reinstates a large sway pattern. Changes in sway occur rapidly on changing haptic conditions. Here, we describe the effects and time-course of the stabilization produced by a haptic cue derived from a walking cane. We intended to confirm that cane use reduces body sway, to evaluate the effect of vision on stabilization by a cane, and to estimate the delay of the changes in body sway after addition and withdrawal of the haptic input. Seventeen healthy young subjects stood in tandem position on a force platform, with eyes closed or open (EO). They gently lowered the cane onto and lifted it from a second force platform. Sixty trials per direction of haptic shift (Touch → NoTouch, T-NT; NoTouch → Touch, NT-T) and visual condition (EC-EO) were acquired. Traces of Center of foot Pressure (CoP) and of the force exerted by the cane were filtered, rectified, and averaged. The position in space of a reflective marker on the cane tip was also acquired by an optoelectronic device. Cross-correlation (CC) analysis was performed between traces of cane-tip and CoP displacement. Latencies of changes in CoP oscillation in the frontal plane under EC following the T-NT and NT-T haptic shifts were statistically estimated. The CoP oscillations were larger in EC than EO under both T and NT (p < 0.001) and larger during NT than T conditions (p < 0.001). The haptic-induced effect under EC (Romberg quotient NT/T ~ 1.2) was weaker than that of vision under the NT condition (EC/EO ~ 1.5) (p < 0.001). With EO, the cane had little effect. Cane displacement lagged CoP displacement under both EC and EO. Latencies of changes in CoP oscillations were longer after addition (NT-T, about 1.6 s) than withdrawal (T-NT, about 0.9 s) of the haptic input (p < 0.001). These latencies were similar to those occurring on fingertip touch, as previously shown. Overall, the data speak in favor of a substantial equivalence between the haptic information derived from “direct” fingertip contact and from “indirect” contact with the floor mediated by the cane. Cane, finger and visual inputs would be similarly integrated in the same neural centers for balance control. Haptic input from a walking aid and its processing time should be considered when designing prostheses for locomotion. PMID:29311785
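
    The cross-correlation step used to relate the cane-tip and CoP traces can be sketched as a lag estimate on mean-removed signals. This is a minimal illustration, not the authors' exact analysis pipeline:

```python
import numpy as np

def lag_of_max_xcorr(x, y):
    """Lag (in samples) at which y best aligns with x.

    A positive lag means y is delayed relative to x; both traces are
    mean-removed before correlating.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    corr = np.correlate(y, x, mode="full")  # lags -(n-1) .. (n-1)
    return int(np.argmax(corr) - (len(x) - 1))
```

    At a known sampling rate fs, dividing the lag by fs converts it to a latency in seconds.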

  12. Multiple reference frames in haptic spatial processing

    NASA Astrophysics Data System (ADS)

    Volčič, R.

    2008-08-01

    The present thesis focused on haptic spatial processing. In particular, our interest was directed to the perception of spatial relations with the main focus on the perception of orientation. To this end, we studied haptic perception in different tasks, either in isolation or in combination with vision. The parallelity task, where participants have to match the orientations of two spatially separated bars, was used in its two-dimensional and three-dimensional versions in Chapter 2 and Chapter 3, respectively. The influence of non-informative vision and visual interference on performance in the parallelity task was studied in Chapter 4. A different task, the mental rotation task, was introduced in a purely haptic study in Chapter 5 and in a visuo-haptic cross-modal study in Chapter 6. The interaction of multiple reference frames and their influence on haptic spatial processing were the common denominators of these studies. In this thesis we approached the problems of which reference frames play the major role in haptic spatial processing and how the relative roles of distinct reference frames change depending on the available information and the constraints imposed by different tasks. We found that the influence of a reference frame centered on the hand was the major cause of the deviations from veridicality observed in both the two-dimensional and three-dimensional studies. The results were described by a weighted average model, in which the hand-centered egocentric reference frame is supposed to have a biasing influence on the allocentric reference frame. Performance in haptic spatial processing has been shown to depend also on sources of information or processing that are not strictly connected to the task at hand. When non-informative vision was provided, a beneficial effect was observed in the haptic performance. This improvement was interpreted as a shift from the egocentric to the allocentric reference frame. 
Moreover, interfering visual information presented in the vicinity of the haptic stimuli parametrically modulated the magnitude of the deviations. The influence of the hand-centered reference frame was shown also in the haptic mental rotation task where participants were quicker in judging the parity of objects when these were aligned with respect to the hands than when they were physically aligned. Similarly, in the visuo-haptic cross-modal mental rotation task the parity judgments were influenced by the orientation of the exploring hand with respect to the viewing direction. This effect was shown to be modulated also by an intervening temporal delay that supposedly counteracts the influence of the hand-centered reference frame. We suggest that the hand-centered reference frame is embedded in a hierarchical structure of reference frames where some of these emerge depending on the demands and the circumstances of the surrounding environment and the needs of an active perceiver.

  13. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality

    PubMed Central

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-01-01

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality. PMID:28513545

  14. Design of a 4-DOF MR haptic master for application to robot surgery: virtual environment work

    NASA Astrophysics Data System (ADS)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-09-01

    This paper presents the design and control performance of a novel type of 4-degrees-of-freedom (4-DOF) haptic master in cyberspace for a robot-assisted minimally invasive surgery (RMIS) application. By using a controllable magnetorheological (MR) fluid, the proposed haptic master can provide a feedback function for a surgical robot. Due to the difficulty of utilizing real human organs in the experiment, a cyberspace featuring a virtual object is constructed to evaluate the performance of the haptic master. To realize the cyberspace, a volumetric deformable object is represented by a shape-retaining chain-linked (S-chain) model, which is a fast volumetric model suitable for real-time applications. In the haptic architecture for an RMIS application, the desired torque and position induced from the virtual object of the cyberspace and the haptic master of real space are transferred to each other. To validate the superiority of the proposed master and volumetric model, a tracking control experiment is implemented with a nonhomogeneous volumetric cubic object to demonstrate that the proposed model can be utilized in a real-time haptic rendering architecture. A proportional-integral-derivative (PID) controller is then designed and empirically implemented to accomplish the desired torque trajectories. It has been verified from the experiment that tracking control performance for torque trajectories from a virtual slave can be successfully achieved.
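
    The PID torque-tracking loop can be sketched in discrete form as below; the gains are illustrative, not the values tuned for the actual MR haptic master:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, desired, measured):
        err = desired - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

    Stepping a crude first-order torque plant with this loop settles it onto a constant setpoint; tracking the time-varying trajectories from the virtual slave works the same way, one sample at a time.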

  15. A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality.

    PubMed

    Kim, Mingyu; Jeon, Changyu; Kim, Jinmo

    2017-05-17

    This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality.

  16. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  17. Cyber integrated MEMS microhand for biological applications

    NASA Astrophysics Data System (ADS)

    Weissman, Adam; Frazier, Athena; Pepen, Michael; Lu, Yen-Wen; Yang, Shanchieh Jay

    2009-05-01

    Anthropomorphous robotic hands at the microscale have been developed to receive information and perform tasks for biological applications. To emulate the human hand's dexterity, the microhand requires a master-slave interface with a wearable controller, force sensors, and perception displays for tele-manipulation. Recognizing the constraints and complexity imposed on the feedback interface during miniaturization, this project addresses the need by creating an integrated cyber environment that incorporates sensors with a microhand, haptic/visual display, and object model, to emulate the human hand's psychophysical perception at the microscale.

  18. Perceptual Grouping in Haptic Search: The Influence of Proximity, Similarity, and Good Continuation

    ERIC Educational Resources Information Center

    Overvliet, Krista E.; Krampe, Ralf Th.; Wagemans, Johan

    2012-01-01

    We conducted a haptic search experiment to investigate the influence of the Gestalt principles of proximity, similarity, and good continuation. We expected faster search when the distractors could be grouped. We chose edges at different orientations as stimuli because they are processed similarly in the haptic and visual modality. We therefore…

  19. Immediate Memory for Haptically-Examined Braille Symbols by Blind and Sighted Subjects.

    ERIC Educational Resources Information Center

    Newman, Slater E.; And Others

    The paper reports on two experiments in Braille learning which compared blind and sighted subjects on the immediate recall of haptically-examined Braille symbols. In the first study, sighted subjects (N=64) haptically examined each of a set of Braille symbols with their preferred or nonpreferred hand and immediately recalled the symbol by drawing…

  20. Haptic Cues Used for Outdoor Wayfinding by Individuals with Visual Impairments

    ERIC Educational Resources Information Center

    Koutsoklenis, Athanasios; Papadopoulos, Konstantinos

    2014-01-01

    Introduction: The study presented here examines which haptic cues individuals with visual impairments use more frequently and determines which of these cues are deemed by these individuals to be the most important for way-finding in urban environments. It also investigates the ways in which these haptic cues are used by individuals with visual…

  1. Cortical Activation Patterns during Long-Term Memory Retrieval of Visually or Haptically Encoded Objects and Locations

    ERIC Educational Resources Information Center

    Stock, Oliver; Roder, Brigitte; Burke, Michael; Bien, Siegfried; Rosler, Frank

    2009-01-01

    The present study used functional magnetic resonance imaging to delineate cortical networks that are activated when objects or spatial locations encoded either visually (visual encoding group, n = 10) or haptically (haptic encoding group, n = 10) had to be retrieved from long-term memory. Participants learned associations between auditorily…

  2. Physical Student-Robot Interaction with the ETHZ Haptic Paddle

    ERIC Educational Resources Information Center

    Gassert, R.; Metzger, J.; Leuenberger, K.; Popp, W. L.; Tucker, M. R.; Vigaru, B.; Zimmermann, R.; Lambercy, O.

    2013-01-01

    Haptic paddles--low-cost one-degree-of-freedom force feedback devices--have been used with great success at several universities throughout the US to teach the basic concepts of dynamic systems and physical human-robot interaction (pHRI) to students. The ETHZ haptic paddle was developed for a new pHRI course offered in the undergraduate…

  3. Haptic perception and body representation in lateral and medial occipito-temporal cortices.

    PubMed

    Costantini, Marcello; Urgesi, Cosimo; Galati, Gaspare; Romani, Gian Luca; Aglioti, Salvatore M

    2011-04-01

    Although vision is the primary sensory modality that humans and other primates use to identify objects in the environment, we can recognize crucial object features (e.g., shape, size) using the somatic modality. Previous studies have shown that the occipito-temporal areas dedicated to the visual processing of object forms, faces and bodies also show category-selective responses when the preferred stimuli are haptically explored out of view. Visual processing of human bodies engages specific areas in lateral (extrastriate body area, EBA) and medial (fusiform body area, FBA) occipito-temporal cortex. This study aimed at exploring the relative involvement of EBA and FBA in the haptic exploration of body parts. During fMRI scanning, participants were asked to haptically explore either real-size fake body parts or objects. We found a selective activation of right and left EBA, but not of right FBA, while participants haptically explored body parts as compared to real objects. This suggests that EBA may integrate visual body representations with somatosensory information regarding body parts and form a multimodal representation of the body. Furthermore, both left and right EBA showed a comparable level of body selectivity during haptic perception and visual imagery. However, right but not left EBA was more activated during haptic exploration than visual imagery of body parts, ruling out that the response to haptic body exploration was entirely due to the use of visual imagery. Overall, the results point to the existence of different multimodal body representations in the occipito-temporal cortex which are activated during perception and imagery of human body parts. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Perceptual grouping determines haptic contextual modulation.

    PubMed

    Overvliet, K E; Sayim, B

    2016-09-01

    Since the early phenomenological demonstrations of Gestalt principles, one of the major challenges of Gestalt psychology has been to quantify these principles. Here, we show that contextual modulation, i.e. the influence of context on target perception, can be used as a tool to quantify perceptual grouping in the haptic domain, similar to the visual domain. We investigated the influence of target-flanker grouping on performance in haptic vernier offset discrimination. We hypothesized that when, despite the apparent differences between vision and haptics, similar grouping principles are operational, a similar pattern of flanker interference would be observed in the haptic as in the visual domain. Participants discriminated the offset of a haptic vernier. The vernier was flanked by different flanker configurations: no flankers, single flanking lines, 10 flanking lines, rectangles and single perpendicular lines, varying the degree to which the vernier grouped with the flankers. Additionally, we used two different flanker widths (same width as and narrower than the target), again to vary target-flanker grouping. Our results show a clear effect of flankers: performance was much better when the vernier was presented alone compared to when it was presented with flankers. In the majority of flanker configurations, grouping between the target and the flankers determined the strength of interference, similar to the visual domain. However, in the same width rectangular flanker condition we found aberrant results. We discuss the results of our study in light of similarities and differences between vision and haptics and the interaction between different grouping principles. We conclude that in haptics, similar organization principles apply as in visual perception and argue that grouping and Gestalt are key organization principles not only of vision, but of the perceptual system in general. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Haptic biofeedback for improving compliance with lower-extremity partial weight bearing.

    PubMed

    Fu, Michael C; DeLuke, Levi; Buerba, Rafael A; Fan, Richard E; Zheng, Ying Jean; Leslie, Michael P; Baumgaertner, Michael R; Grauer, Jonathan N

    2014-11-01

    After lower-extremity orthopedic trauma and surgery, patients are often advised to restrict weight bearing on the affected limb. Conventional training methods are not effective at enabling patients to comply with recommendations for partial weight bearing. The current study assessed a novel method of using real-time haptic (vibratory/vibrotactile) biofeedback to improve compliance with instructions for partial weight bearing. Thirty healthy, asymptomatic participants were randomized into 1 of 3 groups: verbal instruction, bathroom scale training, and haptic biofeedback. Participants were instructed to restrict lower-extremity weight bearing in a walking boot with crutches to 25 lb, with an acceptable range of 15 to 35 lb. A custom weight bearing sensor and biofeedback system was attached to all participants, but only those in the haptic biofeedback group were given a vibrotactile signal if they exceeded the acceptable range. Weight bearing in all groups was measured with a separate validated commercial system. The verbal instruction group bore an average of 60.3±30.5 lb (mean±standard deviation). The bathroom scale group averaged 43.8±17.2 lb, whereas the haptic biofeedback group averaged 22.4±9.1 lb (P<.05). As a percentage of body weight, the verbal instruction group averaged 40.2±19.3%, the bathroom scale group averaged 32.5±16.9%, and the haptic biofeedback group averaged 14.5±6.3% (P<.05). In this initial evaluation of the use of haptic biofeedback to improve compliance with lower-extremity partial weight bearing, haptic biofeedback was superior to conventional physical therapy methods. Further studies in patients with clinical orthopedic trauma are warranted. Copyright 2014, SLACK Incorporated.
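
    The biofeedback rule in this protocol reduces to a band check on the measured limb load (15 to 35 lb around the 25 lb target). A minimal sketch; whether the study cued under-loading as well as overshoot is an assumption here:

```python
def vibrate(load_lb, low=15.0, high=35.0):
    """True when the vibrotactile cue should fire: the measured limb
    load falls outside the acceptable partial weight-bearing band."""
    return load_lb < low or load_lb > high
```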

  6. Enhanced operator perception through 3D vision and haptic feedback

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Light, Kenneth; Bodenhamer, Andrew; Bosscher, Paul; Wilkinson, Loren

    2012-06-01

    Polaris Sensor Technologies (PST) has developed a stereo vision upgrade kit for TALON® robot systems, comprising a replacement gripper camera and a replacement mast zoom camera on the robot, and a replacement display in the Operator Control Unit (OCU). Harris Corporation has developed a haptic manipulation upgrade for TALON® robot systems, comprising a replacement arm and gripper and an OCU that provides haptic (force) feedback. PST and Harris have recently collaborated to integrate the 3D vision system with the haptic manipulation system. Multiple studies conducted at Fort Leonard Wood, Missouri have shown that 3D vision and haptics provide more intuitive perception of complicated scenery and improved robot arm control, allowing for improved mission performance and the potential for reduced time on target. This paper discusses the potential benefits of these enhancements to robotic systems used for the domestic homeland security mission.

  7. [Haptic tracking control for minimally invasive robotic surgery].

    PubMed

    Xu, Zhaohong; Song, Chengli; Wu, Wenwu

    2012-06-01

    Haptic feedback plays a significant role in minimally invasive robotic surgery (MIRS). A major deficiency of current MIRS is the lack of haptic perception for the surgeon, including in the commercially available da Vinci surgical system. In this paper, a dynamics model of a haptic robot is established based on the Newton-Euler method. Because computing the exact dynamics solution takes considerable time, we used a digital PID algorithm based on the robot dynamics to ensure real-time bilateral control, which improves tracking precision and real-time control efficiency. To validate the proposed method, an experimental system was developed in which two Novint Falcon haptic devices act as a master-slave system. Simulations and experiments showed that the proposed methods can provide instrument force feedback to the operator, and that the bilateral control strategy is an effective approach to master-slave MIRS. The proposed methods could also be applied to tele-robotic systems.
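
The digital PID update invoked in the abstract has a standard discrete form; a dependency-free sketch follows, with illustrative gains and time step rather than the paper's values:

```python
class DiscretePID:
    """Digital PID controller of the kind used for real-time bilateral
    master-slave tracking (sketch; gains are illustrative)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        # Discrete PID: rectangular integration, backward-difference derivative.
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

In a bilateral loop, one such controller drives the slave toward the master's position each cycle, while the measured interaction force is reflected back to the master device.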

  8. Human-computer interface including haptically controlled interactions

    DOEpatents

    Anderson, Thomas G.

    2005-10-11

    The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
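
The force-to-scroll mapping described above can be sketched directly; the boundary threshold, gain, and saturation values below are placeholders, not values from the patent:

```python
def scroll_rate(applied_force, boundary_force=1.0, gain=50.0, max_rate=500.0):
    """Map force pressed against a haptic boundary to a signed scroll rate.

    Below the boundary threshold nothing scrolls; above it, the rate grows
    with the magnitude of the applied force, saturating at max_rate.
    All parameter values are illustrative.
    """
    excess = abs(applied_force) - boundary_force
    if excess <= 0:
        return 0.0
    rate = min(gain * excess, max_rate)
    return rate if applied_force > 0 else -rate
```

The sign of the applied force selects the scroll direction, so a single boundary can serve both scroll directions.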

  9. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    PubMed

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run on UNIX-like operating systems. Graphical output is often not provided, or is left to separate command-line tools such as gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications enable it to generate additional data, which allows GenePool to interact with the .NET components we created. The programs we developed are GPFrontend, a graphical user interface and frontend used to run GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate the output of different WGA analysis programs, GenePool among them. Our programs enable regular MS Windows users without much bioinformatics experience to easily visualize whole genome data from a variety of sources.

  10. Structural health monitoring for bolt loosening via a non-invasive vibro-haptics human-machine cooperative interface

    NASA Astrophysics Data System (ADS)

    Pekedis, Mahmut; Mascerañas, David; Turan, Gursoy; Ercan, Emre; Farrar, Charles R.; Yildiz, Hasan

    2015-08-01

    For the last two decades, developments in damage detection algorithms have greatly increased the potential for autonomous decisions about structural health. However, we are still struggling to build autonomous tools that can match the ability of a human to detect, localize, and quantify damage in structures. Therefore, there is growing interest in merging computational and cognitive concepts to improve solutions for structural health monitoring (SHM). The main objective of this research is to apply a human-machine cooperative approach to damage detection in a tower structure. The cooperative approach uses haptic tools to create an appropriate collaboration between SHM sensor networks, statistical compression techniques, and humans. Damage is simulated in the structure by releasing some of the bolt loads. Accelerometers are bonded to various locations on the tower members to acquire the dynamic response of the structure. The accelerometer measurements are encoded in three different ways to present them as haptic stimuli to the human subjects. The participants are then exposed to each of these stimuli to detect the bolt-loosening damage in the tower. Results obtained from the human-machine cooperation demonstrate that the human subjects were able to recognize the damage with an accuracy of 88 ± 20.21% and a response time of 5.87 ± 2.33 s. It is concluded that the human-machine cooperative SHM developed here may provide a useful framework for interacting with abstract entities such as data from a sensor network.

  11. Innovative approaches to the rehabilitation of upper extremity hemiparesis using virtual environments

    PubMed Central

    MERIANS, A. S.; TUNIK, E.; FLUET, G. G.; QIU, Q.; ADAMOVICH, S. V.

    2017-01-01

    Aim Upper-extremity interventions for hemiparesis are a challenging aspect of stroke rehabilitation. The purpose of this paper is to report the feasibility of using virtual environments (VEs) in combination with robotics to assist recovery of hand-arm function, and to present preliminary data demonstrating the potential of using sensory manipulations in VEs to drive activation in targeted neural regions. Methods We trained 8 subjects for 8 three-hour sessions using a library of complex VEs integrated with robots, comparing training the arm and hand separately to training the arm and hand together. Instrumented gloves and a hand exoskeleton were used for hand tracking and haptic effects. A Haptic Master robotic arm was used for arm tracking and for generating three-dimensional haptic VEs. To investigate the use of manipulations in VEs to drive neural activations, we created a “virtual mirror” that subjects used while performing a unimanual task. Cortical activation was measured with functional MRI (fMRI) and transcranial magnetic stimulation. Results Both groups showed improvement in kinematics and in measures of real-world function. The group trained using their arm and hand together showed greater improvement. In a stroke subject, fMRI data suggested that virtual mirror feedback could activate the sensorimotor cortex contralateral to the reflected hand (ipsilateral to the moving hand), thus recruiting the lesioned hemisphere. Conclusion Gaming simulations interfaced with robotic devices provide a training medium that can modify movement patterns. In addition to showing that our VE therapies can optimize behavioral performance, we show preliminary evidence supporting the potential of using specific sensory manipulations to selectively recruit targeted neural circuits. PMID:19158659

  12. A design of hardware haptic interface for gastrointestinal endoscopy simulation.

    PubMed

    Gu, Yunjin; Lee, Doo Yong

    2011-01-01

    Gastrointestinal endoscopy simulations have been developed to train endoscopic procedures, which require hundreds of practice sessions to perform competently. Even though realistic haptic feedback is important for providing a realistic sensation to the user, most previous simulations, including commercialized ones, have mainly focused on providing realistic visual feedback. In this paper, we propose a novel design of a portable haptic interface for gastrointestinal endoscopy simulation that provides 2-DOF force feedback. The haptic interface consists of completely decoupled translational and rotational force feedback mechanisms, and a gripping mechanism for controlling the connection between the endoscope and the force feedback mechanisms.

  13. Haptic Guidance Improves the Visuo-Manual Tracking of Trajectories

    PubMed Central

    Bluteau, Jérémy; Coquillart, Sabine; Payan, Yohan; Gentaz, Edouard

    2008-01-01

    Background Learning to perform new movements is usually achieved by following visual demonstrations. Haptic guidance by a force feedback device is a recent and original technology which provides additional proprioceptive cues during visuo-motor learning tasks. The effects of two types of haptic guidance, control in position (HGP) or in force (HGF), on visuo-manual tracking (“following”) of trajectories are still under debate. Methodology/Principal Findings Three training techniques of haptic guidance (HGP, HGF, or a control condition, NHG, without haptic guidance) were evaluated in two experiments. Movements produced by adults were assessed in terms of shape (dynamic time warping) and kinematic criteria (number of velocity peaks and mean velocity) before and after the training sessions. Trajectories consisted of two Arabic and two Japanese-inspired letters in Experiment 1 and ellipses in Experiment 2. We observed that the use of HGF globally improves the fluency of the visuo-manual tracking of trajectories, while no significant improvement was found for HGP or NHG. Conclusion/Significance These results show that the addition of haptic information, probably encoded in force coordinates, plays a crucial role in the visuo-manual tracking of new trajectories. PMID:18335049
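
Of the movement criteria above, dynamic time warping has a standard algorithmic core; a dependency-free sketch for 1-D trajectories:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D trajectories,
    the shape criterion named in the abstract. O(len(a)*len(b))."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Because the warping path may stretch either sequence, a trajectory traced at a different speed but with the same shape yields a distance of zero, which is what makes DTW a pure shape criterion.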

  14. The effect of perceptual grouping on haptic numerosity perception.

    PubMed

    Verlaers, K; Wagemans, J; Overvliet, K E

    2015-01-01

    We used a haptic enumeration task to investigate whether enumeration can be facilitated by perceptual grouping in the haptic modality. Eight participants were asked to count tangible dots as quickly and accurately as possible, while moving their finger pad over a tactile display. In Experiment 1, we manipulated the number and organization of the dots, while keeping the total exploration area constant. The dots were either evenly distributed on a horizontal line (baseline condition) or organized into groups based on either proximity (dots placed in closer proximity to each other) or configural cues (dots placed in a geometric configuration). In Experiment 2, we varied the distance between the subsets of dots. We hypothesized that when subsets of dots can be grouped together, the enumeration time will be shorter and accuracy will be higher than in the baseline condition. The results of both experiments showed faster enumeration for the configural condition than for the baseline condition, indicating that configural grouping also facilitates haptic enumeration. In Experiment 2, faster enumeration was also observed for the proximity condition than for the baseline condition. Thus, perceptual grouping speeds up haptic enumeration by both configural and proximity cues, suggesting that similar mechanisms underlie perceptual grouping in both visual and haptic enumeration.

  15. Acquisition and Visualization Techniques of Human Motion Using Master-Slave System and Haptograph

    NASA Astrophysics Data System (ADS)

    Katsura, Seiichiro; Ohishi, Kiyoshi

    Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means. Furthermore, a video camera and a television make it possible to transmit visual sensation by broadcasting. By contrast, since tactile or haptic information is subject to Newton's “law of action and reaction” in the real world, a device which acquires, transmits, and reproduces such information has not been established. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel acquisition method for haptic information named the “haptograph”. The haptograph visualizes haptic information in the same way a photograph visualizes light. Since temporal and spatial analyses are conducted to represent haptic information as a haptograph, it can be recognized and evaluated intuitively. In this paper, the proposed haptograph is applied to the visualization of human motion. It can represent motion characteristics such as an expert's skill or a personal habit; in other words, a personal encyclopedia of motion is attained. Once such a personal encyclopedia is stored in a ubiquitous environment, future human support technology can be developed from it.

  16. Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display

    PubMed Central

    Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok

    2008-01-01

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module as well as empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance indicating that it can satisfactorily represent Braille numbers for both the normal and the blind. In order to prove haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button has been conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides combination of force, tactile and impact feedback, three haptic representation methods for texture display have been compared on surface with 3 texture groups which differ in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor. PMID:18317520

  17. Haptic stylus and empirical studies on braille, button, and texture display.

    PubMed

    Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok

    2008-01-01

    This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module as well as empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance indicating that it can satisfactorily represent Braille numbers for both the normal and the blind. In order to prove haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button has been conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides combination of force, tactile and impact feedback, three haptic representation methods for texture display have been compared on surface with 3 texture groups which differ in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor.

  18. OzBot and haptics: remote surveillance to physical presence

    NASA Astrophysics Data System (ADS)

    Mullins, James; Fielding, Mick; Nahavandi, Saeid

    2009-05-01

    This paper reports on robotic and haptic technologies and capabilities developed for the law enforcement and defence community within Australia by the Centre for Intelligent Systems Research (CISR). The OzBot series of small and medium surveillance robots have been designed in Australia and evaluated by law enforcement and defence personnel to determine suitability and ruggedness in a variety of environments. Using custom-developed digital electronics and featuring expandable data busses including RS485, I2C, RS232, video and Ethernet, the robots can be directly connected to many off-the-shelf payloads such as gas sensors, x-ray sources and camera systems including thermal and night vision. Differentiating the OzBot platform from its peers is its ability to be integrated directly with haptic technology, or the 'haptic bubble', developed by CISR. Haptic interfaces allow an operator to physically 'feel' remote environments through position-force control and experience realistic force feedback. By adding the capability to remotely grasp an object and feel its weight, texture and other physical properties in real-time from the remote ground control unit, an operator's situational awareness is greatly improved through haptic augmentation in an environment where remote-system feedback is often limited.

  19. Haptics-based dynamic implicit solid modeling.

    PubMed

    Hua, Jing; Qin, Hong

    2004-01-01

    This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.
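
The notion of solids defined by volumetric implicit functions can be illustrated with a toy sketch: a point is inside a solid where f(p) ≤ 0, and pointwise min/max combine solids into semi-algebraic sets. This is a generic implicit-modeling idiom, not the paper's actual dynamic representation:

```python
def sphere(cx, cy, cz, r):
    """Implicit sphere: negative inside, zero on the surface, positive outside."""
    return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 - r * r

def union(f, g):
    """Boolean union of two implicit solids via pointwise min."""
    return lambda x, y, z: min(f(x, y, z), g(x, y, z))

def intersection(f, g):
    """Boolean intersection via pointwise max."""
    return lambda x, y, z: max(f(x, y, z), g(x, y, z))

def inside(f, p):
    """Membership test for point p = (x, y, z)."""
    return f(*p) <= 0.0
```

The paper's framework adds physics-based dynamics and hierarchical fitting on top of such functions; the sketch shows only the underlying set representation.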

  20. System and method for creating expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M. (Inventor); Luczak, Edward C. (Inventor)

    1998-01-01

    A system and method provides for the creation of a highly graphical expert system without the need for programming in code. An expert system is created by initially building a data interface, defining appropriate Mission, User-Defined, Inferred, and externally-generated GenSAA (EGG) data variables whose data values will be updated and input into the expert system. Next, rules of the expert system are created by building appropriate conditions of the rules which must be satisfied and then by building appropriate actions of rules which are to be executed upon corresponding conditions being satisfied. Finally, an appropriate user interface is built which can be highly graphical in nature and which can include appropriate message display and/or modification of display characteristics of a graphical display object, to visually alert a user of the expert system of varying data values, upon conditions of a created rule being satisfied. The data interface building, rule building, and user interface building are done in an efficient manner and can be created without the need for programming in code.
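
The condition/action rule structure described above can be sketched generically; the rule representation below is a minimal illustration, not GenSAA's actual rule format, and the variable names are hypothetical:

```python
class Rule:
    """A rule pairs a condition over the data with an action to execute."""

    def __init__(self, condition, action):
        self.condition = condition  # callable(data) -> bool
        self.action = action        # callable(data) -> None

def run_rules(rules, data):
    """Execute the action of every rule whose condition is satisfied."""
    for rule in rules:
        if rule.condition(data):
            rule.action(data)
```

For example, a rule whose condition checks a telemetry value can, as its action, append an alert message or change a display attribute when the condition is satisfied.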

  1. Creating a Strong Foundation with Engineering Design Graphics.

    ERIC Educational Resources Information Center

    Newcomer, Jeffrey L.; McKell, Eric K.; Raudebaugh, Robert A.; Kelley, David S.

    2001-01-01

    Describes the two-course engineering design graphics sequence on introductory design and graphics topics. The first course focuses on conceptual design and the development of visualization and sketching skills while the second one concentrates on detail design and parametric modeling. (Contains 28 references.) (Author/ASK)

  2. Sharing control with haptics: seamless driver support from manual to automatic control.

    PubMed

    Mulder, Mark; Abbink, David A; Boer, Erwin R

    2012-10-01

    Haptic shared control was investigated as a human-machine interface that can intuitively share control between drivers and an automatic controller for curve negotiation. As long as automation systems are not fully reliable, a role remains for the driver to be vigilant to the system and the environment to catch any automation errors. The conventional binary switch between supervisory and manual control has many known issues, and haptic shared control is a promising alternative. A total of 42 respondents of varying age and driving experience participated in a driving experiment in a fixed-base simulator, in which curve negotiation behavior during shared control was compared to behavior during manual control, as well as to three haptic tunings of an automatic controller without driver intervention. Under the experimental conditions studied, the main beneficial effect of haptic shared control compared to manual control was that less control activity (a 16% reduction in steering wheel reversal rate, 15% in standard deviation of steering wheel angle) was needed to realize improved safety performance (e.g., an 11% reduction in peak lateral error). Full automation removed the need for any human control activity and improved safety performance further (e.g., a 35% reduction in peak lateral error) but put the human in a supervisory position. Haptic shared control kept the driver in the loop, with enhanced performance at reduced control activity, mitigating the known issues that plague full automation. Haptic support for vehicular control ultimately seeks to intuitively combine human intelligence and creativity with the benefits of automation systems.
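
Two of the control-activity measures reported, steering wheel reversal rate and standard deviation of steering angle, can be sketched as follows; the 2-degree reversal threshold is a common choice in driving research, not necessarily the paper's:

```python
import statistics

def steering_sd(angles):
    """Population standard deviation of steering wheel angle (degrees)."""
    return statistics.pstdev(angles)

def reversal_count(angles, threshold=2.0):
    """Count steering reversals: direction changes where the wheel moves
    back by more than `threshold` degrees from the last local extreme."""
    reversals = 0
    direction = 0            # +1 increasing, -1 decreasing, 0 unknown
    extreme = angles[0]
    for a in angles[1:]:
        if direction == 0:
            if abs(a - extreme) >= threshold:
                direction = 1 if a > extreme else -1
                extreme = a
        elif direction == 1:
            if a > extreme:
                extreme = a
            elif extreme - a >= threshold:
                reversals += 1
                direction, extreme = -1, a
        else:
            if a < extreme:
                extreme = a
            elif a - extreme >= threshold:
                reversals += 1
                direction, extreme = 1, a
    return reversals
```

The reversal rate reported in such studies is this count divided by the duration of the drive.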

  3. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind

    PubMed Central

    Kitada, Ryo; Okamoto, Yuko; Sasaki, Akihiro T.; Kochiyama, Takanori; Miyahara, Motohide; Lederman, Susan J.; Sadato, Norihiro

    2012-01-01

    Face perception is critical for social communication. Given its fundamental importance in the course of evolution, the innate neural mechanisms can anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in the sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic-resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience. PMID:23372547

  4. Design of a 7-DOF slave robot integrated with a magneto-rheological haptic master

    NASA Astrophysics Data System (ADS)

    Hwang, Yong-Hoon; Cha, Seung-Woo; Kang, Seok-Rae; Choi, Seung-Bok

    2017-04-01

    In this study, a 7-DOF slave robot integrated with a haptic master is designed and its dynamic motion is controlled. The haptic master is built around a controllable magneto-rheological (MR) clutch and brake, and it provides the surgeon with a sense of touch by using both kinetic and kinesthetic information. Due to the size constraints of the slave robot, wire actuation is adopted, instead of a conventional direct-drive motor, to produce the desired motion of the 3-DOF end-effector. The remaining 4-DOF motions of the link parts use direct-drive motors. For the overall system to work as a haptic device, the haptic master needs to receive information about the repulsive forces applied to the slave robot. Therefore, repulsive forces on the end-effector are sensed by three uniaxial torque transducers inserted in the wire actuation system, while repulsive forces applied to the link parts are sensed by a 6-axis transducer capable of measuring both forces and torques. Another 6-axis transducer is used to verify the reliability of the force information at the distal end of the slave robot. Lastly, with the system integrated with the MR haptic master, psychophysical tests are conducted by different operators, who feel the repulsive force or torque generated by the haptic master, equivalent to the force or torque occurring at the end-effector, to demonstrate the effectiveness of the proposed system.

  5. Haptic exploration of fingertip-sized geometric features using a multimodal tactile sensor

    NASA Astrophysics Data System (ADS)

    Ponce Wong, Ruben D.; Hellman, Randall B.; Santos, Veronica J.

    2014-06-01

    Haptic perception remains a grand challenge for artificial hands. Dexterous manipulators could be enhanced by "haptic intelligence" that enables identification of objects and their features via touch alone. Haptic perception of local shape would be useful when vision is obstructed or when proprioceptive feedback is inadequate, as observed in this study. In this work, a robot hand outfitted with a deformable, bladder-type, multimodal tactile sensor was used to replay four human-inspired haptic "exploratory procedures" on fingertip-sized geometric features. The geometric features varied by type (bump, pit), curvature (planar, conical, spherical), and footprint dimension (1.25 - 20 mm). Tactile signals generated by active fingertip motions were used to extract key parameters for use as inputs to supervised learning models. A support vector classifier estimated order of curvature while support vector regression models estimated footprint dimension once curvature had been estimated. A distal-proximal stroke (along the long axis of the finger) enabled estimation of order of curvature with an accuracy of 97%. Best-performing, curvature-specific, support vector regression models yielded R2 values of at least 0.95. While a radial-ulnar stroke (along the short axis of the finger) was most helpful for estimating feature type and size for planar features, a rolling motion was most helpful for conical and spherical features. The ability to haptically perceive local shape could be used to advance robot autonomy and provide haptic feedback to human teleoperators of devices ranging from bomb defusal robots to neuroprostheses.
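
The two-stage estimate described above (classify order of curvature first, then regress footprint dimension with a curvature-specific model) can be sketched with dependency-free stand-ins. The paper uses a support vector classifier and support vector regression models; here a nearest-centroid classifier and per-class least-squares lines on a single toy feature take their place so the structure of the pipeline stays visible:

```python
def fit_two_stage(features, curv_labels, sizes):
    """Toy two-stage estimator: nearest-centroid curvature class, then a
    per-class least-squares line size = a*feature + b. Stand-in for the
    paper's SVC + curvature-specific SVR pipeline."""
    classes = sorted(set(curv_labels))
    centroids, lines = {}, {}
    for c in classes:
        xs = [f for f, l in zip(features, curv_labels) if l == c]
        ys = [s for s, l in zip(sizes, curv_labels) if l == c]
        n = len(xs)
        centroids[c] = sum(xs) / n
        mx, my = sum(xs) / n, sum(ys) / n
        denom = sum((x - mx) ** 2 for x in xs) or 1.0
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / denom
        lines[c] = (a, my - a * mx)

    def estimate(f):
        c = min(classes, key=lambda k: abs(f - centroids[k]))  # stage 1: curvature
        a, b = lines[c]
        return c, a * f + b                                    # stage 2: dimension
    return estimate
```

In the study the features are parameters extracted from the tactile signals of each exploratory stroke; the scalar feature here is only a placeholder for that vector.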

  6. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind.

    PubMed

    Kitada, Ryo; Okamoto, Yuko; Sasaki, Akihiro T; Kochiyama, Takanori; Miyahara, Motohide; Lederman, Susan J; Sadato, Norihiro

    2013-01-01

    Face perception is critical for social communication. Given its fundamental importance in the course of evolution, the innate neural mechanisms can anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in the sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic-resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience.

  7. Learning Layouts for Single-Page Graphic Designs.

    PubMed

    O'Donovan, Peter; Agarwala, Aseem; Hertzmann, Aaron

    2014-08-01

    This paper presents an approach for automatically creating graphic design layouts using a new energy-based model derived from design principles. The model includes several new algorithms for analyzing graphic designs, including the prediction of perceived importance, alignment detection, and hierarchical segmentation. Given the model, we use optimization to synthesize new layouts for a variety of single-page graphic designs. Model parameters are learned with Nonlinear Inverse Optimization (NIO) from a small number of example layouts. To demonstrate our approach, we show results for applications including generating design layouts in various styles, retargeting designs to new sizes, and improving existing designs. We also compare our automatic results with designs created using crowdsourcing and show that our approach performs slightly better than novice designers.
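
A toy instance of the energy-based idea: score each candidate layout with a penalty term and keep the minimum. The paper's model combines many learned terms (perceived importance, alignment detection, hierarchical segmentation); the single left-edge alignment penalty below is only illustrative:

```python
def alignment_energy(elements):
    """Toy energy term: penalize left-edge misalignment among elements,
    where each element is an (x, y, w, h) rectangle."""
    xs = [e[0] for e in elements]
    mean_x = sum(xs) / len(xs)
    return sum((x - mean_x) ** 2 for x in xs)

def best_layout(candidates):
    """Pick the candidate layout with the lowest energy."""
    return min(candidates, key=alignment_energy)
```

A full system would synthesize candidates by optimization and weight many such terms, with the weights learned from example layouts as the paper does via Nonlinear Inverse Optimization.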

  8. Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.

    PubMed

    Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka

    2018-04-01

    This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea associated with the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that the participants had a feeling of presence and felt a steep stair gradient under the condition of haptic feedback and viewpoint manipulation based on the characteristics of actual stair walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We then redesigned the shape of the small bumps, and evaluated the design in a second experiment. The results indicated that the best shape to present haptic stimuli is a right triangle cross section in both the ascending and descending cases. 
Although the small protrusions must be installed in a specific direction, this optimized shape enhanced the users' feeling of presence of the stairs and their sensation of walking up and down.

  9. Intuitive tactile zooming for graphics accessed by individuals who are blind and visually impaired.

    PubMed

    Rastogi, Ravi; Pawluk, T V Dianne; Ketchum, Jessica

    2013-07-01

    One possibility of providing access to visual graphics for those who are visually impaired is to present them tactually; unfortunately, details easily available to vision need to be magnified to be accessible through touch. We therefore propose an "intuitive" zooming algorithm that avoids the problems of directly applying visual zooming techniques to haptic displays, which sense the current location of a user on a virtual diagram with a position sensor and then provide the appropriate local information through either force or tactile feedback. Our technique works by determining and then traversing the levels of an object-tree hierarchy of a diagram. In this manner, the zoom steps adjust to the content being viewed, avoid clipping, and do not zoom when no object is present. The algorithm was tested using a small, "mouse-like" display with tactile feedback on pictures representing houses in a community and boats on a lake. We asked users to answer questions about details in the pictures. Comparing our technique to linear and logarithmic step zooming, we found a significant increase in the correctness of responses (odds ratios of 2.64:1 and 2.31:1, respectively) and in usability (differences of 36% and 19%, respectively) using our "intuitive" zooming technique.
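The tree-traversal idea behind such content-adaptive zooming can be sketched as follows; the `Node` structure and the descend/ascend rules are hypothetical stand-ins for the paper's object hierarchy, not its actual implementation:

```python
# Hypothetical object tree for a diagram: each node has a bounding box
# (x, y, w, h) and child objects. The current zoom level is the node whose
# bbox is fitted to the display, so zoom steps follow the content hierarchy
# rather than fixed linear or logarithmic magnification steps.

class Node:
    def __init__(self, bbox, children=()):
        self.bbox = bbox
        self.children = list(children)
        self.parent = None
        for c in self.children:
            c.parent = self

def contains(bbox, pt):
    x, y, w, h = bbox
    return x <= pt[0] <= x + w and y <= pt[1] <= y + h

def zoom_in(current, cursor):
    # Descend to the child under the cursor; if no object is present,
    # stay at the current level (no zoom), which also avoids clipping.
    for c in current.children:
        if contains(c.bbox, cursor):
            return c
    return current

def zoom_out(current):
    # Ascend one level of the hierarchy; the root is the whole diagram.
    return current.parent if current.parent is not None else current
```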

  10. A Novel Temporal Bone Simulation Model Using 3D Printing Techniques.

    PubMed

    Mowry, Sarah E; Jammal, Hachem; Myer, Charles; Solares, Clementino Arturo; Weinberger, Paul

    2015-09-01

    An inexpensive temporal bone model for use in a temporal bone dissection laboratory setting can be made using a commercially available, consumer-grade 3D printer. Several models for a simulated temporal bone have been described but use commercial-grade printers and materials to produce these models. The goal of this project was to produce a plastic simulated temporal bone on an inexpensive 3D printer that recreates the visual and haptic experience associated with drilling a human temporal bone. Images from a high-resolution CT of a normal temporal bone were converted into stereolithography files via commercially available software, with image conversion and print settings adjusted to achieve optimal print quality. The temporal bone model was printed using acrylonitrile butadiene styrene (ABS) plastic filament on a MakerBot 2x 3D printer. Simulated temporal bones were drilled by seven expert temporal bone surgeons, assessing the fidelity of the model as compared with a human cadaveric temporal bone. Using a four-point scale, the simulated bones were assessed for haptic experience and recreation of the temporal bone anatomy. The created model was felt to be an accurate representation of a human temporal bone. All raters felt strongly this would be a good training model for junior residents or to simulate difficult surgical anatomy. Material cost for each model was $1.92. A realistic, inexpensive, and easily reproducible temporal bone model can be created on a consumer-grade desktop 3D printer.

  11. Teaching graphics in technical communication classes

    NASA Technical Reports Server (NTRS)

    Spurgeon, K. C.

    1981-01-01

    Graphic aids convey and clarify information more efficiently and accurately than words alone; therefore, most technical writing includes graphics. Ways of accumulating and presenting graphic illustrations on a shoestring budget are suggested. These include collecting graphics from companies' annual reports and laminating them for workshop use or putting them on a flip chart for classroom presentation, creating overhead transparencies to demonstrate different levels of effectiveness of graphic aids, and bringing in graphic artists for question/answer periods or in-class workshops. Also included are an extensive handout as an introduction to graphics, sample assignments, and a selected and annotated bibliography.

  12. Program for Generating Graphs and Charts

    NASA Technical Reports Server (NTRS)

    Ackerson, C. T.

    1986-01-01

    The Office Automation Pilot (OAP) Graphics Database system offers the IBM personal computer user assistance in producing a wide variety of graphs and charts, plus a convenient database system, called the chart base, for creating and maintaining data associated with graphs and charts. Thirteen different graphics packages are available, and access to their graphics capabilities is obtained in a similar manner: the user chooses creation, revision, or chart-base-maintenance options from an initial menu, then enters or modifies data displayed on a graphic chart. The OAP graphics database system is written in Microsoft PASCAL.

  13. A survey of telerobotic surface finishing

    NASA Astrophysics Data System (ADS)

    Höglund, Thomas; Alander, Jarmo; Mantere, Timo

    2018-05-01

    This is a survey of research published on the subjects of telerobotics, haptic feedback, and mixed reality applied to surface finishing. The survey especially focuses on how visuo-haptic feedback can be used to improve a grinding process using a remote manipulator or robot. The benefits of teleoperation and reasons for using haptic feedback are presented. The use of genetic algorithms for optimizing haptic sensing is briefly discussed. Ways of augmenting the operator's vision are described. Visual feedback can be used to find defects and analyze the quality of the surface resulting from the surface finishing process. Visual cues can also be used to aid a human operator in manipulating a robot precisely and avoiding collisions.

  14. Differences between early-blind, late-blind, and blindfolded-sighted people in haptic spatial-configuration learning and resulting memory traces.

    PubMed

    Postma, Albert; Zuidhoek, Sander; Noordzij, Matthijs L; Kappers, Astrid M L

    2007-01-01

    The roles of visual and haptic experience in different aspects of haptic processing of objects in peripersonal space are examined. In three trials, early-blind, late-blind, and blindfolded-sighted individuals had to match ten shapes haptically to the cut-outs in a board as fast as possible. Both blind groups were much faster than the sighted in all three trials. All three groups improved considerably from trial to trial. In particular, the sighted group showed a strong improvement from the first to the second trial. While superiority of the blind remained for speeded matching after rotation of the stimulus frame, coordinate positional-memory scores in a non-speeded free-recall trial showed no significant differences between the groups. Moreover, when assessed with a verbal response, categorical spatial-memory appeared strongest in the late-blind group. The role of haptic and visual experience thus appears to depend on the task aspect tested.

  15. Forces on intraocular lens haptics induced by capsular fibrosis. An experimental study.

    PubMed

    Guthoff, R; Abramo, F; Draeger, J; Chumbley, L C; Lang, G K; Neumann, W

    1990-01-01

    Electronic dynamometry measurements, performed upon intraocular lens (IOL) haptics of prototype one-piece three-loop silicone lenses, accurately defined the relationships between elastic force and haptic displacement. Lens implantations in the capsular bag of dogs (loop span equal to capsular bag diameter, loops undeformed immediately after the operation) were evaluated macrophotographically 5-8 months postoperatively. The highly constant elastic property of silicone rubber permitted quantitative correlation of subsequent in vivo haptic displacement with the resultant force vectors responsible for tissue contraction. The lens optics were well centered in 17 (85%) and slightly off-center in 3 (15%) of 20 implanted eyes. Of the 60 supporting loops, 28 could be visualized sufficiently well to permit reliable haptic measurement. Of these 28, 20 (71%) were clearly displaced, ranging from 0.45 mm away from to 1.4 mm towards the lens' optic center. These extremes represented resultant vector forces of 0.20 and 1.23 mN, respectively. Quantitative vector analysis permits better understanding of IOL-capsular interactions.

  16. A haptic-inspired audio approach for structural health monitoring decision-making

    NASA Astrophysics Data System (ADS)

    Mao, Zhu; Todd, Michael; Mascareñas, David

    2015-03-01

    Haptics is the field at the interface of human touch (tactile sensation) and classification, whereby tactile feedback is used to train and inform a decision-making process. In structural health monitoring (SHM) applications, haptic devices have been introduced and applied in a simplified laboratory-scale scenario, in which nonlinearity, representing the presence of damage, was encoded into a vibratory manual interface. In this paper, the "spirit" of haptics is adopted, but ultrasonic guided-wave scattering information is transformed into audio-range (rather than tactile) signals. After sufficient training, the structural damage condition, including occurrence and location, can be identified through the encoded audio waveforms. Different algorithms are employed to generate the transformed audio signals; the performance of each encoding algorithm is compared with the others and with standard machine-learning classifiers. In the long run, this haptic-inspired decision-making approach aims to detect and classify structural damage in more rigorous environments, in a baseline-free fashion with embedded temperature compensation.

  17. Real-time dual-band haptic music player for mobile devices.

    PubMed

    Hwang, Inwook; Lee, Hyeseon; Choi, Seungmoon

    2013-01-01

    We introduce a novel dual-band haptic music player for real-time simultaneous vibrotactile playback with music in mobile devices. Our haptic music player features a new miniature dual-mode actuator that can produce vibrations consisting of two principal frequencies, and a real-time vibration generation algorithm that can extract vibration commands from a music file for dual-band playback (bass and treble). The algorithm uses a "haptic equalizer" and provides plausible sound-to-touch modality conversion based on human perceptual data. In addition, we present a user study carried out to evaluate the subjective performance (precision, harmony, fun, and preference) of the haptic music player, in comparison with the current practice of bass-band-only vibrotactile playback via a single-frequency voice-coil actuator. The evaluation results indicated that the new dual-band playback outperforms the bass-only rendering, and they also provide several insights for further improvements. The developed system and experimental findings have implications for improving the multimedia experience with mobile devices.
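A minimal sketch of the kind of dual-band command extraction described above, assuming FFT-based band energies per audio frame; the band edges, frame size, and envelope measure are assumptions, and the actual player's algorithm and "haptic equalizer" are certainly more refined:

```python
import numpy as np

# For each short frame of the audio signal, estimate energy in a bass
# band and a treble band via the FFT; the two envelopes would drive the
# dual-mode actuator at its two principal frequencies. Band edges and
# frame size below are illustrative choices, not the paper's values.

def dual_band_envelopes(signal, fs, frame=1024,
                        bass=(20.0, 200.0), treble=(2000.0, 8000.0)):
    n_frames = len(signal) // frame
    freqs = np.fft.rfftfreq(frame, d=1.0 / fs)
    bass_mask = (freqs >= bass[0]) & (freqs < bass[1])
    treb_mask = (freqs >= treble[0]) & (freqs < treble[1])
    bass_env, treb_env = [], []
    for i in range(n_frames):
        spec = np.abs(np.fft.rfft(signal[i * frame:(i + 1) * frame]))
        # RMS of the spectral magnitudes inside each band
        bass_env.append(np.sqrt(np.mean(spec[bass_mask] ** 2)))
        treb_env.append(np.sqrt(np.mean(spec[treb_mask] ** 2)))
    return np.array(bass_env), np.array(treb_env)
```

In a real player, each envelope would then be mapped through a perceptual transfer function before commanding the actuator.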

  18. Learning to perceive haptic distance-to-break in the presence of friction.

    PubMed

    Altenhoff, Bliss M; Pagano, Christopher C; Kil, Irfan; Burg, Timothy C

    2017-02-01

    Two experiments employed attunement and calibration training to investigate whether observers are able to identify material break points in compliant materials through haptic force application. The task required participants to attune to a recently identified haptic invariant, distance-to-break (DTB), rather than haptic stimulation not related to the invariant, including friction. In the first experiment participants probed simulated force-displacement relationships (materials) under 3 levels of friction with the aim of pushing as far as possible into the materials without breaking them. In a second experiment a different set of participants pulled on the materials. Results revealed that participants are sensitive to DTB for both pushing and pulling, even in the presence of varying levels of friction, and this sensitivity can be improved through training. The results suggest that the simultaneous presence of friction may assist participants in perceiving DTB. Potential applications include the development of haptic training programs for minimally invasive (laparoscopic) surgery to reduce accidental tissue damage. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Mnemonic neuronal activity in somatosensory cortex.

    PubMed Central

    Zhou, Y D; Fuster, J M

    1996-01-01

    Single-unit activity was recorded from the hand areas of the somatosensory cortex of monkeys trained to perform a haptic delayed matching to sample task with objects of identical dimensions but different surface features. During the memory retention period of the task (delay), many units showed sustained firing frequency change, either excitation or inhibition. In some cases, firing during that period was significantly higher after one sample object than after another. These observations indicate the participation of somatosensory neurons not only in the perception but in the short-term memory of tactile stimuli. Neurons most directly implicated in tactile memory are (i) those with object-selective delay activity, (ii) those with nondifferential delay activity but without activity related to preparation for movement, and (iii) those with delay activity in the haptic-haptic delayed matching task but no such activity in a control visuo-haptic delayed matching task. The results indicate that cells in early stages of cortical somatosensory processing participate in haptic short-term memory. PMID:8927629

  20. Ergonomic evaluation of 3D plane positioning using a mouse and a haptic device.

    PubMed

    Paul, Laurent; Cartiaux, Olivier; Docquier, Pierre-Louis; Banse, Xavier

    2009-12-01

    Preoperative planning and intraoperative assistance are needed to improve accuracy in tumour surgery. To be accepted, these processes must be efficient. An experiment was conducted to compare a mouse and a haptic device, with and without force feedback, to perform plan positioning in a 3D space. Ergonomics and performance factors were investigated during the experiment. Positioning strategies were observed. The task completion time, number of 3D orientations and failure rate were analysed. A questionnaire on ergonomics was filled out by each participant. The haptic device showed a significantly lower failure rate and was quicker and more ergonomic than the mouse. The force feedback was not beneficial to the accomplishment of the task. The haptic device is intuitive, ergonomic and more efficient than the mouse for positioning a 3D plane into a 3D space. Useful observations regarding positioning strategies will improve the integration of haptic devices into medical applications. Copyright (c) 2009 John Wiley & Sons, Ltd.

  1. Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle.

    PubMed

    Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L; Cutkosky, Mark R

    2014-09-01

    This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024).
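A linear least-squares calibration of the kind needed to map FBG strain readings to a 3-D tip force can be sketched as follows; the linear model and array shapes are assumptions for illustration, not the paper's actual calibration procedure:

```python
import numpy as np

# Assumed model: force ≈ C @ shifts, where `shifts` are the FBG
# wavelength-shift readings and C is a calibration matrix fitted from
# samples with known applied tip forces.

def calibrate(shifts, forces):
    # shifts: (n_samples, n_sensors); forces: (n_samples, 3)
    X, *_ = np.linalg.lstsq(shifts, forces, rcond=None)
    return X.T  # calibration matrix C, shape (3, n_sensors)

def tip_force(C, shift):
    # Estimate the 3-D tip force for one reading vector.
    return C @ shift
```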

  2. Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle

    PubMed Central

    Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L.; Cutkosky, Mark R.

    2015-01-01

    This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024). PMID:26509101

  3. The role of haptic versus visual volume cues in the size-weight illusion.

    PubMed

    Ellis, R R; Lederman, S J

    1993-03-01

    Three experiments establish the size-weight illusion as a primarily haptic phenomenon, despite its having been more traditionally considered an example of vision influencing haptic processing. Experiment 1 documents, across a broad range of stimulus weights and volumes, the existence of a purely haptic size-weight illusion, equal in strength to the traditional illusion. Experiment 2 demonstrates that haptic volume cues are both sufficient and necessary for a full-strength illusion. In contrast, visual volume cues are merely sufficient, and produce a relatively weaker effect. Experiment 3 establishes that congenitally blind subjects experience an effect as powerful as that of blindfolded sighted observers, thus demonstrating that visual imagery is also unnecessary for a robust size-weight illusion. The results are discussed in terms of their implications for both sensory and cognitive theories of the size-weight illusion. Applications of this work to a human factors design and to sensor-based systems for robotic manipulation are also briefly considered.

  4. Visual and visually mediated haptic illusions with Titchener's ⊥.

    PubMed

    Landwehr, Klaus

    2014-05-01

    For a replication and expansion of a previous experiment of mine, 14 newly recruited participants provided haptic and verbal estimates of the lengths of the two lines that make up Titchener's ⊥. The stimulus was presented at two different orientations (frontoparallel vs. horizontal) and rotated in steps of 45 deg around 2π. Haptically, the divided line of the ⊥ was generally underestimated, especially at a horizontal orientation. Verbal judgments also differed according to presentation condition and to which line was the target, with the overestimation of the undivided line ranging between 6.2 % and 15.3 %. The results are discussed with reference to the two-visual-systems theory of perception and action, neuroscientific accounts, and also recent historical developments (the use of handheld touchscreens, in particular), because the previously reported "haptic induction effect" (the scaling of haptic responses to the divided line of the ⊥, depending on the length of the undivided one) did not replicate.

  5. Command Recognition of Robot with Low Dimension Whole-Body Haptic Sensor

    NASA Astrophysics Data System (ADS)

    Ito, Tatsuya; Tsuji, Toshiaki

    The authors have developed “haptic armor”, a whole-body haptic sensor that has an ability to estimate contact position. Although it is developed for safety assurance of robots in human environment, it can also be used as an interface. This paper proposes a command recognition method based on finger trace information. This paper also discusses some technical issues for improving recognition accuracy of this system.

  6. Persistent Neuronal Firing in Primary Somatosensory Cortex in the Absence of Working Memory of Trial-Specific Features of the Sample Stimuli in a Haptic Working Memory Task

    ERIC Educational Resources Information Center

    Wang, Liping; Li, Xianchun; Hsiao, Steven S.; Bodner, Mark; Lenz, Fred; Zhou, Yong-Di

    2012-01-01

    Previous studies suggested that primary somatosensory (SI) neurons in well-trained monkeys participated in the haptic-haptic unimodal delayed matching-to-sample (DMS) task. In this study, 585 SI neurons were recorded in monkeys performing a task that was identical to that in the previous studies but without requiring discrimination and active…

  7. Exploring laterality and memory effects in the haptic discrimination of verbal and non-verbal shapes.

    PubMed

    Stoycheva, Polina; Tiippana, Kaisa

    2018-03-14

    The brain's left hemisphere often displays advantages in processing verbal information, while the right hemisphere favours processing non-verbal information. In the haptic domain, owing to contralateral innervation, this functional lateralization is reflected in a hand advantage for certain functions. Findings regarding the hand-hemisphere advantage for haptic information remain contradictory, however. This study addressed these laterality effects and their interaction with memory retention times in the haptic modality. Participants performed haptic discrimination of letters, geometric shapes and nonsense shapes at memory retention times of 5, 15 and 30 s with the left and right hand separately, and we measured the discriminability index d'. The d' values were significantly higher for letters and geometric shapes than for nonsense shapes. This might result from dual coding (naming + spatial) and/or from low stimulus complexity. There was no stimulus-specific laterality effect. However, we found a time-dependent laterality effect: the performance of the left hand-right hemisphere was sustained up to 15 s, while the performance of the right hand-left hemisphere decreased progressively throughout all retention times. This suggests that haptic memory traces are more robust to decay when they are processed by the left hand-right hemisphere.

  8. Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation.

    PubMed

    Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka

    2016-01-01

    Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptic) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15 min time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that haptic simulation results were not significantly different from actual postoperative outcomes. In contrast, CAD results significantly differed from both the haptic simulation and actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.

  9. Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device

    NASA Astrophysics Data System (ADS)

    Färber, Matthias; Heller, Julika; Handels, Heinz

    2007-03-01

    The lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medicaments or to extract liquor. The training of this procedure is usually done on the patient, guided by experienced supervisors. A virtual-reality lumbar puncture simulator has been developed in order to minimize training costs and the patient's risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume-rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures like skin, bone, muscles or fat, together with the original CT data, which contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region, and 2D visualizations of orthogonal slices give a detailed impression of the anatomical context. The input data, consisting of CT and label data and surface models of relevant structures, are defined in an XML file together with haptic-rendering and visualization parameters. In a first evaluation, the Visible Human male dataset was used to generate a virtual training body. Several users with different levels of medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume-rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement, together with the rotation constraints enabled by the 6DOF device, facilitates a realistic puncture simulation.

  10. The control data "GIRAFFE" system for interactive graphic finite element analysis

    NASA Technical Reports Server (NTRS)

    Park, S.; Brandon, D. M., Jr.

    1975-01-01

    The Graphical Interface for Finite Elements (GIRAFFE) general purpose interactive graphics application package was described. This system may be used as a pre/post processor for structural analysis computer programs. It facilitates the operations of creating, editing, or reviewing all the structural input/output data on a graphics terminal in a time-sharing mode of operation. An application program for a simple three-dimensional plate problem was illustrated.

  11. Force modeling for incision surgery into tissue with haptic application

    NASA Astrophysics Data System (ADS)

    Kim, Pyunghwa; Kim, Soomin; Choi, Seung-Hyun; Oh, Jong-Seok; Choi, Seung-Bok

    2015-04-01

    This paper presents a novel force model for incision surgery into tissue and its haptic application for the surgeon. During robot-assisted incision surgery, a haptic system that conveys a sense of touch in the surgical area is urgently needed because the surgeon cannot directly feel the tissue. To achieve this goal, a model of the reaction force of biological tissue is proposed from an energy perspective. The model describes the reaction force arising from the elastic behavior of tissue during incision. Furthermore, the force computed from the model is rendered to the surgeon by a haptic device using magnetorheological fluid (MRF). The performance of the rendered force, regulated by a PID controller with open-loop control, is evaluated.
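The energy-perspective idea, that the reaction force follows from differentiating a stored-energy function of tissue displacement, can be illustrated with a toy elastic model (the quadratic-plus-quartic energy and its coefficients are invented, not the paper's tissue model):

```python
# Toy stored-energy model of tissue indentation: U(x) = ½kx² + ¼ax⁴,
# giving a stiffening reaction force F(x) = dU/dx = kx + ax³.
# k, a, and the functional form are assumptions for illustration only.

def elastic_energy(x, k=0.8, a=2.5):
    # x: tool displacement into the tissue (scaled units)
    return 0.5 * k * x ** 2 + 0.25 * a * x ** 4

def reaction_force(x, k=0.8, a=2.5):
    # F = dU/dx: the force the haptic device would render
    return k * x + a * x ** 3
```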

  12. A Model for Steering with Haptic-Force Guidance

    NASA Astrophysics Data System (ADS)

    Yang, Xing-Dong; Irani, Pourang; Boulanger, Pierre; Bischof, Walter F.

    Trajectory-based tasks are common in many applications and have been widely studied. Recently, researchers have shown that even very simple tasks, such as selecting items from cascading menus, can benefit from haptic-force guidance. Haptic guidance is also of significant value in many applications such as medical training, handwriting learning, and applications requiring precise manipulations. There are, however, only a few guiding principles for selecting parameters that are best suited for proper force guiding. In this paper, we present a model, derived from the steering law, that relates movement time to the essential components of a tunneling task in the presence of haptic-force guidance. Results of an experiment show that our model is highly accurate for predicting performance times in force-enhanced tunneling tasks.
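The classical steering-law form that such a model extends, MT = a + b ∫ ds / W(s), can be evaluated numerically for a tunnel of varying width; the coefficients below are placeholders that would be fitted from user data (e.g. with and without force guidance), not values from the paper:

```python
# Steering law: movement time grows with the integral of 1/width along
# the path. `widths` holds the tunnel width sampled every `ds` units.

def steering_index_of_difficulty(widths, ds):
    # Riemann-sum approximation of ∫ ds / W(s)
    return sum(ds / w for w in widths)

def movement_time(widths, ds, a=0.1, b=0.2):
    # a, b: empirically fitted regression coefficients (placeholders here)
    return a + b * steering_index_of_difficulty(widths, ds)
```

A guidance-aware model would make a and b (or the width term) depend on the force-guidance parameters.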

  13. Adolescents and "Autographics": Reading and Writing Coming-of-Age Graphic Novels

    ERIC Educational Resources Information Center

    Hughes, Janette Michelle; King, Alyson; Perkins, Peggy; Fuke, Victor

    2011-01-01

    Students at two different sites (a 12th-grade English class focused on workplace preparation and an alternative program for students who had been expelled from school) read graphic novels and, using ComicLife software, created their own graphic sequences called "autographics" based on their personal experiences. The authors explore how…

  14. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    ERIC Educational Resources Information Center

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  15. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on the fly for AWIPS, using two examples of AWIPS applications created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas, and the 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, an index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file.
The dBASE table contains records for each attribute. Attributes are commonly used to label spatial features. Shapefiles can be viewed, but not created in AWIPS. As a result, either third-party software can be installed on an AWIPS workstation, or new software must be written to create shapefiles in the correct format.
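    The fixed layout of the shapefile main-file header described above (ESRI 1998) can be read with a few lines of code. This is a minimal sketch, not the AMU tools' actual implementation; note the format's mix of big-endian and little-endian fields:

```python
import struct

def parse_shp_header(data: bytes) -> dict:
    """Parse the 100-byte main-file header of a shapefile (ESRI 1998)."""
    file_code, = struct.unpack(">i", data[0:4])         # big-endian, always 9994
    file_len_words, = struct.unpack(">i", data[24:28])  # length in 16-bit words
    version, shape_type = struct.unpack("<ii", data[28:36])  # little-endian
    xmin, ymin, xmax, ymax = struct.unpack("<4d", data[36:68])
    return {"file_code": file_code,
            "file_length_bytes": file_len_words * 2,
            "version": version, "shape_type": shape_type,
            "bbox": (xmin, ymin, xmax, ymax)}

# Build a minimal header for a point-type (shape type 1) shapefile:
# 7 big-endian ints (file code, 5 unused, length = 50 words = 100 bytes),
# then version and shape type, then the 8 bounding-box doubles.
header = struct.pack(">7i", 9994, 0, 0, 0, 0, 0, 50)
header += struct.pack("<ii", 1000, 1)
header += struct.pack("<8d", 0.0, 0.0, 10.0, 10.0, 0.0, 0.0, 0.0, 0.0)
```

Record contents (the per-feature vertex lists) follow the header and use the same big-endian record headers with little-endian geometry.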

  16. The force pyramid: a spatial analysis of force application during virtual reality brain tumor resection.

    PubMed

    Azarnoush, Hamed; Siar, Samaneh; Sawaya, Robin; Zhrani, Gmaan Al; Winkler-Schwartz, Alexander; Alotaibi, Fahad Eid; Bugdadi, Abdulgadir; Bajunaid, Khalid; Marwa, Ibrahim; Sabbagh, Abdulrahman Jafar; Del Maestro, Rolando F

    2017-07-01

    OBJECTIVE Virtual reality simulators allow development of novel methods to analyze neurosurgical performance. The concept of a force pyramid is introduced as a Tier 3 metric with the ability to provide visual and spatial analysis of 3D force application by any instrument used during simulated tumor resection. This study was designed to answer 3 questions: 1) Do study groups have distinct force pyramids? 2) Do handedness and ergonomics influence force pyramid structure? 3) Are force pyramids dependent on the visual and haptic characteristics of simulated tumors? METHODS Using a virtual reality simulator, NeuroVR (formerly NeuroTouch), ultrasonic aspirator force application was continually assessed during resection of simulated brain tumors by neurosurgeons, residents, and medical students. The participants performed simulated resections of 18 simulated brain tumors with different visual and haptic characteristics. The raw data, namely, coordinates of the instrument tip as well as contact force values, were collected by the simulator. To provide a visual and qualitative spatial analysis of forces, the authors created a graph, called a force pyramid, representing force sum along the z-coordinate for different xy coordinates of the tool tip. RESULTS Sixteen neurosurgeons, 15 residents, and 84 medical students participated in the study. Neurosurgeon, resident and medical student groups displayed easily distinguishable 3D "force pyramid fingerprints." Neurosurgeons had the lowest force pyramids, indicating application of the lowest forces, followed by resident and medical student groups. Handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct well-defined 3D force pyramid patterns. CONCLUSIONS Force pyramid fingerprints provide 3D spatial assessment displays of instrument force application during simulated tumor resection. 
Neurosurgeon force utilization and ergonomic data form a basis for understanding and modulating resident force application and improving patient safety during tumor resection.
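    The force-pyramid construction described above (summing force along the z-coordinate for each xy position of the tool tip) can be sketched as a weighted 2-D histogram. The bin count and workspace extent below are illustrative assumptions, not the simulator's parameters:

```python
import numpy as np

def force_pyramid(xy: np.ndarray, force: np.ndarray,
                  bins: int = 16, extent: float = 1.0) -> np.ndarray:
    """Sum contact-force magnitudes into an xy grid.

    xy    -- (n, 2) array of tool-tip x, y coordinates
    force -- (n,) array of contact-force magnitudes at those samples
    Returns a (bins, bins) array; rendered as a surface it forms the
    'force pyramid' (lower pyramids indicate lower applied forces).
    """
    grid, _, _ = np.histogram2d(
        xy[:, 0], xy[:, 1], bins=bins,
        range=[[-extent, extent], [-extent, extent]],
        weights=force)
    return grid
```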

  17. Visual-perceptual mismatch in robotic surgery.

    PubMed

    Abiri, Ahmad; Tao, Anna; LaRocca, Meg; Guan, Xingmin; Askari, Syed J; Bisley, James W; Dutson, Erik P; Grundfest, Warren S

    2017-08-01

    The principal objective of the experiment was to analyze the effects of the clutch operation of robotic surgical systems on the performance of the operator. The relative coordinate system introduced by the clutch operation can create a visual-perceptual mismatch, which can potentially have a negative impact on a surgeon's performance. We also assess the impact of introducing additional tactile sensory information on reducing the effect of visual-perceptual mismatch on the performance of the operator. We asked 45 novice subjects to complete peg transfers using the da Vinci IS 1200 system with grasper-mounted, normal force sensors. The task involves picking up a peg with one of the robotic arms, passing it to the other arm, and then placing it on the opposite side of the view. Subjects were divided into three groups: an aligned group (no mismatch), a misaligned group (10 cm z-axis mismatch), and a haptics-misaligned group (haptic feedback and z-axis mismatch). Each subject performed the task five times, during which the grip force, time of completion, and number of faults were recorded. Compared to the subjects that performed the tasks using a properly aligned controller/arm configuration, subjects with a single-axis misalignment showed significantly more peg drops (p = 0.011) and longer times to completion (p < 0.001). Additionally, it was observed that the addition of tactile feedback helps reduce the negative effects of visual-perceptual mismatch in some cases. Grip force data recorded from the grasper-mounted sensors showed no difference between the groups. The visual-perceptual mismatch created by the misalignment of the robotic controls relative to the robotic arms has a negative impact on the operator of a robotic surgical system. The introduction of other sensory information and haptic feedback systems can potentially help reduce this effect.

  18. Haptic over visual information in the distribution of visual attention after tool-use in near and far space.

    PubMed

    Park, George D; Reed, Catherine L

    2015-10-01

    Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both open- and hidden-tool groups showed a tool-end, attentional bias (faster RTs toward tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. Lack of short-tool group results suggested that hidden-tool group results were specific to haptic inputs. In conclusion, (1) allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when visually obscured, suggesting haptic input provides sufficient information for directing attention along the tool.

  19. Haptic spatial matching in near peripersonal space.

    PubMed

    Kaas, Amanda L; Mier, Hanneke I van

    2006-04-01

    Research has shown that haptic spatial matching at intermanual distances over 60 cm is prone to large systematic errors. The error pattern has been explained by the use of reference frames intermediate between egocentric and allocentric coding. This study investigated haptic performance in near peripersonal space, i.e. at intermanual distances of 60 cm and less. Twelve blindfolded participants (six males and six females) were presented with two turn bars at equal distances from the midsagittal plane, 30 or 60 cm apart. Different orientations (vertical/horizontal or oblique) of the left bar had to be matched by adjusting the right bar to either a mirror symmetric (/ \) or parallel (/ /) position. The mirror symmetry task can in principle be performed accurately in both an egocentric and an allocentric reference frame, whereas the parallel task requires an allocentric representation. Results showed that parallel matching induced large systematic errors which increased with distance. Overall error was significantly smaller in the mirror task. The task difference also held for the vertical orientation at 60 cm distance, even though this orientation required the same response in both tasks, showing a marked effect of task instruction. In addition, men outperformed women on the parallel task. Finally, contrary to our expectations, systematic errors were found in the mirror task, predominantly at 30 cm distance. Based on these findings, we suggest that haptic performance in near peripersonal space might be dominated by different mechanisms than those which come into play at distances over 60 cm. Moreover, our results indicate that both inter-individual differences and task demands affect task performance in haptic spatial matching. Therefore, we conclude that the study of haptic spatial matching in near peripersonal space might reveal important additional constraints for the specification of adequate models of haptic spatial performance.

  20. Mixed reality temporal bone surgical dissector: mechanical design.

    PubMed

    Hochman, Jordan Brent; Sepehri, Nariman; Rampersad, Vivek; Kraut, Jay; Khazraee, Milad; Pisa, Justyn; Unger, Bertram

    2014-08-08

    Development of a novel mixed reality (MR) simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces, and while 3D-printed models convincingly represent the vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model in which the effective elements of both simulations are combined: haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill, which further necessitates gravity cancellation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill, and the virtual contact forces need to be repositioned from the mid-wand to the drill tip. Previous publications detail generation of both the requisite printed and haptic simulations. Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting to hold the otic drill was developed, and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free space. Within the free space, a linear virtual force model is applied to simulate drill contact with soft tissue. Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill's passage between real and virtual components of the model. No issues with registration at model boundaries were encountered. These tests provide a proof of concept for the initial stages in the development of a novel mixed reality temporal bone simulator.

  1. Absence of modulatory action on haptic height perception with musical pitch

    PubMed Central

    Geronazzo, Michele; Avanzini, Federico; Grassi, Massimo

    2015-01-01

    Although acoustic frequency is not a spatial property of physical objects, in common language pitch, i.e., the psychological correlate of frequency, is often labeled spatially (i.e., “high in pitch” or “low in pitch”). Pitch-height is known to modulate (and interact with) the response of participants when they are asked to judge spatial properties of non-auditory stimuli (e.g., visual) in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch-height extends to the haptic estimation of the height of a virtual step. We implemented a hardware/software setup that renders virtual 3D objects (stair-steps) haptically through a PHANTOM device and provides real-time continuous auditory feedback depending on the user's interaction with the object. The haptic exploration was associated with a sinusoidal tone whose pitch varied as a function of the interaction point's height within (i) a narrower or (ii) a wider pitch range, or (iii) with a random pitch variation acting as a control audio condition. Explorations were also performed with no sound (haptic only). Participants were instructed to explore the virtual step freely and to communicate their height estimate by opening their thumb and index finger to mimic the step riser height, or verbally by reporting the height of the step riser in centimeters. We analyzed the role of musical expertise by dividing participants into non-musicians and musicians. Results showed no effect of musical pitch on the highly realistic haptic feedback. Overall there was no difference between the two groups in the proposed multimodal conditions. Additionally, we observed a different haptic response distribution between musicians and non-musicians when estimates in the auditory conditions were matched with estimates in the no-sound condition. PMID:26441745
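    The continuous auditory feedback described above can be sketched as a mapping from the interaction point's height to the frequency of the sinusoidal tone. The step height, frequency range, and linear mapping below are illustrative assumptions, not the study's actual parameters:

```python
def height_to_pitch(height_m: float, step_height_m: float = 0.04,
                    f_low_hz: float = 200.0, f_high_hz: float = 600.0) -> float:
    """Linearly map the interaction point's height to a sine-tone frequency.

    Widening or narrowing (f_low_hz, f_high_hz) corresponds to the
    'wider' and 'narrower' pitch-range conditions; a control condition
    would instead draw the frequency at random.
    """
    t = min(max(height_m / step_height_m, 0.0), 1.0)  # clamp to [0, 1]
    return f_low_hz + t * (f_high_hz - f_low_hz)
```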

  2. In vivo biomechanical measurement and haptic simulation of portal placement procedure in shoulder arthroscopic surgery

    PubMed Central

    Chae, Sanghoon; Jung, Sung-Weon

    2018-01-01

    A survey of 67 experienced orthopedic surgeons indicated that precise portal placement is the most important skill in arthroscopic surgery. However, none of the currently available virtual reality simulators includes simulation/training in portal placement, including haptic feedback of the necessary puncture force. This study aimed to: (1) measure the in vivo force and stiffness during a portal placement procedure in an actual operating room and (2) implement active haptic simulation of a portal placement procedure using the measured in vivo data. We measured the force required for portal placement and the stiffness of the joint capsule during portal placement procedures performed by an experienced arthroscopic surgeon. Based on the acquired mechanical property values, we developed a cable-driven active haptic simulator designed to train the portal placement skill and evaluated the validity of the simulated haptics. Ten patients diagnosed with rotator cuff tears were enrolled in this experiment. The maximum peak force and joint capsule stiffness during posterior portal placement procedures were 66.46 ± 10.76 N and 2560.82 ± 252.92 N/m, respectively. We then designed an active haptic simulator using the acquired data. Our cable-driven mechanism had a friction force of 3.763 ± 0.341 N, less than 6% of the mean puncture force. Simulator performance was evaluated by comparing the target stiffness and force with the stiffness and force reproduced by the device. R-squared values were 0.998 for puncture force replication and 0.902 for stiffness replication, indicating that the in vivo data can be used to implement a realistic haptic simulator. PMID:29494691
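    The replication quality reported above, R-squared agreement between the target profile and the profile the device actually renders, is the standard coefficient of determination; a minimal sketch:

```python
import numpy as np

def r_squared(target, reproduced) -> float:
    """Coefficient of determination between a target force/stiffness
    profile and the profile reproduced by the haptic device
    (1.0 = perfect replication)."""
    target = np.asarray(target, dtype=float)
    reproduced = np.asarray(reproduced, dtype=float)
    ss_res = np.sum((target - reproduced) ** 2)
    ss_tot = np.sum((target - target.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```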

  3. Subthalamic nucleus deep brain stimulation improves somatosensory function in Parkinson's disease.

    PubMed

    Aman, Joshua E; Abosch, Aviva; Bebler, Maggie; Lu, Chia-Hao; Konczak, Jürgen

    2014-02-01

    An established treatment for the motor symptoms of Parkinson's disease (PD) is deep brain stimulation (DBS) of the subthalamic nucleus (STN). Mounting evidence suggests that PD is also associated with somatosensory deficits, yet the effect of STN-DBS on somatosensory processing is largely unknown. This study investigated whether STN-DBS affects somatosensory processing, specifically the processing of tactile and proprioceptive cues, by systematically examining the accuracy of haptic perception of object size. (Haptic perception refers to one's ability to extract object features such as shape and size by active touch.) Without vision, 13 PD patients with implanted STN-DBS and 13 healthy controls haptically explored the heights of 2 successively presented 3-dimensional (3D) blocks using a precision grip. Participants verbally indicated which block was taller and then used their nonprobing hand to motorically match the perceived size of the comparison block. Patients were tested during ON and OFF stimulation, following a 12-hour medication washout period. First, when compared to controls, the PD group's haptic discrimination threshold during OFF stimulation was elevated by 192% and mean hand aperture error was increased by 105%. Second, DBS lowered the haptic discrimination threshold by 26% and aperture error decreased by 20%. Third, during DBS ON, probing with the motorically more affected hand decreased haptic precision compared to probing with the less affected hand. This study offers the first evidence that STN-DBS improves haptic precision, further indicating that somatosensory function is improved by STN-DBS. We conclude that DBS-related improvements are not explained by improvements in motor function alone, but rather by enhanced somatosensory processing. © 2013 Movement Disorder Society.

  4. Sensorimotor Interactions in the Haptic Perception of Virtual Objects

    DTIC Science & Technology

    1997-01-01

    Compared to our understanding of vision and audition, our knowledge of human haptic perception is very limited. Some preliminary work has been done on the influence of modalities such as vision and audition on the haptic perception of viscosity or mass.

  5. Towards open-source, low-cost haptics for surgery simulation.

    PubMed

    Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie

    2014-01-01

    In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising education tool. However, the adoption of these systems in research and clinical settings is still limited by the high costs of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We demonstrate the basic mechanical design of the device, the sensor setup as well as its software integration.

  6. Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms

    DTIC Science & Technology

    1998-01-01

    Covers the MAGIC Toolkit (load/save options, information display, library files, evaluation) and visual-haptic interaction paradigms, including the problem that a user can touch one side of a thin object and be propelled out the opposite side [Colgate, 1994, Northwestern University], and interactions where there is a high correlation in motion and force between the visual and haptic realms, concluding with an evaluation of the application.

  7. Experimental evaluation of a miniature MR device for a wide range of human perceivable haptic sensations

    NASA Astrophysics Data System (ADS)

    Yang, Tae-Heon; Koo, Jeong-Hoi

    2017-12-01

    Humans experience realistic and vivid haptic sensations through the sense of touch. In order to have a fully immersive haptic experience, both kinaesthetic and vibrotactile information must be presented to human users. Currently, little haptic research has been performed on small haptic actuators that can convey both vibrotactile feedback at vibration frequencies up to the human-perceivable limit and multiple levels of kinaesthetic feedback rapidly. Therefore, this study intends to design a miniature haptic device based on MR fluid and experimentally evaluate its ability to convey vibrotactile feedback up to 300 Hz along with kinaesthetic feedback. After constructing a prototype device, a series of tests was performed to evaluate its performance using an experimental setup consisting of a precision dynamic mechanical analyzer and an accelerometer. The kinaesthetic testing results show that the prototype device can provide a force rate of up to 89% at 5 V (360 mA), which can be discretized into multiple levels of ‘just noticeable difference’ force rate, indicating that the device can convey a wide range of kinaesthetic sensations. To evaluate the high-frequency vibrotactile feedback performance of the device, its acceleration responses were measured and processed using FFT analysis. The results indicate that the device can convey high-frequency vibrotactile sensations up to 300 Hz with accelerations of sufficiently large intensity for humans to feel.
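    The FFT analysis of the accelerometer responses can be sketched as follows; the sampling rate and test signal are illustrative, not the study's measurements:

```python
import numpy as np

def dominant_frequency(signal: np.ndarray, fs: float) -> float:
    """Return the frequency (Hz) of the largest spectral peak, e.g., to
    verify that an actuator's vibrotactile output reaches the commanded
    frequency."""
    signal = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))  # remove DC offset
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return float(freqs[np.argmax(spectrum)])
```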

  8. Vision-Based Haptic Feedback for Remote Micromanipulation in-SEM Environment

    NASA Astrophysics Data System (ADS)

    Bolopion, Aude; Dahmen, Christian; Stolle, Christian; Haliyo, Sinan; Régnier, Stéphane; Fatikow, Sergej

    2012-07-01

    This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable nonexpert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive the object positions from scanning electron microscope images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specialized equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation involving melamine microspheres with a diameter of less than 2 μm between Paris, France and Oldenburg, Germany.

  9. HeatmapGenerator: high performance RNAseq and microarray visualization software suite to examine differential gene expression levels using an R and C++ hybrid computational pipeline.

    PubMed

    Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes

    2014-01-01

    The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. The software for creating publication-quality heatmaps is developed with the R programming language, the C++ programming language, and the OpenGL application programming interface (API) to create industry-grade, high-performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative that allows researchers with minimal prior coding experience to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/.
The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
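    A common preprocessing step behind such heatmaps is per-gene (row) z-scoring, so that color encodes relative expression across samples rather than absolute magnitude. This is an illustrative sketch in Python/NumPy, not HeatmapGenerator's actual R/C++ code:

```python
import numpy as np

def row_zscore(expr: np.ndarray) -> np.ndarray:
    """Z-score each row (gene) of an expression matrix (genes x samples)
    before mapping the values to heatmap colors, so every gene uses the
    full color range regardless of its absolute expression level."""
    mu = expr.mean(axis=1, keepdims=True)
    sd = expr.std(axis=1, keepdims=True)
    return (expr - mu) / sd
```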

  10. Creating objects and object categories for studying perception and perceptual learning.

    PubMed

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-11-02

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. 
Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.

  11. The mere exposure effect in the domain of haptics.

    PubMed

    Jakesch, Martina; Carbon, Claus-Christian

    2012-01-01

    Zajonc showed that the attitude towards stimuli to which one has previously been exposed is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only, and haptics and vision). Effects of exposure frequency were found for highly complex stimuli, with liking increasing significantly from F0 to F2 and F10, but only for the stone category. Analysis of "Need for Touch" data showed the MEE in participants with a high need for touch, which suggests different sensitivity or saturation levels of the MEE. These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis.

  12. Aging and the haptic perception of 3D surface shape.

    PubMed

    Norman, J Farley; Kappers, Astrid M L; Beers, Amanda M; Scott, A Kate; Norman, Hideko F; Koenderink, Jan J

    2011-04-01

    Two experiments evaluated the ability of older and younger adults to perceive the three-dimensional (3D) shape of object surfaces from active touch (haptics). The ages of the older adults ranged from 64 to 84 years, while those of the younger adults ranged from 18 to 27 years. In Experiment 1, the participants haptically judged the shape of large (20 cm diameter) surfaces with an entire hand. In contrast, in Experiment 2, the participants explored the shape of small (5 cm diameter) surfaces with a single finger. The haptic surfaces varied in shape index (Koenderink, Solid shape, 1990; Koenderink, Image and Vision Computing, 10, 557-564, 1992) from -1.0 to +1.0 in steps of 0.25. For both types of surfaces (large and small), the participants were able to judge surface shape reliably. The older participants' judgments of surface shape were just as accurate and precise as those of the younger participants. The results of the current study demonstrate that while older adults do possess reductions in tactile sensitivity and acuity, they nevertheless can effectively perceive 3D surface shape from haptic exploration.

  13. Haptic-STM: a human-in-the-loop interface to a scanning tunneling microscope.

    PubMed

    Perdigão, Luís M A; Saywell, Alex

    2011-07-01

    The operation of a haptic device interfaced with a scanning tunneling microscope (STM) is presented here. The user moves the STM tip in three dimensions by means of a stylus attached to the haptic instrument. The tunneling current measured by the STM is converted to a vertical force, applied to the stylus and felt by the user, with the user being incorporated into the feedback loop that controls the tip-surface distance. A haptic-STM interface of this nature allows the user to feel atomic features on the surface and facilitates the tactile manipulation of the adsorbate/substrate system. The operation of this device is demonstrated via the room temperature STM imaging of C(60) molecules adsorbed on an Au(111) surface in ultra-high vacuum.
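
    As a rough sketch of the kind of current-to-force mapping such an interface requires (this is not the authors' implementation; the exponential current model is standard STM physics, but every constant and gain here is an illustrative assumption):

```python
import math

# Illustrative sketch: converting an STM tunneling current into a vertical
# haptic force. Tunneling current falls off roughly exponentially with
# tip-surface distance (I = I0 * exp(-2*kappa*z)), so a log-scaled error
# relative to a current setpoint gives a force that varies smoothly with
# distance. All constants below are assumed for illustration.

I_SETPOINT = 1.0e-9   # target tunneling current in A (assumed)
F_GAIN = 0.5          # N of force per decade of current error (assumed)
F_MAX = 3.0           # clamp at the haptic device's force limit in N (assumed)

def current_to_force(i_measured: float) -> float:
    """Map a measured tunneling current to a vertical stylus force.

    Current above the setpoint (tip too close) yields a positive, repulsive
    force on the stylus; current below it yields a negative, attractive one.
    """
    # Log ratio of measured to setpoint current, in decades; the floor on
    # i_measured avoids log of zero when the tip is out of tunneling range.
    error_decades = math.log10(max(i_measured, 1.0e-15) / I_SETPOINT)
    force = F_GAIN * error_decades
    return max(-F_MAX, min(F_MAX, force))
```

    The clamp matters in practice: near contact the current (and hence the unclamped force) grows by orders of magnitude, which would otherwise saturate or damage a desktop haptic device.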

  14. Haptics-based immersive telerobotic system for improvised explosive device disposal: Are two hands better than one?

    NASA Astrophysics Data System (ADS)

    Erickson, David; Lacheray, Hervé; Lambert, Jason Michel; Mantegh, Iraj; Crymble, Derry; Daly, John; Zhao, Yan

    2012-06-01

    State-of-the-art explosive ordnance disposal robots have not, in general, adopted recent advances in control technology and man-machine interfaces, and lag many years behind academia. This paper describes the Haptics-based Immersive Telerobotic System project, which is investigating an immersive telepresence environment incorporating advanced vehicle control systems, augmented immersive sensory feedback, dynamic 3D visual information, and haptic feedback for explosive ordnance disposal operators. The project aim is to provide operators with a more sophisticated interface and expanded sensory input so they can perform the complex tasks needed to defeat improvised explosive devices successfully. The introduction of haptics and immersive telepresence has the potential to shift the way telepresence systems work for explosive ordnance disposal tasks, and more widely for first-responder scenarios involving remote unmanned ground vehicles.

  15. Using Every Word and Image: Framing Graphic Novel Instruction in the Expanded Four Resources Model

    ERIC Educational Resources Information Center

    Meyer, Carla K.; Jiménez, Laura M.

    2017-01-01

    In many classrooms, teachers have started to incorporate graphic novels in classroom instruction. However, research has suggested that some readers may have limited understanding of how to read graphic novels, which can create challenges for teachers using the medium. Drawing from a larger study, this article highlights two cases, an expert…

  16. Preventing Death by PowerPoint[R]: Tips for Effective Presentations that Inform and Engage

    ERIC Educational Resources Information Center

    Donohue, Chip

    2009-01-01

    PowerPoint[R] and other familiar presentation graphics programs like Apple[R] Keynote, Corel[R] Presentations[TM], Harvard Graphics[R] Pro Presentations, Lotus[R] Freelance Graphics, and OpenOffice Impress can help one become a more effective presenter. These programs are designed to organize words and images to create slides, speaker notes, and…

  17. Recruitment of Foveal Retinotopic Cortex During Haptic Exploration of Shapes and Actions in the Dark.

    PubMed

    Monaco, Simona; Gallivan, Jason P; Figley, Teresa D; Singhal, Anthony; Culham, Jody C

    2017-11-29

    The role of the early visual cortex and higher-order occipitotemporal cortex has been studied extensively for visual recognition and to a lesser degree for haptic recognition and visually guided actions. Using a slow event-related fMRI experiment, we investigated whether tactile and visual exploration of objects recruit the same "visual" areas (and in the case of visual cortex, the same retinotopic zones) and if these areas show reactivation during delayed actions in the dark toward haptically explored objects (and if so, whether this reactivation might be due to imagery). We examined activation during visual or haptic exploration of objects and action execution (grasping or reaching) separated by an 18 s delay. Twenty-nine human volunteers (13 females) participated in this study. Participants had their eyes open and fixated on a point in the dark. The objects were placed below the fixation point and accordingly visual exploration activated the cuneus, which processes retinotopic locations in the lower visual field. Strikingly, the occipital pole (OP), representing foveal locations, showed higher activation for tactile than visual exploration, although the stimulus was unseen and location in the visual field was peripheral. Moreover, the lateral occipital tactile-visual area (LOtv) showed comparable activation for tactile and visual exploration. Psychophysiological interaction analysis indicated that the OP showed stronger functional connectivity with anterior intraparietal sulcus and LOtv during the haptic than visual exploration of shapes in the dark. After the delay, the cuneus, OP, and LOtv showed reactivation that was independent of the sensory modality used to explore the object. These results show that haptic actions not only activate "visual" areas during object touch, but also that this information appears to be used in guiding grasping actions toward targets after a delay. 
SIGNIFICANCE STATEMENT Visual presentation of an object activates shape-processing areas and retinotopic locations in early visual areas. Moreover, if the object is grasped in the dark after a delay, these areas show "reactivation." Here, we show that these areas are also activated and reactivated for haptic object exploration and haptically guided grasping. Touch-related activity occurs not only in the retinotopic location of the visual stimulus, but also at the occipital pole (OP), corresponding to the foveal representation, even though the stimulus was unseen and located peripherally. That is, the same "visual" regions are implicated in both visual and haptic exploration; however, touch also recruits high-acuity central representation within early visual areas during both haptic exploration of objects and subsequent actions toward them. Functional connectivity analysis shows that the OP is more strongly connected with ventral and dorsal stream areas when participants explore an object in the dark than when they view it. Copyright © 2017 the authors 0270-6474/17/3711572-20$15.00/0.

  18. Integration of serious games and wearable haptic interfaces for Neuro Rehabilitation of children with movement disorders: A feasibility study.

    PubMed

    Bortone, Ilaria; Leonardis, Daniele; Solazzi, Massimiliano; Procopio, Caterina; Crecchi, Alessandra; Bonfiglio, Luca; Frisoli, Antonio

    2017-07-01

    The past decade has seen the emergence of rehabilitation treatments using virtual reality environments. One of the advantages in using this technology is the potential to create positive motivation, by means of engaging environments and tasks shaped in the form of serious games. In this work, we propose a novel Neuro Rehabilitation System for children with movement disorders, that is based on serious games in immersive virtual reality with haptic feedback. The system design aims to enhance involvement and engagement of patients, to provide congruent multi-sensory afferent feedback during motor exercises, and to benefit from the flexibility of virtual reality in adapting exercises to the patient's needs. We present a feasibility study of the method conducted through an experimental rehabilitation session in a group of 4 children with Cerebral Palsy and Developmental Dyspraxia, 4 Typically Developing children and 4 healthy adults. Subjects and patients were able to accomplish the proposed rehabilitation session and average performance of the motor exercises in patients were lower, although comparable, to healthy subjects. Together with positive comments reported by children after the rehabilitation session, results are encouraging for application of the method in a prolonged rehabilitation treatment.

  19. Experimental Research Examining How People Can Cope with Uncertainty Through Soft Haptic Sensations.

    PubMed

    van Horen, Femke; Mussweiler, Thomas

    2015-09-16

    Human beings are constantly surrounded by uncertainty and change. The question arises how people cope with such uncertainty. To date, most research has focused on the cognitive strategies people adopt to deal with uncertainty. However, especially when uncertainty is due to unpredictable societal events (e.g., economic crises, political revolutions, terrorism threats) whose impact on one's future life one is unable to judge, cognitive strategies (like seeking additional information) are likely to fail to combat uncertainty. Instead, the current paper discusses a method demonstrating that people might deal with uncertainty experientially through soft haptic sensations. More specifically, because touching something soft creates a feeling of comfort and security, people prefer objects with softer as compared to harder properties when feeling uncertain. Seeking softness is a highly efficient and effective way to deal with uncertainty, as our hands are available at all times. This protocol describes a set of methods demonstrating 1) how environmental (un)certainty can be situationally activated with an experiential priming procedure, 2) that the quality of the softness experience (what type of softness and how it is experienced) matters, and 3) how uncertainty can be reduced using different methods.

  20. GIS and RDBMS Used with Offline FAA Airspace Databases

    NASA Technical Reports Server (NTRS)

    Clark, J.; Simmons, J.; Scofield, E.; Talbott, B.

    1994-01-01

    A geographic information system (GIS) and relational database management system (RDBMS) were used in a Macintosh environment to access, manipulate, and display off-line FAA databases of airport and navigational aid locations, airways, and airspace boundaries. This proof-of-concept effort used data available from the Adaptation Controlled Environment System (ACES) and Digital Aeronautical Chart Supplement (DACS) databases to allow FAA cartographers and others to create computer-assisted charts and overlays as reference material for air traffic controllers. These products were created on an engineering model of the future GRASP (GRaphics Adaptation Support Position) workstation that will be used to make graphics and text products for the Advanced Automation System (AAS), which will upgrade and replace the current air traffic control system. Techniques developed during the prototyping effort have shown the viability of using databases to create graphical products without the need for an intervening data entry step.

  1. 3D imaging, 3D printing and 3D virtual planning in endodontics.

    PubMed

    Shah, Pratik; Chong, B S

    2018-03-01

    The adoption and adaptation of recent advances in digital technology, such as three-dimensional (3D) printed objects and haptic simulators, in dentistry have influenced teaching and/or management of cases involving implant, craniofacial, maxillofacial, orthognathic and periodontal treatments. 3D printed models and guides may help operators plan and tackle complicated non-surgical and surgical endodontic treatment and may aid skill acquisition. Haptic simulators may assist in the development of competency in endodontic procedures through the acquisition of psycho-motor skills. This review explores and discusses the potential applications of 3D printed models and guides, and haptic simulators in the teaching and management of endodontic procedures. An understanding of the pertinent technology related to the production of 3D printed objects and the operation of haptic simulators are also presented.

  2. Morphologic compatibility of intraocular lens haptics and the lens capsule.

    PubMed

    Nagamoto, T; Eguchi, G

    1997-10-01

    To evaluate the mechanical relationship between the intraocular lens (IOL) haptic and the capsular bag by quantitatively analyzing the fit of the haptic with the capsule equator and the capsular bag deformity induced by the implanted lens haptics. Division of Morphogenesis, Department of Developmental Biology, National Institute for Basic Biology, Okazaki, Japan. Following implantation of a poly(methyl methacrylate) (PMMA) ring in three excised human capsular bags with continuous curvilinear capsulorhexis (CCC), IOLs with different overall lengths or haptic designs were implanted in the bags and photographed. Two measurements were taken from the photographs: the straight length of the area of contact between the haptic and the capsule equator, as a quantitative index of in-the-bag fixation, and the length from the external margin of the PMMA ring to the external margin of the loop along the maximal diameter of the capsular bag, as a quantitative index of the capsular deformity induced by an IOL. An IOL with modified-C loops produced a better fit along the capsule equator and less deformity than an IOL with modified-J loops, and an IOL with an overall length of 12.0 or 12.5 mm produced a sufficiently good fit and less distortion of the capsular bag than an IOL with an overall length over 13.0 mm. An IOL with modified-C loops and an overall length of 12.0 or 12.5 mm is adequate for in-the-bag implantation following CCC.

  3. Fragility of haptic memory in human full-term newborns.

    PubMed

    Lejeune, Fleur; Borradori Tolsa, Cristina; Gentaz, Edouard; Barisnikov, Koviljka

    2018-05-31

    Numerous studies have established that newborns can memorize tactile information about the specific features of an object with their hands and detect differences with another object. However, the robustness of haptic memory abilities has been examined in preterm newborns and in full-term infants, but not yet in full-term newborns. This research aimed to better understand the robustness of haptic memory abilities at birth by examining the effects of a change in the objects' temperature and of haptic interference. Sixty-eight full-term newborns (mean postnatal age: 2.5 days) were included. The two experiments were conducted in three phases: habituation (repeated presentation of the same object, a prism or cylinder, in the newborn's hand), discrimination (presentation of a novel object), and recognition (presentation of the familiar object). In Experiment 1, the change in the objects' temperature was controlled during the three phases. Results reveal that newborns can memorize specific features that differentiate prism and cylinder shapes by touch, and discriminate between them, but surprisingly they did not show evidence of recognizing them after interference. As no significant effect of the temperature condition was observed on habituation, discrimination or recognition abilities, these findings suggest that discrimination abilities in newborns may be determined by the detection of shape differences. Overall, it seems that the ontogenesis of haptic recognition memory is not linear; the developmental schedule is likely crucial for haptic development between 34 and 40 gestational weeks (GW). Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Design and implementation of visual-haptic assistive control system for virtual rehabilitation exercise and teleoperation manipulation.

    PubMed

    Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv

    2008-01-01

    This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback through a haptic controller (Omni Phantom) to the user. A sensor-based assistive function and velocity-scaling program provides force feedback that helps the user complete trajectory-following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation, equipped with a camera and a laser range finder and controlled in real time by a PC, to help the user define the intended path to the selected target. The real-time force feedback from the remote robot to the haptic controller is made possible by using effective multithreading programming strategies in the control system design and by novel sensor integration. The sensor-based assistant function concept, applied to teleoperation as well as shared control, enhances the motion range and manipulation capabilities of users executing rehabilitation exercises such as trajectory following along a sensor-defined path. The system is modularly designed to allow for integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, one can use the haptic device for rehabilitation purposes in cases where assistance is needed to perform tasks (e.g., stroke rehab), and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.

  5. Optimal visual-haptic integration with articulated tools.

    PubMed

    Takahashi, Chie; Watt, Simon J

    2017-05-01

    When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world-seeing and feeling the same thing-and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
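
    The maximum-likelihood integrator used to generate the optimal predictions is a standard cue-combination model: each cue is weighted by its reliability (inverse variance), and the combined estimate is more precise than either cue alone. A minimal sketch (the function name and example numbers are illustrative, not from the paper):

```python
def mle_integrate(s_vis: float, var_vis: float,
                  s_hap: float, var_hap: float):
    """Maximum-likelihood combination of a visual and a haptic estimate.

    Weights are proportional to reliability (1/variance), so the more
    reliable cue dominates; the combined variance is never larger than
    that of the better single cue.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_hap)
    w_hap = 1.0 - w_vis
    s_combined = w_vis * s_vis + w_hap * s_hap
    # Variance of the optimal combined estimate.
    var_combined = (var_vis * var_hap) / (var_vis + var_hap)
    return s_combined, var_combined
```

    For example, equally reliable visual and haptic size estimates of 50 and 54 mm combine to 52 mm with half the variance of either cue, which is the kind of precision gain the study uses to index integration.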

  6. A perspective on the role and utility of haptic feedback in laparoscopic skills training.

    PubMed

    Singapogu, Ravikiran; Burg, Timothy; Burg, Karen J L; Smith, Dane E; Eckenrode, Amanda H

    2014-01-01

    Laparoscopic surgery is a minimally invasive surgical technique with significant potential benefits to the patient, including shorter recovery time, less scarring, and decreased costs. There is a growing need to teach surgical trainees this emerging surgical technique. Simulators, ranging from simple "box" trainers to complex virtual reality (VR) trainers, have emerged as the most promising method for teaching basic laparoscopic surgical skills. Current box trainers require oversight from an expert surgeon for both training and assessing skills. VR trainers decrease the dependence on expert teachers during training by providing objective, real-time feedback and automatic skills evaluation. However, current VR trainers generally have limited credibility as a means to prepare new surgeons and have often fallen short of educators' expectations. Several researchers have speculated that the missing component in modern VR trainers is haptic feedback, which refers to the range of touch sensations encountered during surgery. These force types and ranges need to be adequately rendered by simulators for a more complete training experience. This article presents a perspective of the role and utility of haptic feedback during laparoscopic surgery and laparoscopic skills training by detailing the ranges and types of haptic sensations felt by the operating surgeon, along with quantitative studies of how this feedback is used. Further, a number of research studies that have documented human performance effects as a result of the presence of haptic feedback are critically reviewed. Finally, key research directions in using haptic feedback for laparoscopy training simulators are identified.

  7. Creating Visual Aids with Graphic Organisers on an Infinite Canvas--The Impact on the Presenter

    ERIC Educational Resources Information Center

    Casteleyn, Jordi; Mottart, Andre; Valcke, Martin

    2015-01-01

    Instead of the traditional set of slides, the visual aids of a presentation can now be graphic organisers (concept maps, knowledge maps, mind maps) on an infinite canvas. Constructing graphic organisers has a beneficial impact on learning, but this topic has not been studied in the context of giving a presentation. The present study examined this…

  8. Addendum I, BIOPLUME III Graphics Conversion to SURFER Format

    EPA Pesticide Factsheets

    This procedure can be used to create a SURFER® compatible grid file from Bioplume III input and output graphics. The input data and results from Bioplume III can be contoured and printed directly from SURFER.

  9. Graphics Software Packages as Instructional Tools.

    ERIC Educational Resources Information Center

    Chiavaroli, Julius J.; Till, Ronald J.

    1985-01-01

    Graphics software can assist hearing-impaired students in visualizing and comparing ideas and can also demonstrate spatial relations and encourage creativity. Teachers and students can create and present data, diagrams, drawings, or charts quickly and accurately. (Author/CL)

  10. Improving aircraft conceptual design - A PHIGS interactive graphics interface for ACSYNT

    NASA Technical Reports Server (NTRS)

    Wampler, S. G.; Myklebust, A.; Jayaram, S.; Gelhausen, P.

    1988-01-01

    A CAD interface has been created for the 'ACSYNT' aircraft conceptual design code that permits the execution and control of the design process via interactive graphics menus. This CAD interface was coded entirely with the new three-dimensional graphics standard, the Programmer's Hierarchical Interactive Graphics System. The CAD/ACSYNT system is designed for use by state-of-the-art high-speed imaging work stations. Attention is given to the approaches employed in modeling, data storage, and rendering.

  11. Force Sensitive Handles and Capacitive Touch Sensor for Driving a Flexible Haptic-Based Immersive System

    PubMed Central

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-01-01

    In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680

  12. The Mere Exposure Effect in the Domain of Haptics

    PubMed Central

    Jakesch, Martina; Carbon, Claus-Christian

    2012-01-01

    Background Zajonc showed that the attitude towards stimuli that one had previously been exposed to is more positive than towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. Methodology/Principal Findings We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only and haptics & vision). Effects of exposure frequency were found for highly complex stimuli, with liking increasing significantly from F0 to F2 and F10, but only for the stone category. Analysis of “Need for Touch” data showed the MEE in participants with a high need for touch, which suggests different sensitivity or saturation levels of the MEE. Conclusions/Significance These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis. PMID:22347451

  13. Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm.

    PubMed

    Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae; Kim, Tae Il; Yi, Byung Ju

    2017-01-01

    Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We propose a master-slave robotic colonoscopy system that is controllable from a remote site using a conventional colonoscope. The master and slave robots were developed to work with a conventional flexible colonoscope. The robotic colonoscopic procedure was performed on a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. The slave robot was developed to hold the colonoscope and its knob, and to perform insertion, rotation, and two tilting motions of the colonoscope. The master robot was designed to teach motions to the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with reflection force and torque at the haptic device. The haptic sensation and feedback system was successful and helped the operator feel the constrained force or torque in the colon. The insertion time using the robotic system decreased with repeated procedures. This work proposed a robotic approach to colonoscopy using a haptic feedback algorithm; this robotic device could effectively perform colonoscopy with reduced burden and comparable safety for patients at a remote site.
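
    The one-tenth force/torque reflection described above can be sketched as a simple scale-and-clamp stage between the sensors and the haptic device (the scale factor is from the abstract; the device limits are assumed for illustration):

```python
SCALE = 0.1      # reflection gain: one tenth of the measured values (per the abstract)
F_LIMIT = 5.0    # haptic device force output limit in N (assumed)
T_LIMIT = 0.3    # haptic device torque output limit in N*m (assumed)

def reflect(insertion_force: float, rotation_torque: float):
    """Scale measured insertion force and rotation torque down by one
    tenth and clamp each to the haptic device's output limits."""
    f = max(-F_LIMIT, min(F_LIMIT, SCALE * insertion_force))
    t = max(-T_LIMIT, min(T_LIMIT, SCALE * rotation_torque))
    return f, t
```

    Scaling down keeps the reflected forces within a comfortable range for the operator's hand while preserving the relative differences that signal constrained motion in the colon.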

  14. Experimental Study on the Perception Characteristics of Haptic Texture by Multidimensional Scaling.

    PubMed

    Wu, Juan; Li, Na; Liu, Wei; Song, Guangming; Zhang, Jun

    2015-01-01

    Recent works regarding real texture perception demonstrate that physical factors such as stiffness and spatial period play a fundamental role in texture perception. This research used a multidimensional scaling (MDS) analysis to further characterize and quantify the effects of simulation parameters on haptic texture rendering and perception. In a pilot experiment, 12 haptic texture samples were generated using a 3-degrees-of-freedom (3-DOF) force-feedback device with varying spatial period, height, and stiffness coefficient parameter values. The subjects' perceptions of the virtual textures indicate that roughness, denseness, flatness and hardness are distinguishing characteristics of texture. In the main experiment, 19 participants rated the dissimilarities of the textures and estimated the magnitudes of their characteristics. The MDS method was used to recover the underlying perceptual space and reveal its significance from the recorded data. The physical parameters and their combinations have significant effects on the perceptual characteristics. A regression model was used to quantitatively analyze the parameters and their effects on the perceptual characteristics. This paper illustrates that haptic texture perception based on force feedback can be modeled in two- or three-dimensional space and provides suggestions for improving perception-based haptic texture rendering.
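
    The MDS step, recovering a low-dimensional perceptual space from a matrix of pairwise dissimilarity ratings, can be illustrated with classical (Torgerson) MDS, the simplest metric variant of the family of methods such studies use. This is a generic sketch, not the authors' analysis:

```python
import numpy as np

def classical_mds(D: np.ndarray, k: int = 2) -> np.ndarray:
    """Classical (Torgerson) MDS: embed n items in k dimensions so that
    pairwise Euclidean distances approximate the dissimilarity matrix D.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]      # keep the k largest
    scale = np.sqrt(np.maximum(vals[idx], 0.0))
    return vecs[:, idx] * scale           # n x k coordinates
```

    Applied to a 12 x 12 dissimilarity matrix of texture ratings, the resulting 2-D or 3-D coordinates are the "perceptual space" whose axes can then be regressed against attribute ratings such as roughness or hardness.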

  15. Force sensitive handles and capacitive touch sensor for driving a flexible haptic-based immersive system.

    PubMed

    Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto

    2013-10-09

    In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape.

  16. Haptic shape discrimination and interhemispheric communication.

    PubMed

    Dowell, Catherine J; Norman, J Farley; Moment, Jackie R; Shain, Lindsey M; Norman, Hideko F; Phillips, Flip; Kappers, Astrid M L

    2018-01-10

    In three experiments participants haptically discriminated object shape using unimanual (single hand explored two objects) and bimanual exploration (both hands were used, but each hand, left or right, explored a separate object). Such haptic exploration (one versus two hands) requires somatosensory processing in either only one or both cerebral hemispheres; previous studies related to the perception of shape/curvature found superior performance for unimanual exploration, indicating that shape comparison is more effective when only one hemisphere is utilized. The current results, obtained for naturally shaped solid objects (bell peppers, Capsicum annuum) and simple cylindrical surfaces demonstrate otherwise: bimanual haptic exploration can be as effective as unimanual exploration, showing that there is no necessary reduction in ability when haptic shape comparison requires interhemispheric communication. We found that while successive bimanual exploration produced high shape discriminability, the participants' bimanual performance deteriorated for simultaneous shape comparisons. This outcome suggests that either interhemispheric interference or the need to attend to multiple objects simultaneously reduces shape discrimination ability. The current results also reveal a significant effect of age: older adults' shape discrimination abilities are moderately reduced relative to younger adults, regardless of how objects are manipulated (left hand only, right hand only, or bimanual exploration).

  17. A Review of Simulators with Haptic Devices for Medical Training.

    PubMed

    Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich

    2016-04-01

    Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative, and they have been considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state-of-the-art review of recent medical simulators that use haptic devices. In particular, we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of the technology used, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.

  18. Differential effects of delay upon visually and haptically guided grasping and perceptual judgments.

    PubMed

    Pettypiece, Charles E; Culham, Jody C; Goodale, Melvyn A

    2009-05-01

    Experiments with visual illusions have revealed a dissociation between the systems that mediate object perception and those responsible for object-directed action. More recently, an experiment on a haptic version of the visual size-contrast illusion has provided evidence for the notion that the haptic modality shows a similar dissociation when grasping and estimating the size of objects in real-time. Here we present evidence suggesting that the similarities between the two modalities begin to break down once a delay is introduced between when people feel the target object and when they perform the grasp or estimation. In particular, when grasping after a delay in a haptic paradigm, people scale their grasps differently when the target is presented with a flanking object of a different size (although the difference does not reflect a size-contrast effect). When estimating after a delay, however, it appears that people ignore the size of the flanking objects entirely. This does not fit well with the results commonly found in visual experiments. Thus, introducing a delay reveals important differences in the way in which haptic and visual memories are stored and accessed.

  19. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  20. Rapid Assessment of Agility for Conceptual Design Synthesis

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel J.

    1996-01-01

    This project consists of designing and implementing a real-time graphical interface for a workstation-based flight simulator. It is capable of creating a three-dimensional out-the-window scene of the aircraft's flying environment, with extensive information about the aircraft's state displayed in the form of a heads-up display (HUD) overlay. The code, written in the C programming language, makes calls to Silicon Graphics' Graphics Library (GL) to draw the graphics primitives. Included in this report is a detailed description of the capabilities of the code, including graphical examples, as well as a printout of the code itself.

  1. Adaptive displays and controllers using alternative feedback.

    PubMed

    Repperger, D W

    2004-12-01

    Investigations on the design of haptic (force reflecting joystick or force display) controllers were conducted by viewing the display of force information within the context of several different paradigms. First, using analogies from electrical and mechanical systems, certain schemes of the haptic interface were hypothesized which may improve the human-machine interaction with respect to various criteria. A discussion is given on how this interaction benefits the electrical and mechanical system. To generalize this concept to the design of human-machine interfaces, three studies with haptic mechanisms were then synthesized and analyzed.
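    The electrical/mechanical analogies behind such "force display" schemes typically reduce to a spring-damper law. A minimal sketch of one such force-reflection law, with illustrative gains that are not taken from the paper:

```python
def reflected_force(x, v, k=200.0, b=5.0):
    """Force (N) displayed to the operator for a joystick displaced x
    metres from centre and moving at v m/s: a spring term opposes
    displacement and a damper term opposes velocity (classic
    mechanical-analogy haptic rendering; k and b are made-up gains)."""
    return -k * x - b * v

# A centred, stationary stick feels no force; deflection is resisted.
print(reflected_force(0.0, 0.0))   # 0.0
print(reflected_force(0.01, 0.0))  # -2.0
```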

  2. Does haptic steering guidance instigate speeding? A driving simulator study into causes and remedies.

    PubMed

    Melman, T; de Winter, J C F; Abbink, D A

    2017-01-01

    An important issue in road traffic safety is that drivers show adverse behavioral adaptation (BA) to driver assistance systems. Haptic steering guidance is an upcoming assistance system which facilitates lane-keeping performance while keeping drivers in the loop, and which may be particularly prone to BA. Thus far, experiments on haptic steering guidance have measured driver performance while the vehicle speed was kept constant. The aim of the present driving simulator study was to examine whether haptic steering guidance causes BA in the form of speeding, and to evaluate two types of haptic steering guidance designed not to suffer from BA. Twenty-four participants drove a 1.8m wide car for 13.9km on a curved road, with cones demarcating a single 2.2m narrow lane. Participants completed four conditions in a counterbalanced design: no guidance (Manual), continuous haptic guidance (Cont), continuous guidance that linearly reduced feedback gains from full guidance at 125km/h towards manual control at 130km/h and above (ContRF), and haptic guidance provided only when the predicted lateral position was outside a lateral bandwidth (Band). Participants were familiarized with each condition prior to the experimental runs and were instructed to drive as they normally would while minimizing the number of cone hits. Compared to Manual, the Cont condition yielded a significantly higher driving speed (on average by 7km/h), whereas ContRF and Band did not. All three guidance conditions yielded better lane-keeping performance than Manual, whereas Cont and ContRF yielded lower self-reported workload than Manual. In conclusion, continuous steering guidance entices drivers to increase their speed, thereby diminishing its potential safety benefits. It is possible to prevent BA while retaining safety benefits by making a design adjustment either in lateral (Band) or in longitudinal (ContRF) direction. Copyright © 2016. Published by Elsevier Ltd.
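    The ContRF and Band conditions described above amount to two simple control laws. A sketch under assumed names and gains (the 125/130 km/h thresholds come from the abstract; the torque gain and lateral bandwidth values are illustrative):

```python
def guidance_gain(speed_kmh, full=125.0, off=130.0):
    """ContRF-style scheduling: full haptic guidance at or below
    125 km/h, linearly fading to manual control at 130 km/h and above."""
    if speed_kmh <= full:
        return 1.0
    if speed_kmh >= off:
        return 0.0
    return (off - speed_kmh) / (off - full)

def band_torque(predicted_lateral_error_m, bandwidth=0.3, k=2.0):
    """Band-style guidance: no torque while the predicted lateral
    position stays inside the bandwidth; corrective torque proportional
    to the excess once it strays outside (bandwidth and k are made up)."""
    excess = abs(predicted_lateral_error_m) - bandwidth
    if excess <= 0:
        return 0.0
    return -k * excess if predicted_lateral_error_m > 0 else k * excess

print(guidance_gain(127.5))  # 0.5, halfway through the fade
print(band_torque(0.1))      # 0.0, inside the bandwidth
```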

  3. Mixed reality temporal bone surgical dissector: mechanical design

    PubMed Central

    2014-01-01

    Objective: The development of a novel mixed reality (MR) simulation. An evolving training environment emphasizes the importance of simulation. Current haptic temporal bone simulators have difficulty representing realistic contact forces, and while 3D-printed models convincingly represent the vibrational properties of bone, they cannot reproduce soft tissue. This paper introduces a mixed reality model in which the effective elements of both simulations are combined; haptic rendering of soft tissue directly interacts with a printed bone model. This paper addresses one aspect in a series of challenges, specifically the mechanical merger of a haptic device with an otic drill. This further necessitates gravity cancellation of the work assembly gripper mechanism. In this system, the haptic end-effector is replaced by a high-speed drill and the virtual contact forces need to be repositioned to the drill tip from the mid wand. Previous publications detail the generation of both the requisite printed and haptic simulations. Method: Custom software was developed to reposition the haptic interaction point to the drill tip. A custom fitting to hold the otic drill was developed, and its weight was offset using the haptic device. The robustness of the system to disturbances and its stable performance during drilling were tested. The experiments were performed on a mixed reality model consisting of two drillable rapid-prototyped layers separated by a free space. Within the free space, a linear virtual force model is applied to simulate drill contact with soft tissue. Results: Testing illustrated the effectiveness of gravity cancellation. Additionally, the system exhibited excellent performance given random inputs and during the drill's passage between real and virtual components of the model. No issues with registration at model boundaries were encountered. Conclusion: These tests provide a proof of concept for the initial stages in the development of a novel mixed-reality temporal bone simulator. PMID:25927300
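    The gravity cancellation and linear virtual force model described above reduce, per control tick, to a simple force command at the device. A hypothetical sketch (the spring constant and fitting weight are illustrative, not values from the paper):

```python
def tip_force(penetration_mm, tool_weight_n=1.2, k=0.5):
    """Vertical force (N) commanded at the haptic device: a constant
    upward term offsets the weight of the drill fitting (gravity
    cancellation), and a linear spring term simulates soft-tissue
    contact while the drill tip is inside the virtual free-space layer."""
    soft_tissue = -k * max(penetration_mm, 0.0)  # linear virtual force model
    return tool_weight_n + soft_tissue

print(tip_force(0.0))  # 1.2: pure gravity compensation, no contact
print(tip_force(4.0))  # -0.8: tissue reaction outweighs the weight offset
```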

  4. Metal Sounds Stiffer than Drums for Ears, but Not Always for Hands: Low-Level Auditory Features Affect Multisensory Stiffness Perception More than High-Level Categorical Information

    PubMed Central

    Liu, Juan; Ando, Hiroshi

    2016-01-01

    Most real-world events stimulate multiple sensory modalities simultaneously. Usually, the stiffness of an object is perceived haptically. However, auditory signals also contain stiffness-related information, and people can form impressions of stiffness from the different impact sounds of metal, wood, or glass. To understand whether there is any interaction between auditory and haptic stiffness perception, and if so, whether the inferred material category is the most relevant auditory information, we conducted experiments using a force-feedback device and the modal synthesis method to present haptic stimuli and impact sound in accordance with participants’ actions, and to modulate low-level acoustic parameters, i.e., frequency and damping, without changing the inferred material categories of sound sources. We found that metal sounds consistently induced an impression of stiffer surfaces than did drum sounds in the audio-only condition, but participants haptically perceived surfaces with modulated metal sounds as significantly softer than the same surfaces with modulated drum sounds, which directly opposes the impression induced by these sounds alone. This result indicates that, although the inferred material category is strongly associated with audio-only stiffness perception, low-level acoustic parameters, especially damping, are more tightly integrated with haptic signals than the material category is. Frequency played an important role in both audio-only and audio-haptic conditions. Our study provides evidence that auditory information influences stiffness perception differently in unisensory and multisensory tasks. Furthermore, the data demonstrated that sounds with higher frequency and/or shorter decay time tended to be judged as stiffer, and contact sounds of stiff objects had no effect on the haptic perception of soft surfaces. 
We argue that the intrinsic physical relationship between object stiffness and acoustic parameters may be applied as prior knowledge to achieve robust estimation of stiffness in multisensory perception. PMID:27902718
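    The modal-synthesis stimuli above manipulate exactly the two low-level parameters the study found critical: frequency and damping. One mode of such a synthesized impact sound is simply an exponentially damped sinusoid (default values here are illustrative, not the experiment's):

```python
import math

def modal_impact(t, freq_hz=800.0, damping=6.0, amp=1.0):
    """Single mode of a modal-synthesis impact sound at time t seconds.
    Larger `damping` means a faster decay (drum-like, judged softer);
    small damping leaves a long metallic ring (judged stiffer). A real
    synthesizer sums several such modes per material."""
    return amp * math.exp(-damping * t) * math.sin(2 * math.pi * freq_hz * t)
```

    Sampling `modal_impact` at audio rate while varying `damping` changes the perceived decay without altering the inferred material category, which is the manipulation the experiment relies on.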

  5. A pseudo-haptic knot diagram interface

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Weng, Jianguang; Hanson, Andrew J.

    2011-01-01

    To make progress in understanding knot theory, we need to interact with projected representations of mathematical knots, which are of course continuous in 3D but significantly interrupted in the projective images. One way to achieve such a goal is to design an interactive system that allows us to sketch 2D knot diagrams by taking advantage of a collision-sensing controller and to explore their underlying smooth structures through continuous motion. Recent advances in interaction techniques make progress in this direction possible. Pseudo-haptics, which simulates haptic effects using purely visual feedback, can be used to develop such an interactive system. This paper outlines one such pseudo-haptic knot diagram interface. Our interface derives from the familiar pencil-and-paper process of drawing 2D knot diagrams and provides haptic-like sensations to facilitate the creation and exploration of knot diagrams. A centerpiece of the interaction model simulates a "physically" reactive mouse cursor, which is exploited to resolve the apparent conflict between the continuous structure of the actual smooth knot and the visual discontinuities in the knot diagram representation. Another value in exploiting pseudo-haptics is that an acceleration (or deceleration) of the mouse cursor (or surface locator) can be used to indicate the slope of the curve (or surface) whose projective image is being explored. By exploiting these additional visual cues, we proceed to a full-featured extension to a pseudo-haptic 4D visualization system that simulates continuous navigation on 4D objects and allows us to sense the bumps and holes in the fourth dimension. Preliminary tests of the software show that the main features of the interface overcome some expected perceptual limitations in our interaction with 2D knot diagrams of 3D knots and 3D projective images of 4D mathematical objects.
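    The pseudo-haptic cue described, decelerating the cursor to convey slope, is essentially a control-display ratio manipulation. A minimal sketch (the mapping is illustrative, not the paper's actual function):

```python
def cursor_step(mouse_dx, slope, gain=1.0):
    """Scale a raw mouse displacement by the local gradient of the curve
    or surface under the cursor: steep regions slow the cursor down, and
    that deceleration is perceived as haptic-like resistance."""
    return gain * mouse_dx / (1.0 + abs(slope))

print(cursor_step(10.0, 0.0))  # 10.0 on flat ground
print(cursor_step(10.0, 1.0))  # 5.0 when climbing, felt as drag
```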

  6. Perceiving Object Shape from Specular Highlight Deformation, Boundary Contour Deformation, and Active Haptic Manipulation.

    PubMed

    Norman, J Farley; Phillips, Flip; Cheeseman, Jacob R; Thomason, Kelsey E; Ronning, Cecilia; Behari, Kriti; Kleinman, Kayla; Calloway, Autum B; Lamirande, Davora

    2016-01-01

    It is well known that motion facilitates the visual perception of solid object shape, particularly when surface texture or other identifiable features (e.g., corners) are present. Conventional models of structure-from-motion require the presence of texture or identifiable object features in order to recover 3-D structure. Is the facilitation in 3-D shape perception similar in magnitude when surface texture is absent? On any given trial in the current experiments, participants were presented with a single randomly-selected solid object (bell pepper or randomly-shaped "glaven") for 12 seconds and were required to indicate which of 12 (for bell peppers) or 8 (for glavens) simultaneously visible objects possessed the same shape. The initial single object's shape was defined either by boundary contours alone (i.e., presented as a silhouette), specular highlights alone, specular highlights combined with boundary contours, or texture. In addition, there was a haptic condition: in this condition, the participants haptically explored with both hands (but could not see) the initial single object for 12 seconds; they then performed the same shape-matching task used in the visual conditions. For both the visual and haptic conditions, motion (rotation in depth or active object manipulation) was present in half of the trials and was not present for the remaining trials. The effect of motion was quantitatively similar for all of the visual and haptic conditions-e.g., the participants' performance in Experiment 1 was 93.5 percent higher in the motion or active haptic manipulation conditions (when compared to the static conditions). The current results demonstrate that deforming specular highlights or boundary contours facilitate 3-D shape perception as much as the motion of objects that possess texture. The current results also indicate that the improvement with motion that occurs for haptics is similar in magnitude to that which occurs for vision.

  7. Perceiving Object Shape from Specular Highlight Deformation, Boundary Contour Deformation, and Active Haptic Manipulation

    PubMed Central

    Cheeseman, Jacob R.; Thomason, Kelsey E.; Ronning, Cecilia; Behari, Kriti; Kleinman, Kayla; Calloway, Autum B.; Lamirande, Davora

    2016-01-01

    It is well known that motion facilitates the visual perception of solid object shape, particularly when surface texture or other identifiable features (e.g., corners) are present. Conventional models of structure-from-motion require the presence of texture or identifiable object features in order to recover 3-D structure. Is the facilitation in 3-D shape perception similar in magnitude when surface texture is absent? On any given trial in the current experiments, participants were presented with a single randomly-selected solid object (bell pepper or randomly-shaped “glaven”) for 12 seconds and were required to indicate which of 12 (for bell peppers) or 8 (for glavens) simultaneously visible objects possessed the same shape. The initial single object’s shape was defined either by boundary contours alone (i.e., presented as a silhouette), specular highlights alone, specular highlights combined with boundary contours, or texture. In addition, there was a haptic condition: in this condition, the participants haptically explored with both hands (but could not see) the initial single object for 12 seconds; they then performed the same shape-matching task used in the visual conditions. For both the visual and haptic conditions, motion (rotation in depth or active object manipulation) was present in half of the trials and was not present for the remaining trials. The effect of motion was quantitatively similar for all of the visual and haptic conditions–e.g., the participants’ performance in Experiment 1 was 93.5 percent higher in the motion or active haptic manipulation conditions (when compared to the static conditions). The current results demonstrate that deforming specular highlights or boundary contours facilitate 3-D shape perception as much as the motion of objects that possess texture. The current results also indicate that the improvement with motion that occurs for haptics is similar in magnitude to that which occurs for vision. PMID:26863531

  8. Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.

    PubMed

    Kim, K; Lee, S

    2015-05-01

    Diagnosis of skin conditions depends on the assessment of skin surface properties that are better conveyed by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch. Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. The conversion from single 2D images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties found in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) only. No visual cue was provided during the experiment. The results indicate that our system provides sufficient performance to render discernible tactile feedback for different skin surfaces. Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for the purposes of skin diagnosis, simulation, or training. Our system can also be used for other applications, such as virtual reality and cosmetic applications. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
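    The two stages can be caricatured as an intensity-to-height mapping followed by penalty-based force rendering. A sketch with made-up constants (the paper's actual mapping is fit to the psychophysical data, not a fixed gamma curve):

```python
def height_from_intensity(pixel, gamma=1.0, max_height_mm=2.0):
    """Map an 8-bit pixel intensity (0-255) to a local surface height,
    one simple way to lift a single 2D skin image into a 3D haptic
    surface; gamma and max height here are illustrative."""
    return max_height_mm * (pixel / 255.0) ** gamma

def contact_force(tool_z_mm, surface_z_mm, k=0.3):
    """Penalty-based rendering: spring force while the haptic tool tip
    is below the local surface height, zero force in free space."""
    penetration = surface_z_mm - tool_z_mm
    return k * penetration if penetration > 0 else 0.0

print(height_from_intensity(255))  # 2.0, brightest pixel = highest point
print(contact_force(1.0, 2.0))     # 0.3, 1 mm of penetration
```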

  9. Towards quantifying dynamic human-human physical interactions for robot assisted stroke therapy.

    PubMed

    Mohan, Mayumi; Mendonca, Rochelle; Johnson, Michelle J

    2017-07-01

    Human-Robot Interaction is a prominent field of robotics today. Knowledge of human-human physical interaction can prove vital in creating dynamic physical interactions between humans and robots. Most of the current work in studying this interaction has been from a haptic perspective. Through this paper, we present metrics that can be used to identify, from kinematics, whether a physical interaction occurred between two people. We present a simple Activity of Daily Living (ADL) task that involves such an interaction. We show that we can use these metrics to successfully identify interactions.

  10. Analysis and prediction of meal motion by EMG signals

    NASA Astrophysics Data System (ADS)

    Horihata, S.; Iwahara, H.; Yano, K.

    2007-12-01

    The lack of carers for senior citizens and physically handicapped persons in our country has become a huge issue and has created a great need for carer robots. The usual carer robots (many of which have switches or joysticks as their interfaces), however, are neither easy to use nor very popular. Therefore, haptic devices have been adopted for a human-machine interface that enables intuitive operation. At this point, a method is being tested that seeks to prevent incorrect operation by interpreting the user's signals; this method matches motions with EMG signals.

  11. A review of haptic simulator for oral and maxillofacial surgery based on virtual reality.

    PubMed

    Chen, Xiaojun; Hu, Junlei

    2018-06-01

    Traditional medical training in oral and maxillofacial surgery (OMFS) may be limited by its low efficiency and high price due to the shortage of cadaver resources. With the combination of visual rendering and force feedback, surgery simulators are becoming increasingly popular in hospitals and medical schools as an alternative to traditional training. Areas covered: The major goal of this review is to provide a comprehensive reference source on current and future developments of haptic OMFS simulators based on virtual reality (VR) for relevant researchers. Expert commentary: Visual rendering, haptic rendering, tissue deformation, and evaluation are key components of a haptic surgery simulator based on VR. Compared with traditional medical training, the fusion of visual and tactile feedback in the simulator's virtual environment enables a considerably vivid sensation, and operators have more opportunities to practice surgical skills and receive objective evaluation as reference.

  12. Selective attention modulates visual and haptic repetition priming: effects in aging and Alzheimer's disease.

    PubMed

    Ballesteros, Soledad; Reales, José M; Mayas, Julia; Heller, Morton A

    2008-08-01

    In two experiments, we examined the effect of selective attention at encoding on repetition priming in normal aging and Alzheimer's disease (AD) patients for objects presented visually (experiment 1) or haptically (experiment 2). We used a repetition priming paradigm combined with a selective attention procedure at encoding. Reliable priming was found for both young adults and healthy older participants for visually presented pictures (experiment 1) as well as for haptically presented objects (experiment 2). However, this was only found for attended and not for unattended stimuli. The results suggest that independently of the perceptual modality, repetition priming requires attention at encoding and that perceptual facilitation is maintained in normal aging. However, AD patients did not show priming for attended stimuli, or for unattended visual or haptic objects. These findings suggest an early deficit of selective attention in AD. Results are discussed from a cognitive neuroscience approach.

  13. Haptic interfaces using dielectric electroactive polymers

    NASA Astrophysics Data System (ADS)

    Ozsecen, Muzaffer Y.; Sivak, Mark; Mavroidis, Constantinos

    2010-04-01

    Quality, amplitude, and frequency of the interaction forces between a human and an actuator are essential traits for haptic applications. A variety of Electro-Active Polymer (EAP) based actuators can provide these characteristics simultaneously with quiet operation, low weight, high power density, and fast response. This paper demonstrates a rolled Dielectric Elastomer Actuator (DEA) being used as a telepresence device in a heart-beat measurement application. In this test, heart signals were acquired from a remote location using a wireless heart rate sensor and sent through a network, and the DEA was used to haptically reproduce the heart beats at the medical expert's location. A series of preliminary human subject tests demonstrated that a) DEA-based haptic feedback can be used in heart-beat measurement tests and b) through subjective testing, the stiffness and actuation properties of the EAP can be tuned for a variety of applications.
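    The telepresence loop described reduces to mapping the received heart rate onto a pulsed drive envelope for the actuator. A hypothetical sketch (pulse width and the normalized drive level are invented; real DEA drive electronics apply high-voltage signals through an amplifier):

```python
def dea_drive(t_s, heart_rate_bpm=72, pulse_s=0.1, v_on=1.0):
    """Normalized drive envelope for the rolled DEA at time t_s seconds:
    a short actuation pulse at the start of each cardiac period, felt by
    the remote expert as a beat."""
    period = 60.0 / heart_rate_bpm
    return v_on if (t_s % period) < pulse_s else 0.0

print(dea_drive(0.05))  # 1.0, inside the beat pulse
print(dea_drive(0.50))  # 0.0, between beats
```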

  14. A three-axis force sensor for dual finger haptic interfaces.

    PubMed

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-10-10

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor.
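    The characteristic matrix mentioned above maps gauge readings to a three-axis force vector. A sketch of how such a matrix could be identified from calibration data by least squares (the numbers are synthetic; the paper derives its matrix analytically from the spring structure):

```python
import numpy as np

def calibrate(V, F):
    """V: (n, 3) sensor voltage readings, F: (n, 3) known applied forces.
    Solves F ≈ V @ C.T in least squares and returns the 3x3
    characteristic matrix C, so that force = C @ voltages."""
    Ct, *_ = np.linalg.lstsq(V, F, rcond=None)
    return Ct.T

# Synthetic check: recover a known matrix from noiseless readings.
C_true = np.array([[10.0, 0.5, 0.0],
                   [0.2, 12.0, 0.1],
                   [0.0, 0.3, 9.0]])
V = np.random.default_rng(0).normal(size=(20, 3))
F = V @ C_true.T
print(np.allclose(calibrate(V, F), C_true))  # True
```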

  15. Blindness enhances tactile acuity and haptic 3-D shape discrimination.

    PubMed

    Norman, J Farley; Bartholomew, Ashley N

    2011-10-01

    This study compared the sensory and perceptual abilities of the blind and sighted. The 32 participants were required to perform two tasks: tactile grating orientation discrimination (to determine tactile acuity) and haptic three-dimensional (3-D) shape discrimination. The results indicated that the blind outperformed their sighted counterparts (individually matched for both age and sex) on both tactile tasks. The improvements in tactile acuity that accompanied blindness occurred for all blind groups (congenital, early, and late). However, the improvements in haptic 3-D shape discrimination only occurred for the early-onset and late-onset blindness groups; the performance of the congenitally blind was no better than that of the sighted controls. The results of the present study demonstrate that blindness does lead to an enhancement of tactile abilities, but they also suggest that early visual experience may play a role in facilitating haptic 3-D shape discrimination.

  16. Verticality perception during and after galvanic vestibular stimulation.

    PubMed

    Volkening, Katharina; Bergmann, Jeannine; Keller, Ingo; Wuehr, Max; Müller, Friedemann; Jahn, Klaus

    2014-10-03

    The human brain constructs verticality perception by integrating vestibular, somatosensory, and visual information. Here we investigated whether galvanic vestibular stimulation (GVS) has an effect on verticality perception both during and after application, by assessing the subjective verticals (visual, haptic and postural) in healthy subjects at those times. During stimulation the subjective visual vertical and the subjective haptic vertical shifted towards the anode, whereas this shift was reversed towards the cathode in all modalities once stimulation was turned off. Overall, the effects were strongest for the haptic modality. Additional investigation of the time course of GVS-induced changes in the haptic vertical revealed that anodal shifts persisted for the entire 20-min stimulation interval in the majority of subjects. Aftereffects exhibited different types of decay, with a preponderance for an exponential decay. The existence of such reverse effects after stimulation could have implications for GVS-based therapy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Dakota Graphical User Interface v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest; Glickman, Matthew; Gibson, Marcus

    Graphical analysis environment for Sandia's Dakota software for optimization and uncertainty quantification. The Dakota GUI is an interactive graphical analysis environment for creating, running, and interpreting Dakota optimization and uncertainty quantification studies. It includes problem (Dakota study) set-up, option specification, simulation interfacing, analysis execution, and results visualization. Through the use of wizards, templates, and views, Dakota GUI helps users navigate Dakota's complex capability landscape.

  18. Active Adjectives: A Word Is Worth a Thousand Pictures

    ERIC Educational Resources Information Center

    Tamplin de Poinsot, Nan

    2010-01-01

    Graphic art can be a tough subject to approach with seventh- and eighth-graders, but by mixing a little language arts into the studio lesson, they can have fun with the art of words in a whole new way. In this article, the author describes how students created a graphic design using a word. The purpose of the graphic art is to educate--to teach an…

  19. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  20. Evaluation of haptic interfaces for simulation of drill vibration in virtual temporal bone surgery.

    PubMed

    Ghasemloonia, Ahmad; Baxandall, Shalese; Zareinia, Kourosh; Lui, Justin T; Dort, Joseph C; Sutherland, Garnette R; Chan, Sonny

    2016-11-01

    Surgical training is evolving from an observership model towards a new paradigm that includes virtual-reality (VR) simulation. In otolaryngology, temporal bone dissection has become intimately linked with VR simulation, as the complexity of the anatomy demands a high level of surgeon aptitude and confidence. While an adequate 3D visualization of the surgical site is available in current simulators, the force feedback rendered during haptic interaction does not convey vibrations. This lack of vibration rendering limits the simulation fidelity of a surgical drill such as that used in temporal bone dissection. In order to develop an immersive simulation platform capable of haptic force and vibration feedback, the efficacy of hand controllers for rendering vibration in different drilling circumstances needs to be investigated. In this study, the vibration rendering ability of four different haptic hand controllers was analyzed and compared to identify the best commercial haptic hand controller. A test rig was developed to record vibrations encountered during temporal bone dissection, and software was written to render the recorded signals without adding hardware to the system. An accelerometer mounted on the end-effector of each device recorded the rendered vibration signals. Each rendered vibration signal was compared with the input signal in both the time and frequency domains by coherence and cross-correlation analyses to quantitatively measure the fidelity of these devices in rendering vibrotactile drilling feedback under different drilling conditions. This method can be used to assess vibration rendering ability in VR simulation systems and to select ideal haptic devices. Copyright © 2016 Elsevier Ltd. All rights reserved.
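
    The fidelity comparison described above can be sketched in a few lines. The snippet below is an illustrative assumption, not the study's pipeline: it compares a synthetic "recorded" drill vibration against a slightly attenuated, phase-shifted, noisy "rendered" signal, using magnitude-squared coherence (frequency domain) and peak normalized cross-correlation (time domain). The sampling rate, drive frequency, and noise level are made up for the example.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                       # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)

recorded = np.sin(2 * np.pi * 50 * t)                 # input drill vibration
rendered = 0.8 * np.sin(2 * np.pi * 50 * t + 0.1)     # attenuated, phase-shifted
rendered = rendered + 0.05 * rng.normal(size=t.size)  # accelerometer noise

# Frequency-domain fidelity: magnitude-squared coherence at the drive frequency.
f, Cxy = coherence(recorded, rendered, fs=fs, nperseg=256)
peak_coh = Cxy[np.argmin(np.abs(f - 50.0))]

# Time-domain fidelity: peak of the normalized cross-correlation.
xc = np.correlate(recorded - recorded.mean(), rendered - rendered.mean(), "full")
peak_xcorr = xc.max() / (np.std(recorded) * np.std(rendered) * t.size)

print(f"coherence at 50 Hz: {peak_coh:.3f}, peak cross-correlation: {peak_xcorr:.3f}")
```

    Values near 1.0 for both metrics indicate that the device reproduces the recorded vibration faithfully; attenuation and added noise pull them down.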

  1. Effects of 3D virtual haptics force feedback on brand personality perception: the mediating role of physical presence in advergames.

    PubMed

    Jin, Seung-A Annie

    2010-06-01

    This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.

  2. Patient DF's visual brain in action: Visual feedforward control in visual form agnosia.

    PubMed

    Whitwell, Robert L; Milner, A David; Cavina-Pratesi, Cristiana; Barat, Masihullah; Goodale, Melvyn A

    2015-05-01

    Patient DF, who developed visual form agnosia following ventral-stream damage, is unable to discriminate the width of objects, performing at chance, for example, when asked to open her thumb and forefinger a matching amount. Remarkably, however, DF adjusts her hand aperture to accommodate the width of objects when reaching out to pick them up (grip scaling). While this spared ability to grasp objects is presumed to be mediated by visuomotor modules in her relatively intact dorsal stream, it is possible that it may rely abnormally on online visual or haptic feedback. We report here that DF's grip scaling remained intact when her vision was completely suppressed during grasp movements, and it still dissociated sharply from her poor perceptual estimates of target size. We then tested whether providing trial-by-trial haptic feedback after making such perceptual estimates might improve DF's performance, but found that they remained significantly impaired. In a final experiment, we re-examined whether DF's grip scaling depends on receiving veridical haptic feedback during grasping. In one condition, the haptic feedback was identical to the visual targets. In a second condition, the haptic feedback was of a constant intermediate width while the visual target varied trial by trial. Despite this incongruent feedback, DF still scaled her grip aperture to the visual widths of the target blocks, showing only normal adaptation to the false haptically-experienced width. Taken together, these results strengthen the view that DF's spared grasping relies on a normal mode of dorsal-stream functioning, based chiefly on visual feedforward processing. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces

    NASA Astrophysics Data System (ADS)

    Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana

    The aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive virtual environment in which mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user’s “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.

  4. Motor skills, haptic perception and social abilities in children with mild speech disorders.

    PubMed

    Müürsepp, Iti; Aibast, Herje; Gapeyeva, Helena; Pääsuke, Mati

    2012-02-01

    The aim of the study was to evaluate motor skills, haptic object recognition and social interaction in 5-year-old children with mild specific expressive language impairment (expressive-SLI) and articulation disorder (AD) in comparison with age- and gender-matched healthy children. Twenty-nine children (23 boys and 6 girls) with expressive-SLI, 27 children (20 boys and 7 girls) with AD and 30 children (23 boys and 7 girls) with typically developing language as controls participated in our study. The children were examined for manual dexterity, ball skills, and static and dynamic balance using the M-ABC test, for haptic object recognition, and for social interaction by a questionnaire completed by teachers. Children with mild expressive-SLI demonstrated significantly poorer results in all subtests of motor skills (p<0.05), and in haptic object recognition and social interaction (p<0.01), compared to controls. There were no statistically significant differences (p>0.05) in the measured parameters between children with AD and controls. Children with expressive-SLI performed considerably more poorly than the AD group in the balance subtest (p<0.05) and in the overall M-ABC test (p<0.01). In children with mild expressive-SLI, functional motor performance, haptic perception and social interaction are considerably more affected than in children with AD. Although motor difficulties in speech production are prevalent in AD, they are localised and do not involve children's general motor skills, haptic perception or social interaction. Copyright © 2011 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  5. High-fidelity bilateral teleoperation systems and the effect of multimodal haptics.

    PubMed

    Tavakoli, Mahdi; Aziminejad, Arash; Patel, Rajni V; Moallem, Mehrdad

    2007-12-01

    In master-slave teleoperation applications that deal with a delicate and sensitive environment, it is important to provide haptic feedback of slave/environment interactions to the user's hand as it improves task performance and teleoperation transparency (fidelity), which is the extent of telepresence of the remote environment available to the user through the master-slave system. For haptic teleoperation, in addition to a haptics-capable master interface, often one or more force sensors are also used, which warrant new bilateral control architectures while increasing the cost and the complexity of the teleoperation system. In this paper, we investigate the added benefits of using force sensors that measure hand/master and slave/environment interactions and of utilizing local feedback loops on the teleoperation transparency. We compare the two-channel and the four-channel bilateral control systems in terms of stability and transparency, and study the stability and performance robustness of the four-channel method against nonidealities that arise during bilateral control implementation, which include master-slave communication latency and changes in the environment dynamics. The next issue addressed in the paper deals with the case where the master interface is not haptics capable, but the slave is equipped with a force sensor. In the context of robotics-assisted soft-tissue surgical applications, we explore through human factors experiments whether slave/environment force measurements can be of any help with regard to improving task performance. The last problem we study is whether slave/environment force information, with and without haptic capability in the master interface, can help improve outcomes under degraded visual conditions.

  6. Tips for Good Electronic Presentations.

    ERIC Educational Resources Information Center

    Strasser, Dennis

    1996-01-01

    Describes library uses of presentation graphics software and offers tips for creating electronic presentations. Tips include: audience retention; visual aid options; software package options; presentation planning; presentation showing; and use of text, colors, and graphics. Sidebars note common presentation errors and popular presentation…

  7. The impact of haptic feedback on students' conceptions of the cell

    NASA Astrophysics Data System (ADS)

    Minogue, James

    2005-07-01

    The purpose of this study was to investigate the efficacy of adding haptic (sense of touch) feedback to computer generated visualizations for use in middle school science instruction. Current technology allows for the simulation of tactile and kinesthetic sensations via haptic devices and a computer interface. This study, conducted with middle school students (n = 80), explored the cognitive and affective impacts of this innovative technology on students' conceptions of the cell and the process of passive transport. A pretest-posttest control group design was used and participants were randomly assigned to one of two treatment groups (n = 40 for each). Both groups experienced the same core computer-mediated instructional program. This Cell Exploration program engaged students in a 3-D immersive environment that allowed them to actively investigate the form and function of a typical animal cell including its major organelles. The program also engaged students in a study of the structure and function of the cell membrane as it pertains to the process of passive transport and the mechanisms behind the membrane's selective permeability. As they conducted their investigations, students in the experimental group received bi-modal visual and haptic (simulated tactile and kinesthetic) feedback whereas the control group students experienced the program with only visual stimuli. A battery of assessments, including objective and open-ended written response items as well as a haptic performance assessment, was used to gather quantitative and qualitative data regarding changes in students' understandings of the cell concepts prior to and following their completion of the instructional program. Additionally, the impact of haptics on the affective domain of students' learning was assessed using a post-experience semi-structured interview and an attitudinal survey.
Results showed that students from both conditions (Visual-Only and Visual + Haptic) found the instructional program interesting and engaging. Additionally, the vast majority of the students reported that they learned a lot about the topic and were more interested in it due to their participation. Moreover, students who received the bi-modal (Visual + Haptic) feedback indicated that they experienced lower levels of frustration and spatial disorientation as they conducted their investigations when compared to individuals who relied solely on vision. There were no significant differences measured across the treatment groups on the cognitive assessment items. Despite this finding, the study provided valuable insight into the theoretical and practical considerations involved in the development of multimodal instructional programs.

  8. The pits and falls of graphical presentation.

    PubMed

    Sperandei, Sandro

    2014-01-01

    Graphics are powerful tools to communicate research results and to gain information from data. However, researchers should be careful when deciding which data to plot and the type of graphic to use, as well as other details. The consequences of bad decisions in these features range from making research results unclear to distorting them, through the creation of "chartjunk" with useless information. This paper is not another tutorial about "good graphics" and "bad graphics". Instead, it presents guidelines for the graphic presentation of research results and some uncommon but useful examples to communicate basic and complex data types, especially multivariate model results, which are commonly presented only in tables. In the end, there are no answers here, just ideas meant to inspire others to create their own graphics.

  9. Authoritative Authoring: Software That Makes Multimedia Happen.

    ERIC Educational Resources Information Center

    Florio, Chris; Murie, Michael

    1996-01-01

    Compares seven mid- to high-end multimedia authoring software systems that combine graphics, sound, animation, video, and text for Windows and Macintosh platforms. A run-time project was created with each program using video, animation, graphics, sound, formatted text, hypertext, and buttons. (LRW)

  10. The Ups and Downs of Information Graphics.

    ERIC Educational Resources Information Center

    Jungblut, Joseph A.

    1988-01-01

    Describes the four basic information graphics: fever, bar, pie, and map. Provides five tips for creating visuals for graphs: (1) plot the numbers first; (2) set the numbers horizontally; (3) make it accurate; (4) use artwork that fits; and (5) use appropriate type. (MS)

  11. Planetary Photojournal Home Page Graphic

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This image is an unannotated version of the Planetary Photojournal Home Page graphic. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  12. Constructing a Graphic Organizer in the Classroom: Introductory Students' Perception of Achievement Using a Decision Map to Solve Aqueous Acid-Base Equilibria Problems

    ERIC Educational Resources Information Center

    DeMeo, Stephen

    2007-01-01

    Common examples of graphic organizers include flow diagrams, concept maps, and decision trees. The author has created a novel type of graphic organizer called a decision map. A decision map is a directional heuristic that helps learners solve problems within a generic framework. It incorporates questions that the user must answer and contains…

  13. Sharing control between humans and automation using haptic interface: primary and secondary task performance benefits.

    PubMed

    Griffiths, Paul G; Gillespie, R Brent

    2005-01-01

    This paper describes a paradigm for human/automation control sharing in which the automation acts through a motor coupled to a machine's manual control interface. The manual interface becomes a haptic display, continually informing the human about automation actions. While monitoring by feel, users may choose either to conform to the automation or to override it and express their own control intentions. This paper's objective is to demonstrate that adding automation through haptic display can be used not only to improve performance on a primary task but also to reduce perceptual demands or free attention for a secondary task. Results are presented from three experiments in which 11 participants completed a lane-following task using a motorized steering wheel on a fixed-base driving simulator. The automation behaved like a copilot, assisting with lane following by applying torques to the steering wheel. Results indicate that haptic assist improves lane following by at least 30%, p < .0001, while reducing visual demand by 29%, p < .0001, or improving reaction time in a secondary tone-localization task by 18 ms, p = .0009. Potential applications of this research include the design of automation interfaces based on haptics that support human/automation control sharing better than traditional push-button automation interfaces.
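
    The shared-control idea above can be sketched minimally: the automation applies an assist torque to the steering wheel proportional to lane error, and the driver's own torque simply sums with (or fights against) it on the shared interface. Gains, saturation limits, and function names below are illustrative assumptions, not the authors' controller.

```python
def assist_torque(lane_error_m: float, k_assist: float = 2.0,
                  max_torque_nm: float = 3.0) -> float:
    """Proportional lane-keeping torque, saturated like a physical motor."""
    torque = -k_assist * lane_error_m          # push back toward lane center
    return max(-max_torque_nm, min(max_torque_nm, torque))

def wheel_torque(driver_torque_nm: float, lane_error_m: float) -> float:
    """Net torque on the shared manual interface (the haptic display)."""
    return driver_torque_nm + assist_torque(lane_error_m)

# Driver drifting 0.5 m right of center while applying no torque:
print(wheel_torque(0.0, 0.5))   # assist alone acts: -1.0
# Driver overriding the automation with a strong opposing torque:
print(wheel_torque(4.0, 0.5))   # driver wins: 3.0
```

    Because both torques act on the same physical wheel, the driver feels the automation's intent continuously and can override it at any time, which is the core of the haptic shared-control paradigm.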

  14. A predictive bone drilling force model for haptic rendering with experimental validation using fresh cadaveric bone.

    PubMed

    Lin, Yanping; Chen, Huajiang; Yu, Dedong; Zhang, Ying; Yuan, Wen

    2017-01-01

    Bone drilling simulators with virtual and haptic feedback provide a safe, cost-effective and repeatable alternative to traditional surgical training methods. To develop such a simulator, accurate haptic rendering based on a force model is required to feed back bone drilling forces in response to user input. Current predictive bone drilling force models, based on bovine bones under various drilling conditions and parameters, are not representative of the bone drilling process in bone surgery. The objective of this study was to provide a bone drilling force model for haptic rendering based on calibration and validation experiments in fresh cadaveric bones with different bone densities. Using a drill bit geometry (2 mm diameter), feed rates (20-60 mm/min) and spindle speeds (4000-6000 rpm) commonly used in orthognathic surgeries, the bone drilling forces of specimens from two groups were measured and the calibration coefficients of the specific normal and frictional pressures were determined. A comparison of the predicted forces with the measured forces from validation experiments over a large range of feed rates and spindle speeds demonstrates that the proposed model predicts the trends and average forces well. The presented bone drilling force model can be used for haptic rendering in surgical simulators.
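
    The calibration step described above fits model coefficients to measured (feed rate, spindle speed, force) data. The sketch below is a stand-in, not the paper's model: instead of specific normal/frictional pressures, it fits a generic power law F = C·f^a·N^b by log-linear least squares on synthetic data, which illustrates the calibrate-then-predict workflow a haptic renderer would use.

```python
import numpy as np

feed = np.array([20, 30, 40, 50, 60], dtype=float)        # feed rate, mm/min
speed = np.array([4000, 4500, 5000, 5500, 6000], float)   # spindle speed, rpm
force = 0.5 * feed**0.8 * speed**-0.3                     # synthetic "measured" forces, N

# Calibration: linear least squares on log F = log C + a log f + b log N.
A = np.column_stack([np.ones_like(feed), np.log(feed), np.log(speed)])
coef, *_ = np.linalg.lstsq(A, np.log(force), rcond=None)
C, a, b = np.exp(coef[0]), coef[1], coef[2]
print(round(a, 3), round(b, 3))  # prints: 0.8 -0.3

def predict_force(f_mm_min: float, n_rpm: float) -> float:
    """Drilling force the haptic loop would render for a given user input."""
    return C * f_mm_min**a * n_rpm**b
```

    In a simulator, `predict_force` would be evaluated each haptic update from the user's current feed rate and the tool's spindle speed; validation then compares its predictions against held-out measurements.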

  15. Figure/Ground Segmentation via a Haptic Glance: Attributing Initial Finger Contacts to Objects or Their Supporting Surfaces.

    PubMed

    Pawluk, D; Kitada, R; Abramowicz, A; Hamilton, C; Lederman, S J

    2011-01-01

    The current study addresses the well-known "figure/ground" problem in human perception, a fundamental topic that has received surprisingly little attention from touch scientists to date. Our approach is grounded in, and directly guided by, current knowledge concerning the nature of haptic processing. Given inherent figure/ground ambiguity in natural scenes and limited sensory inputs from first contact (a "haptic glance"), we consider first whether people are even capable of differentiating figure from ground (Experiments 1 and 2). Participants were required to estimate the strength of their subjective impression that they were feeling an object (i.e., figure) as opposed to just the supporting structure (i.e., ground). Second, we propose a tripartite factor classification scheme to further assess the influence of kinetic, geometric (Experiments 1 and 2), and material (Experiment 2) factors on haptic figure/ground segmentation, complemented by more open-ended subjective responses obtained at the end of the experiment. Collectively, the results indicate that under certain conditions it is possible to segment figure from ground via a single haptic glance with a reasonable degree of certainty, and that all three factor classes influence the estimated likelihood that brief, spatially distributed fingertip contacts represent contact with an object and/or its background supporting structure.

  16. Development of a Robotic Colonoscopic Manipulation System, Using Haptic Feedback Algorithm

    PubMed Central

    Woo, Jaehong; Choi, Jae Hyuk; Seo, Jong Tae

    2017-01-01

    Purpose Colonoscopy is one of the most effective diagnostic and therapeutic tools for colorectal diseases. We propose a master-slave robotic colonoscopy system that is controllable from a remote site using a conventional colonoscope. Materials and Methods The master and slave robots were developed to work with conventional flexible colonoscopy. The robotic colonoscopic procedure was performed on a colonoscope training model by one expert endoscopist and two inexperienced engineers. To provide the haptic sensation, the insertion force and the rotating torque were measured and sent to the master robot. Results A slave robot was developed to hold the colonoscope and its knob, and to perform the insertion, rotation, and two tilting motions of the colonoscope. A master robot was designed to teach motions to the slave robot. The measured force and torque were scaled down by one tenth to provide the operator with reflected force and torque at the haptic device. The haptic sensation and feedback system was successful and helped the operator feel the constrained force or torque in the colon. The insertion time using the robotic system decreased with repeated procedures. Conclusion This work proposes a robotic approach to colonoscopy using a haptic feedback algorithm; such a robotic device could effectively perform colonoscopy with reduced burden and comparable safety for patients at a remote site. PMID:27873506
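
    The force/torque reflection described above (slave measurements scaled down by one tenth before being rendered on the master) can be sketched as follows. The 1/10 scale factor comes from the abstract; the saturation limits and function names are illustrative assumptions, since any real haptic device has maximum renderable force and torque.

```python
SCALE = 0.1  # one-tenth reflection, as described in the abstract

def reflect(measured_force_n: float, measured_torque_nm: float,
            force_limit_n: float = 5.0, torque_limit_nm: float = 0.5):
    """Return the (force, torque) commanded to the master haptic device,
    scaled down from the slave-side measurements and clamped to device limits."""
    clamp = lambda x, lim: max(-lim, min(lim, x))
    return (clamp(SCALE * measured_force_n, force_limit_n),
            clamp(SCALE * measured_torque_nm, torque_limit_nm))

# Moderate insertion force and rotation torque measured at the colonoscope:
print(reflect(10.0, 4.0))   # (1.0, 0.4)
# A large constraint force in the colon saturates at the device limits:
print(reflect(80.0, 9.0))   # (5.0, 0.5)
```

    Scaling down keeps the reflected forces comfortable for the operator while preserving their direction and relative magnitude, which is what lets the operator "feel" colonic constraints.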

  17. Improved PMMA single-piece haptic materials

    NASA Astrophysics Data System (ADS)

    Healy, Donald D.; Wilcox, Christopher D.

    1991-12-01

    During the past fifteen years, intraocular lens (IOL) haptic preferences have shifted from a variety of multi-piece haptic materials to single-piece PMMA. This is due in part to the research of David Apple, M.D., and others, who have suggested that all-PMMA implants result in reduced cell flare and better centration. Consequently, single-piece IOLs now represent 45% of all IOL implants. However, many surgeons regard single-piece IOL designs as nonflexible and more difficult to implant than multi-piece IOLs. These handling characteristics have slowed the shift from multi-piece to single-piece IOLs, and because of them single-piece lenses experience relatively high breakage rates during handling before and during insertion. To improve these characteristics, manufacturers have refined single-piece IOL haptic designs by pushing the limits of PMMA's physical properties. Furthermore, IOL manufacturers have begun to alter the material itself to change its physical properties. In particular, two new PMMA materials have emerged in the marketplace: Flexeon™, a crosslinked polymer, and CM™, a material with molecularly realigned PMMA. This paper examines three specific measurements of a haptic's strength and flexibility: tensile strength, plastic memory, and material plasticity/elasticity. The paper compares Flexeon™ and CM™ lenses to noncrosslinked one-piece lenses and standard polypropylene multi-piece lenses.

  18. Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart.

    PubMed

    Kesner, Samuel B; Howe, Robert D

    2011-01-01

    Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body, due to tissue motion and the transmission limitations of catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system.

  19. Palpation imaging using a haptic system for virtual reality applications in medicine.

    PubMed

    Khaled, W; Reichling, S; Bruhns, O T; Boese, H; Baumann, M; Monkman, G; Egersdoerfer, S; Klein, D; Tunayar, A; Freimuth, H; Lorenz, A; Pessavento, A; Ermert, H

    2004-01-01

    In the field of medical diagnosis, there is a strong need to determine the mechanical properties of biological tissue, which are of histological and pathological relevance. Malignant tumors are significantly stiffer than surrounding healthy tissue. One of the established diagnostic procedures is the palpation of body organs and tissue. Palpation is used to measure swelling, detect bone fracture, find and measure pulse, or locate changes in the pathological state of tissue and organs. Current medical practice routinely uses sophisticated diagnostic tests through magnetic resonance imaging (MRI), computed tomography (CT) and ultrasound (US) imaging. However, these cannot provide a direct measure of tissue elasticity. Last year we presented the concept of the first haptic sensor-actuator system to visualize and reconstruct the mechanical properties of tissue using ultrasonic elastography and a haptic display with electrorheological fluids. We developed a real-time strain imaging system for tumor diagnosis. It allows biopsies to be performed simultaneously with conventional ultrasound B-mode and strain imaging investigations. We deduce the relative mechanical properties by using finite element simulations and numerical solution models to solve the inverse problem. Various modifications of the haptic sensor-actuator system have been investigated. This haptic system has the potential of rendering substantial forces in real time, using a compact, lightweight mechanism that can be applied to numerous areas including intraoperative navigation, telemedicine, teaching and telecommunication.

  20. A pervasive visual-haptic framework for virtual delivery training.

    PubMed

    Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V

    2010-03-01

    Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges still remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on the fingers and arms, thus enabling much more realistic manipulation than stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints; instead, it is approached through formal modeling of the whole labor and of the assistance/intervention procedures, performed by means of a timed automata network and applied to a parametrical 3-D model of the anatomy able to mimic a wide range of configurations. This novel methodology not only represents the sequence of the main events associated with either a spontaneous or an operative childbirth process, but also helps validate the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results, as well as of the challenges still unaddressed, is included.

  1. A visual graphic/haptic rendering model for hysteroscopic procedures.

    PubMed

    Lim, Fabian; Brown, Ian; McColl, Ryan; Seligman, Cory; Alsaraira, Amer

    2006-03-01

    Hysteroscopy is an extensively popular option for evaluating and treating women with infertility. The procedure utilises an endoscope, inserted through the vagina and cervix, to examine the intra-uterine cavity via a monitor. The difficulty of hysteroscopy from the surgeon's perspective lies in the visual-spatial perception of interpreting 3D images on a 2D monitor and in the associated psychomotor skills needed to overcome the fulcrum effect. Despite the widespread use of this procedure, currently qualified hysteroscopy surgeons have not been trained in the fundamentals through an organised curriculum. The emergence of virtual reality as an educational tool for this procedure, and for other endoscopic procedures, has undoubtedly raised interest. The ultimate objective is the inclusion of virtual reality training as a mandatory component of gynaecologic endoscopy training. Part of this process involves the design of a simulator encompassing the technical difficulties and complications associated with the procedure. The proposed research examines fundamental hysteroscopy factors and current training and accreditation, and proposes a hysteroscopic simulator design that is suitable for education and training.

  2. 77 FR 15390 - Certain Mobile Electronic Devices Incorporating Haptics; Receipt of Amended Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-15

    ... INTERNATIONAL TRADE COMMISSION [DN 2875] Certain Mobile Electronic Devices Incorporating Haptics.... International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the U.S. International Trade Commission has received an amended complaint entitled Certain Mobile Electronic Devices...

  3. Multimedia Principle in Teaching Lessons

    ERIC Educational Resources Information Center

    Kari Jabbour, Khayrazad

    2012-01-01

    The multimedia learning principle holds that we learn by creating mental representations that combine text and relevant graphics in lessons. This article discusses the learning advantages that result from applying the multimedia principle to instruction, and how to select graphics that support learning. There is a balance that instructional designers…

  4. Force and torque modelling of drilling simulation for orthopaedic surgery.

    PubMed

    MacAvelia, Troy; Ghasempoor, Ahmad; Janabi-Sharifi, Farrokh

    2014-01-01

    The advent of haptic simulation systems for orthopaedic surgery procedures has provided surgeons with an excellent tool for training and preoperative planning purposes. This is especially true for procedures involving the drilling of bone, which require a great amount of adroitness and experience due to difficulties arising from vibration and drill bit breakage. One of the potential difficulties with the drilling of bone is the lack of consistent material evacuation from the drill's flutes as the material tends to clog. This clogging leads to significant increases in force and torque experienced by the surgeon. Clogging was observed for feed rates greater than 0.5 mm/s and spindle speeds less than 2500 rpm. The drilling simulation systems that have been created to date do not address the issue of drill flute clogging. This paper presents force and torque prediction models that account for this phenomenon. The two coefficients of friction required by these models were determined via a set of calibration experiments. The accuracy of both models was evaluated by an additional set of validation experiments resulting in average R² regression correlation values of 0.9546 and 0.9209 for the force and torque prediction models, respectively. The resulting models can be adopted by haptic simulation systems to provide a more realistic tactile output.
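
    The reported model accuracy (average R² of 0.9546 and 0.9209) is a standard coefficient of determination between measured and predicted signals. A generic sketch of that computation, with made-up thrust-force samples rather than the paper's data:

```python
def r_squared(measured, predicted):
    """Coefficient of determination between measured and model-predicted values."""
    mean = sum(measured) / len(measured)
    ss_tot = sum((m - mean) ** 2 for m in measured)                  # total sum of squares
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))  # residual sum of squares
    return 1.0 - ss_res / ss_tot

# Illustrative thrust-force samples in newtons (values invented for the example).
measured = [10.0, 12.5, 15.2, 18.1]
predicted = [10.3, 12.1, 15.6, 17.8]
print(round(r_squared(measured, predicted), 3))  # → 0.986
```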

  5. Mechanically Compliant Electronic Materials for Wearable Photovoltaics and Human-Machine Interfaces

    NASA Astrophysics Data System (ADS)

    O'Connor, Timothy Francis, III

    Applications of stretchable electronic materials for human-machine interfaces are described herein. Intrinsically stretchable organic conjugated polymers and stretchable electronic composites were used to develop stretchable organic photovoltaics (OPVs), mechanically robust wearable OPVs, and human-machine interfaces for gesture recognition, American Sign Language translation, haptic control of robots, and touch emulation for virtual reality, augmented reality, and the transmission of touch. The stretchable and wearable OPVs comprise active layers of poly-3-alkylthiophene:phenyl-C61-butyric acid methyl ester (P3AT:PCBM) and transparent conductive electrodes of poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS); the devices could be fabricated only through a deep understanding of the connection between molecular structure and the co-engineering of electronic performance with mechanical resilience. The talk concludes with the use of composite piezoresistive sensors in two smart glove prototypes. The first integrates stretchable strain sensors comprising a carbon-elastomer composite, a wearable microcontroller, low-energy Bluetooth, and a 6-axis accelerometer/gyroscope to construct a fully functional gesture recognition glove capable of wirelessly translating American Sign Language to text on a cell phone screen. The second creates a system for the haptic control of a 3D-printed robot arm, as well as the transmission of touch and temperature information.

  6. Cranial implant design using augmented reality immersive system.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2007-01-01

    Software tools that utilize haptics for sculpting precise fitting cranial implants are utilized in an augmented reality immersive system to create a virtual working environment for the modelers. The virtual environment is designed to mimic the traditional working environment as closely as possible, providing more functionality for the users. The implant design process uses patient CT data of a defective area. This volumetric data is displayed in an implant modeling tele-immersive augmented reality system where the modeler can build a patient specific implant that precisely fits the defect. To mimic the traditional sculpting workspace, the implant modeling augmented reality system includes stereo vision, viewer centered perspective, sense of touch, and collaboration. To achieve optimized performance, this system includes a dual-processor PC, fast volume rendering with three-dimensional texture mapping, the fast haptic rendering algorithm, and a multi-threading architecture. The system replaces the expensive and time consuming traditional sculpting steps such as physical sculpting, mold making, and defect stereolithography. This augmented reality system is part of a comprehensive tele-immersive system that includes a conference-room-sized system for tele-immersive small group consultation and an inexpensive, easily deployable networked desktop virtual reality system for surgical consultation, evaluation and collaboration. This system has been used to design patient-specific cranial implants with precise fit.

  7. Temporal bone dissection simulator for training pediatric otolaryngology surgeons

    NASA Astrophysics Data System (ADS)

    Tabrizi, Pooneh R.; Sang, Hongqiang; Talari, Hadi F.; Preciado, Diego; Monfaredi, Reza; Reilly, Brian; Arikatla, Sreekanth; Enquobahrie, Andinet; Cleary, Kevin

    2017-03-01

    Cochlear implantation is the standard of care for infants born with severe hearing loss. Current guidelines approve the surgical placement of implants as early as 12 months of age. Implantation at a younger age poses a greater surgical challenge since the underdeveloped mastoid tip, along with thin calvarial bone, creates less room for surgical navigation and can result in increased surgical risk. We have been developing a temporal bone dissection simulator based on actual clinical cases for training otolaryngology fellows in this delicate procedure. The simulator system is based on pre-procedure CT (Computed Tomography) images from pediatric infant cases (<12 months old) at our hospital. The simulator includes: (1) simulation engine to provide the virtual reality of the temporal bone surgery environment, (2) a newly developed haptic interface for holding the surgical drill, (3) an Oculus Rift to provide a microscopic-like view of the temporal bone surgery, and (4) user interface to interact with the simulator through the Oculus Rift and the haptic device. To evaluate the system, we have collected 10 representative CT data sets and segmented the key structures: cochlea, round window, facial nerve, and ossicles. The simulator will present these key structures to the user and warn the user if needed by continuously calculating the distances between the tip of surgical drill and the key structures.
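
    The proximity warning described above reduces, at its core, to a nearest-point query between the drill tip and the segmented structures. A minimal sketch (landmark coordinates and the 2 mm safety margin are invented, not taken from the CT data sets):

```python
import math

SAFETY_MARGIN_MM = 2.0  # assumed warning threshold, not the simulator's actual value

def nearest_structure(tip, structures):
    """Return (name, distance) of the structure point closest to the drill tip."""
    best_name, best_dist = None, float("inf")
    for name, points in structures.items():
        for p in points:
            d = math.dist(tip, p)  # Euclidean distance (Python 3.8+)
            if d < best_dist:
                best_name, best_dist = name, d
    return best_name, best_dist

# Illustrative landmark points in millimetres.
structures = {
    "facial_nerve": [(10.0, 4.0, 2.0), (11.0, 5.0, 2.5)],
    "cochlea": [(20.0, 8.0, 6.0)],
}
name, d = nearest_structure((10.5, 4.2, 2.1), structures)
if d < SAFETY_MARGIN_MM:
    print(f"warning: {d:.2f} mm from {name}")
```

    A real simulator would query dense surface meshes (e.g. via a spatial index) on every haptic frame rather than a handful of points.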

  8. Creating Objects and Object Categories for Studying Perception and Perceptual Learning

    PubMed Central

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-01-01

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties1. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties2. Many innovative and useful methods currently exist for creating novel objects and object categories3-6 (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter5,9,10, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects11-13. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis14. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection9,12,13. 
Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics15,16. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects9,13. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis. PMID:23149420

  9. Effects of visual information regarding allocentric processing in haptic parallelity matching.

    PubMed

    Van Mier, Hanneke I

    2013-10-01

    Research has revealed that haptic perception of parallelity deviates from physical reality. Large and systematic deviations have been found in haptic parallelity matching most likely due to the influence of the hand-centered egocentric reference frame. Providing information that increases the influence of allocentric processing has been shown to improve performance on haptic matching. In this study allocentric processing was stimulated by providing informative vision in haptic matching tasks that were performed using hand- and arm-centered reference frames. Twenty blindfolded participants (ten men, ten women) explored the orientation of a reference bar with the non-dominant hand and subsequently matched (task HP) or mirrored (task HM) its orientation on a test bar with the dominant hand. Visual information was provided by means of informative vision with participants having full view of the test bar, while the reference bar was blocked from their view (task VHP). To decrease the egocentric bias of the hands, participants also performed a visual haptic parallelity drawing task (task VHPD) using an arm-centered reference frame, by drawing the orientation of the reference bar. In all tasks, the distance between and orientation of the bars were manipulated. A significant effect of task was found; performance improved from task HP, to VHP to VHPD, and HM. Significant effects of distance were found in the first three tasks, whereas orientation and gender effects were only significant in tasks HP and VHP. The results showed that stimulating allocentric processing by means of informative vision and reducing the egocentric bias by using an arm-centered reference frame led to most accurate performance on parallelity matching. © 2013 Elsevier B.V. All rights reserved.

  10. Robot-Assisted Proprioceptive Training with Added Vibro-Tactile Feedback Enhances Somatosensory and Motor Performance.

    PubMed

    Cuppone, Anna Vera; Squeri, Valentina; Semprini, Marianna; Masia, Lorenzo; Konczak, Jürgen

    2016-01-01

    This study examined the trainability of the proprioceptive sense and explored the relationship between proprioception and motor learning. With vision blocked, human learners had to perform goal-directed wrist movements relying solely on proprioceptive/haptic cues to reach several haptically specified targets. One group received additional somatosensory movement error feedback in the form of vibro-tactile cues applied to the skin of the forearm. We used a haptic robotic device for the wrist and implemented a 3-day training regimen that required learners to make spatially precise goal-directed wrist reaching movements without vision. We assessed whether training improved the acuity of the wrist joint position sense. In addition, we checked whether sensory learning generalized to the motor domain and improved the spatial precision of wrist tracking movements that were not trained. The main findings of the study are: First, proprioceptive acuity of the wrist joint position sense improved after training for the group that received the combined proprioceptive/haptic and vibro-tactile feedback (VTF). Second, training had no impact on the spatial accuracy of the untrained tracking task. However, learners who had received VTF significantly reduced their reliance on haptic guidance feedback when performing the untrained motor task. That is, concurrent VTF was highly salient movement feedback and obviated the need for haptic feedback. Third, VTF can also be provided by the limb not involved in the task: learners who received VTF to the contralateral limb benefitted equally. In conclusion, somatosensory training can significantly enhance proprioceptive acuity within days when learning is coupled with vibro-tactile sensory cues that provide feedback about movement errors. The observable sensory improvements in proprioception facilitate motor learning, and such learning may generalize to the sensorimotor control of untrained motor tasks.
The implications of these findings for neurorehabilitation are discussed.

  11. Film Boxes.

    ERIC Educational Resources Information Center

    Osterer, Irv

    2002-01-01

    Presents an art lesson in which students created three-dimensional designs for 35mm film packages to improve graphic arts learning. Describes how the students examined and created film boxes using QuarkXPress software. (CMK)

  12. A powerful graphical pulse sequence programming tool for magnetic resonance imaging.

    PubMed

    Jie, Shen; Ying, Liu; Jianqi, Li; Gengying, Li

    2005-12-01

    A powerful graphical pulse sequence programming tool has been designed for creating magnetic resonance imaging (MRI) applications. It allows rapid development of pulse sequences in graphical mode (allowing for the visualization of sequences) and consists of three modules: a graphical sequence editor, a parameter management module and a sequence compiler. Its key features are ease of use, flexibility and hardware independence. When graphic elements are combined with certain text expressions, graphical pulse sequence programming is as flexible as a text-based programming tool. In addition, hardware independence is achieved through a two-step compilation strategy. To demonstrate the flexibility and capability of this graphical sequence programming tool, a multi-slice fast spin echo experiment was performed on our home-made 0.3 T permanent-magnet MRI system.

  13. Integrating Commercial Off-The-Shelf (COTS) graphics and extended memory packages with CLIPS

    NASA Technical Reports Server (NTRS)

    Callegari, Andres C.

    1990-01-01

    This paper addresses the question of how to mix CLIPS with graphics and how to overcome the PC's memory limitations by using the extended memory available in the computer. By adding graphics and extended memory capabilities, CLIPS can be converted into a complete and powerful system development tool on the most economical and popular computer platform. New models of PCs have processing capabilities and graphics resolutions that cannot be ignored and should be used to the fullest. CLIPS is a powerful expert system development tool, but it cannot be complete without the support of a graphics package needed to create user interfaces and general-purpose graphics, or without enough memory to handle large knowledge bases. A well-known limitation on PCs is the use of real memory, which restricts CLIPS to only 640 KB; that problem can be solved by developing a version of CLIPS that uses extended memory. The user then has access to up to 16 MB of memory on 80286-based computers and practically all the available memory (4 GB) on computers that use the 80386 processor. Given a self-configuring graphics package that automatically detects the graphics hardware and pointing device present in the computer, together with the availability of the computer's extended memory (with no special hardware needed), the user will be able to create more powerful systems at a fraction of the cost on the most popular, portable, and economical platform available: the PC.

  14. ITOS to EDGE "Bridge" Software for Morpheus Lunar/Martian Vehicle

    NASA Technical Reports Server (NTRS)

    Hirsh, Robert; Fuchs, Jordan

    2012-01-01

    My project involved improving upon existing software and writing new software for the Project Morpheus team. Specifically, I created and updated Integrated Test and Operations System (ITOS) user interfaces for on-board interaction with the vehicle during archive playback as well as live streaming data. These interfaces are an integral part of the testing and operations for the Morpheus vehicle, providing any and all information from the vehicle to evaluate instruments and ensure coherence and control of the vehicle during Morpheus missions. I also created a "bridge" program for interfacing "live" telemetry data with the Engineering DOUG Graphics Engine (EDGE) software for a graphical (standalone or VR dome) view of live Morpheus flights or archive replays, providing a graphical representation of vehicle flight and movement during subsequent tests and in real missions.

  15. Photojournal Home Page Graphic 2007

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image is an unannotated version of the Photojournal Home Page graphic released in October 2007. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  16. OpenSesame: an open-source, graphical experiment builder for the social sciences.

    PubMed

    Mathôt, Sebastiaan; Schreij, Daniel; Theeuwes, Jan

    2012-06-01

    In the present article, we introduce OpenSesame, a graphical experiment builder for the social sciences. OpenSesame is free, open-source, and cross-platform. It features a comprehensive and intuitive graphical user interface and supports Python scripting for complex tasks. Additional functionality, such as support for eyetrackers, input devices, and video playback, is available through plug-ins. OpenSesame can be used in combination with existing software for creating experiments.

  17. Using haptic feedback to increase seat belt use of service vehicle drivers.

    DOT National Transportation Integrated Search

    2011-01-01

    This study pilot-tested a new application of a technology-based intervention to increase seat belt use. The technology was based on a : contingency in which unbelted drivers experienced sustained haptic feedback to the gas pedal when they exceeded 25...

  18. Haptic identification of objects and their depictions.

    PubMed

    Klatzky, R L; Loomis, J M; Lederman, S J; Wake, H; Fujita, N

    1993-08-01

    Haptic identification of real objects is superior to that of raised two-dimensional (2-D) depictions. Three explanations of real-object superiority were investigated: contribution of material information, contribution of 3-D shape and size, and greater potential for integration across the fingers. In Experiment 1, subjects, while wearing gloves that gently attenuated material information, haptically identified real objects that provided reduced cues to compliance, mass, and part motion. The gloves permitted exploration with free hand movement, a single outstretched finger, or five outstretched fingers. Performance decreased over these three conditions but was superior to identification of pictures of the same objects in all cases, indicating the contribution of 3-D structure and integration across the fingers. Picture performance was also better with five fingers than with one. In Experiment 2, the subjects wore open-fingered gloves, which provided them with material information. Consequently, the effect of type of exploration was substantially reduced but not eliminated. Material compensates somewhat for limited access to object structure but is not the primary basis for haptic object identification.

  19. When vision is not an option: children's integration of auditory and haptic information is suboptimal.

    PubMed

    Petrini, Karin; Remark, Alicia; Smith, Louise; Nardini, Marko

    2014-05-01

    When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g. an extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory information from childhood. In the first experiment, adults and children aged 5 to 11 years judged the relative sizes of two objects in auditory, haptic, and non-conflicting bimodal conditions. In the second experiment, different groups of adults and children were tested in non-conflicting and conflicting bimodal conditions. In the first experiment, adults reduced sensory uncertainty by integrating the cues optimally, while children did not. In the second, adults and children used similar weighting strategies to solve audio-haptic conflict. These results suggest that, in the absence of visual information, optimal integration of cues for discrimination of object size develops late in childhood. © 2014 The Authors. Developmental Science Published by John Wiley & Sons Ltd.
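
    Optimal (maximum-likelihood) cue integration, against which the children's performance is compared, weights each cue by its reliability (inverse variance). A textbook sketch, not the authors' analysis code; the size and noise values are invented:

```python
def integrate_cues(estimates, sigmas):
    """Reliability-weighted combination of independent cues.

    Each cue i gets weight 1/sigma_i**2; the combined variance is smaller
    than that of either cue alone, which is the signature of optimal fusion.
    """
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    combined = sum(w * x for w, x in zip(weights, estimates)) / total
    combined_var = 1.0 / total
    return combined, combined_var

# Haptic size estimate 5.0 cm (sigma 0.5) and auditory estimate 5.6 cm (sigma 1.0):
size, var = integrate_cues([5.0, 5.6], [0.5, 1.0])
print(round(size, 2), var)  # → 5.12 0.2
```

    The combined estimate sits closer to the more reliable haptic cue, and its variance (0.2) is below the best single-cue variance (0.25), which is exactly the benefit adults obtain and children in this study did not.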

  20. Functional Equivalence of Spatial Images from Touch and Vision: Evidence from Spatial Updating in Blind and Sighted Individuals

    PubMed Central

    Giudice, Nicholas A.; Betty, Maryann R.; Loomis, Jack M.

    2012-01-01

    This research examines whether visual and haptic map learning yield functionally equivalent spatial images in working memory, as evidenced by similar encoding bias and updating performance. In three experiments, participants learned four-point routes either by seeing or feeling the maps. At test, blindfolded participants made spatial judgments about the maps from imagined perspectives that were either aligned or misaligned with the maps as represented in working memory. Results from Experiments 1 and 2 revealed a highly similar pattern of latencies and errors between visual and haptic conditions. These findings extend the well known alignment biases for visual map learning to haptic map learning, provide further evidence of haptic updating, and most importantly, show that learning from the two modalities yields very similar performance across all conditions. Experiment 3 found the same encoding biases and updating performance with blind individuals, demonstrating that functional equivalence cannot be due to visual recoding and is consistent with an amodal hypothesis of spatial images. PMID:21299331

  1. Design and Calibration of a New 6 DOF Haptic Device

    PubMed Central

    Qin, Huanhuan; Song, Aiguo; Liu, Yuqing; Jiang, Guohua; Zhou, Bohe

    2015-01-01

    For many applications, such as tele-operated robots and interaction with virtual environments, performance is better with force feedback than without. Haptic devices are force-reflecting interfaces; they can also track human hand positions simultaneously. A new 6 DOF (degree-of-freedom) haptic device was designed and calibrated in this study. It mainly contains a double parallel linkage, a rhombus linkage, a rotating mechanical structure and a grasping interface. Benefiting from this unique design, it is a hybrid-structure device with a large workspace and high output capability, and is therefore capable of multi-finger interactions. Moreover, with an adjustable base, operators can change to different postures without interrupting haptic tasks. To investigate the performance regarding position tracking accuracy and static output forces, we conducted experiments on a three-dimensional electric sliding platform and a digital force gauge, respectively. Displacement errors and force errors were calculated and analyzed. To identify the capability and potential of the device, four application examples were programmed. PMID:26690449

  2. A haptic pedal for surgery assistance.

    PubMed

    Díaz, Iñaki; Gil, Jorge Juan; Louredo, Marcos

    2014-09-01

    The research and development of mechatronic aids for surgery is a persistent challenge in the field of robotic surgery. This paper presents a new haptic pedal conceived to assist surgeons in the operating room by transmitting real-time surgical information through the foot. An effective human-robot interaction system for medical practice must exchange appropriate information with the operator as quickly and accurately as possible. Moreover, information must flow through the appropriate sensory modalities for a natural and simple interaction. However, users of current robotic systems might experience cognitive overload and be increasingly overwhelmed by data streams from multiple modalities. A new haptic channel is thus explored to complement and improve existing systems. A preliminary set of experiments has been carried out to evaluate the performance of the proposed system in a virtual surgical drilling task. The results of the experiments show the effectiveness of the haptic pedal in providing surgical information through the foot. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  3. Sensory subtraction in robot-assisted surgery: fingertip skin deformation feedback to ensure safety and improve transparency in bimanual haptic interaction.

    PubMed

    Meli, Leonardo; Pacchierotti, Claudio; Prattichizzo, Domenico

    2014-04-01

    This study presents a novel approach to force feedback in robot-assisted surgery. It consists of substituting haptic stimuli, composed of a kinesthetic component and a skin deformation, with cutaneous stimuli only. The force generated can then be thought of as the complete haptic interaction (cutaneous plus kinesthetic) minus its kinesthetic part; for this reason, we refer to this approach as sensory subtraction. Sensory subtraction aims to outperform other non-kinesthetic feedback techniques in teleoperation (e.g., sensory substitution) while guaranteeing the stability and safety of the system. We tested the proposed approach in a challenging 7-DoF bimanual teleoperation task, similar to the Pegboard experiment of the da Vinci Skills Simulator. Sensory subtraction showed improved performance in terms of completion time, force exerted, and total displacement of the rings with respect to two popular sensory substitution techniques. Moreover, it guaranteed a stable interaction in the presence of a communication delay in the haptic loop.

  4. Influence of surgical gloves on haptic perception thresholds.

    PubMed

    Hatzfeld, Christian; Dorsch, Sarah; Neupert, Carsten; Kupnik, Mario

    2018-02-01

    Impairment of haptic perception by surgical gloves could reduce requirements on haptic systems for surgery. While grip forces and manipulation capabilities were not impaired in previous studies, no data is available for perception thresholds. Absolute and differential thresholds (20 dB above threshold) of 24 subjects were measured for frequencies of 25 and 250 Hz with a Ψ-method. Effects of wearing a surgical glove, moisture on the contact surface and subject's experience with gloves were incorporated in a full-factorial experimental design. Absolute thresholds of 12.8 dB and -29.6 dB (means for 25 and 250 Hz, respectively) and differential thresholds of -12.6 dB and -9.5 dB agree with previous studies. A relevant effect of the frequency on absolute thresholds was found. Comparisons of glove- and no-glove-conditions did not reveal a significant mean difference. Wearing a single surgical glove does not affect absolute and differential haptic perception thresholds. Copyright © 2017 John Wiley & Sons, Ltd.
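
    The dB threshold values reported above presuppose a reference amplitude; the conversion between amplitude and level is 20·log10(a/a_ref). A sketch assuming a 1 µm peak-displacement reference (the record does not state the actual reference used in the study):

```python
import math

REF_UM = 1.0  # assumed reference: 1 micrometre peak displacement

def amplitude_to_db(amp_um, ref=REF_UM):
    """Displacement amplitude (um) to level in dB re the reference."""
    return 20.0 * math.log10(amp_um / ref)

def db_to_amplitude(level_db, ref=REF_UM):
    """Level in dB re the reference back to displacement amplitude (um)."""
    return ref * 10.0 ** (level_db / 20.0)

# Under this assumed reference, the 250 Hz absolute threshold of -29.6 dB
# corresponds to roughly 0.033 um of displacement.
print(round(db_to_amplitude(-29.6), 3))  # → 0.033
```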

  5. A Three-Axis Force Sensor for Dual Finger Haptic Interfaces

    PubMed Central

    Fontana, Marco; Marcheschi, Simone; Salsedo, Fabio; Bergamasco, Massimo

    2012-01-01

    In this work we present the design process, the characterization and testing of a novel three-axis mechanical force sensor. This sensor is optimized for use in closed-loop force control of haptic devices with three degrees of freedom. In particular the sensor has been conceived for integration with a dual finger haptic interface that aims at simulating forces that occur during grasping and surface exploration. The sensing spring structure has been purposely designed in order to match force and layout specifications for the application. In this paper the design of the sensor is presented, starting from an analytic model that describes the characteristic matrix of the sensor. A procedure for designing an optimal overload protection mechanism is proposed. In the last part of the paper the authors describe the experimental characterization and the integrated test on a haptic hand exoskeleton showing the improvements in the controller performances provided by the inclusion of the force sensor. PMID:23202012
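
    The characteristic matrix mentioned above maps raw gauge signals to a force vector; once calibrated, readout is a matrix-vector product. A minimal sketch with an invented 3×3 matrix (not the paper's calibration values):

```python
# Hypothetical characteristic matrix: rows map three gauge voltages (V) to
# forces (N) along x, y, z. Off-diagonal terms model cross-axis coupling.
C = [
    [12.0,  0.3, -0.1],
    [ 0.2, 11.5,  0.4],
    [-0.3,  0.1, 15.0],
]

def voltages_to_force(c, v):
    """Force vector F = C * v (plain matrix-vector product)."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in c]

fx, fy, fz = voltages_to_force(C, [0.10, 0.05, 0.02])
```

    In practice, C itself is estimated by least squares from known applied loads during calibration experiments like those the abstract describes.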

  6. Development of a novel haptic glove for improving finger dexterity in poststroke rehabilitation.

    PubMed

    Lin, Chi-Ying; Tsai, Chia-Min; Shih, Pei-Cheng; Wu, Hsiao-Ching

    2015-01-01

    Almost all stroke patients experience a certain degree of fine motor impairment, and impeded finger movement may limit activities of daily life. Thus, to improve the quality of life of stroke patients, designing an efficient training device for fine motor rehabilitation is crucial. This study aimed to develop a novel fine motor training glove that integrates a virtual-reality-based interactive environment with vibrotactile feedback for more effective poststroke hand rehabilitation. The proposed haptic rehabilitation device is equipped with small DC vibration motors for vibrotactile feedback stimulation and piezoresistive thin-film force sensors for motor function evaluation. Two virtual-reality-based games, "gopher hitting" and "musical note hitting", were developed as a haptic interface. Following the designed rehabilitation program, patients intuitively push and exercise their fingers to improve finger isolation function. Preliminary tests were conducted to assess the feasibility of the developed haptic rehabilitation system and to identify design concerns regarding practical use in future clinical testing.

  7. The Effect of Visual Experience on Perceived Haptic Verticality When Tilted in the Roll Plane

    PubMed Central

    Cuturi, Luigi F.; Gori, Monica

    2017-01-01

    The orientation of the body in space can influence the perception of verticality, sometimes leading to biases consistent with priors peaked at the most common head and body orientation, that is, upright. In this study, we investigated haptic perception of verticality in sighted individuals and in early and late blind adults tilted counterclockwise in the roll plane. Participants performed a stimulus orientation discrimination task with their body tilted 90° toward their left ear side relative to gravity. Stimuli were presented using a motorized haptic bar. To test whether different reference frames relative to the head influenced perception of verticality, we varied the position of the stimulus along the body's longitudinal axis. Depending on the stimulus position, sighted participants tended to show biases away from or toward their body tilt. Visually impaired individuals instead showed a different pattern of verticality estimates. A bias toward head and body tilt (i.e., the Aubert effect) was observed in late blind individuals. Interestingly, no strong biases were observed in early blind individuals. Overall, these results posit visual sensory information as fundamental in shaping the haptic readout of proprioceptive and vestibular information about body orientation relative to gravity. The acquisition of an idiotropic vector signaling the upright might take place through vision during development. In early blind individuals, independent spatial navigation experience, likely enhanced by echolocation behavior, might play a role in such acquisition. Participants with late-onset blindness, whose early experience included vision, might anchor their visually acquired priors to the haptic modality without disambiguating between head and body references, as observed in sighted individuals (Fraser et al., 2015). With this study, we aimed to investigate haptic perception of the direction of gravity at unusual body tilts when vision is absent due to visual impairment. Our findings thus shed light on the influence of proprioceptive/vestibular sensory information on haptically perceived verticality in blind individuals, showing how this phenomenon is affected by visual experience. PMID:29270109

  8. Does the Integration of Haptic and Visual Cues Reduce the Effect of a Biased Visual Reference Frame on the Subjective Head Orientation?

    PubMed Central

    Gueguen, Marc; Vuillerme, Nicolas; Isableu, Brice

    2012-01-01

    Background The selection of appropriate frames of reference (FOR) is a key factor in the elaboration of spatial perception and the production of robust interaction with our environment. The extent to which we perceive the head axis orientation (subjective head orientation, SHO) with both accuracy and precision likely contributes to the efficiency of these spatial interactions. A first goal of this study was to investigate the relative contribution of both the visual and egocentric FOR (centre-of-mass) to SHO processing. A second goal was to investigate humans' ability to process SHO in various sensory response modalities (visual, haptic and visuo-haptic), and the way these modalities modify reliance on either the visual or egocentric FOR. A third goal was to question whether subjects combined visual and haptic cues optimally to increase SHO certainty and to decrease the FOR disruption effect. Methodology/Principal Findings Thirteen subjects were asked to indicate their SHO while the visual and/or egocentric FOR was deviated. Four results emerged from our study. First, visual rod settings to SHO were altered by the tilted visual frame but not by the egocentric FOR alteration, whereas no alteration of haptic settings was observed under either the egocentric FOR alteration or the tilted visual frame. These results are modulated by individual analysis. Second, visual and egocentric FOR dependency appear to be negatively correlated. Third, enriching the response modality appears to improve SHO. Fourth, several combination rules for the visuo-haptic cues, such as Maximum Likelihood Estimation (MLE), Winner-Take-All (WTA) or the Unweighted Mean (UWM), seem to account for SHO improvements. However, the UWM rule seems to best account for the improvement of visuo-haptic estimates, especially in situations with high FOR incongruence. Finally, the data also indicated that FOR reliance resulted from the application of the UWM rule. This was observed particularly in visual-dependent subjects. Conclusions Taken together, these findings emphasize the importance of identifying individual spatial FOR preferences to assess the efficiency of our interaction with the environment whilst performing spatial tasks. PMID:22509295
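    The three combination rules named in this abstract can be illustrated with a small numeric sketch (hypothetical estimates and variances, not the study's data): MLE weights each cue by its inverse variance, WTA keeps only the most reliable cue, and UWM averages the cues regardless of reliability.

```python
import numpy as np

def mle(est, var):
    # Maximum Likelihood Estimation: inverse-variance weighted average
    w = 1.0 / np.asarray(var, dtype=float)
    return float(np.dot(w / w.sum(), est))

def wta(est, var):
    # Winner-Take-All: keep the single most reliable (lowest-variance) cue
    return float(est[int(np.argmin(var))])

def uwm(est):
    # Unweighted Mean: simple average, ignoring cue reliability
    return float(np.mean(est))

# Hypothetical head-orientation estimates (deg) and their variances
visual, haptic = 4.0, -2.0   # cue estimates
var_v, var_h = 1.0, 4.0      # the visual cue is more reliable here

print(round(mle([visual, haptic], [var_v, var_h]), 3))  # 2.8
print(round(wta([visual, haptic], [var_v, var_h]), 3))  # 4.0
print(round(uwm([visual, haptic]), 3))                  # 1.0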

  9. Haptic-assistive technologies for audition and vision sensory disabilities.

    PubMed

    Sorgini, Francesca; Caliò, Renato; Carrozza, Maria Chiara; Oddo, Calogero Maria

    2018-05-01

    The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf-blind individuals. The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The database search for scientific publications was accompanied by a web search for commercial devices. Results were classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses were also carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory-disabled persons. The reviewed literature provides evidence that sensory substitution aids can partly mitigate the deficits in language learning, communication and navigation for deaf, blind and deaf-blind individuals, and that the tactile sense can serve as a communication channel for conveying information to sensory-disabled individuals. A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move toward miniaturized, custom-designed and low-cost haptic interfaces and integration with personal devices such as smartphones for wider diffusion of sensory aids among disabled users. Implications for rehabilitation Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. Sensory substitution systems for visual and hearing disabilities play a central role in the transmission of information for patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. Visual and auditory inputs are converted into haptic feedback via different actuation technologies. The information is presented in the form of static or dynamic stimulation of the skin. 
Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability. They constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems. Future research is oriented toward optimization of the stimulation parameters, together with the development of miniaturized, custom-designed and low-cost aids operating in synergy in networks, aiming to increase patients' acceptance of these technologies.

  10. Learning from vision-to-touch is different than learning from touch-to-vision.

    PubMed

    Wismeijer, Dagmar A; Gegenfurtner, Karl R; Drewing, Knut

    2012-01-01

    We studied whether vision can teach touch to the same extent as touch seems to teach vision. In a 2 × 2 between-participants learning study, we artificially correlated visual gloss cues with haptic compliance cues. In two "natural" tasks, we tested whether visual gloss estimations influence haptic estimations of softness and vice versa. In two "novel" tasks, in which participants were either asked to haptically judge glossiness or to visually judge softness, we investigated how perceptual estimates transfer from one sense to the other. Our results showed that vision does not teach touch as efficiently as touch seems to teach vision.

  11. Fiber optical sensor system for shape and haptics for flexible instruments in minimally invasive surgery: overview and status quo

    NASA Astrophysics Data System (ADS)

    Ledermann, Christoph; Pauer, Hendrikje; Woern, Heinz

    2014-05-01

    In minimally invasive surgery, flexible mechatronic instruments promise to improve the overall performance of surgical interventions. However, those instruments require highly developed sensors in order to provide haptic feedback to the surgeon or to enable (semi-)autonomous tasks. Precisely, haptic sensors and a shape sensor are required. In this paper, we present our fiber optical sensor system of Fiber Bragg Gratings, which consists of a shape sensor, a kinesthetic sensor and a tactile sensor. The status quo of each of the three sensors is described, as well as the concept to integrate them into one fiber optical sensor system.

  12. Sensorimotor enhancement with a mixed reality system for balance and mobility rehabilitation.

    PubMed

    Fung, Joyce; Perez, Claire F

    2011-01-01

    We have developed a mixed reality system incorporating virtual reality (VR), surface perturbations and light touch for gait rehabilitation. Haptic touch has emerged as a novel and efficient technique to improve postural control and dynamic stability. Our system combines visual display with the manipulation of physical environments and the addition of haptic feedback to enhance balance and mobility post stroke. A research study involving 9 participants with stroke and 9 age-matched healthy individuals showed that the haptic cue provided while walking is an effective means of improving gait stability in people post stroke, especially during challenging environmental conditions such as downslope walking.

  13. 78 FR 23593 - Certain Mobile Electronic Devices Incorporating Haptics; Termination of Investigation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-19

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-834] Certain Mobile Electronic Devices... this investigation may be viewed on the Commission's electronic docket (EDIS) at http://edis.usitc.gov... mobile electronic devices incorporating haptics that infringe certain claims of six Immersion patents. 77...

  14. Introduction to haptics for neurosurgeons.

    PubMed

    L'Orsa, Rachael; Macnab, Chris J B; Tavakoli, Mahdi

    2013-01-01

    Robots are becoming increasingly relevant to neurosurgeons, extending a neurosurgeon's physical capabilities, improving navigation within the surgical landscape when combined with advanced imaging, and propelling the movement toward minimally invasive surgery. Most surgical robots, however, isolate surgeons from the full range of human senses during a procedure. This forces surgeons to rely on vision alone for guidance through the surgical corridor, which limits the capabilities of the system, requires significant operator training, and increases the surgeon's workload. Incorporating haptics into these systems, ie, enabling the surgeon to "feel" forces experienced by the tool tip of the robot, could render these limitations obsolete by making the robot feel more like an extension of the surgeon's own body. Although the use of haptics in neurosurgical robots is still mostly the domain of research, neurosurgeons who keep abreast of this emerging field will be more prepared to take advantage of it as it becomes more prevalent in operating theaters. Thus, this article serves as an introduction to the field of haptics for neurosurgeons. We not only outline the current and future benefits of haptics but also introduce concepts in the fields of robotic technology and computer control. This knowledge will allow readers to be better aware of limitations in the technology that can affect performance and surgical outcomes, and "knowing the right questions to ask" will be invaluable for surgeons who have purchasing power within their departments.

  15. On the design of a miniature haptic ring for cutaneous force feedback using shape memory alloy actuators

    NASA Astrophysics Data System (ADS)

    Hwang, Donghyun; Lee, Jaemin; Kim, Keehoon

    2017-10-01

    This paper proposes a miniature haptic ring that can display touch/pressure and shearing force to the user's fingerpad. For practical use and wider application of the device, it is developed with the aim of achieving high wearability and mobility/portability as well as cutaneous force feedback functionality. The main body of the device is designed as a ring-shaped lightweight structure with a simple driving mechanism, and thin shape memory alloy (SMA) wires with high energy density are applied as actuating elements. In addition, based on a band-type wireless control unit including a wireless data communication module, the whole device can be realized as a wearable mobile haptic device system. These features give the device several advantages in functional performance and provide users with significant usability. In this work, the proposed miniature haptic ring is systematically designed, and its performance is experimentally evaluated with a fabricated functional prototype. The experimental results clearly demonstrate that the proposed device exhibits a higher force-to-weight ratio than conventional finger-wearable haptic devices for cutaneous force feedback. They also show that the operational performance of the device is strongly influenced by the electro-thermomechanical behavior of the SMA actuator. In addition to the experiments for performance evaluation, we conducted a preliminary user test to assess practical feasibility and usability based on users' qualitative feedback.

  16. Vibrotactile perception assessment for a haptic interface on an antigravity suit.

    PubMed

    Ko, Sang Min; Lee, Kwangil; Kim, Daeho; Ji, Yong Gu

    2017-01-01

    Haptic technology is used in various fields to transmit information to the user with or without visual and auditory cues. This study aimed to provide preliminary data for use in developing a haptic interface for an antigravity (anti-G) suit. With the structural characteristics of the anti-G suit in mind, we determined five areas on the body (lower back, outer thighs, inner thighs, outer calves, and inner calves) on which to install ten bar-type eccentric rotating mass (ERM) motors as vibration actuators. To determine the design factors of the haptic anti-G suit, we conducted three experiments to find the absolute threshold, moderate intensity, and subjective assessments of vibrotactile stimuli. Twenty-six fighter pilots participated in the experiments, which were conducted in a fixed-based flight simulator. From the results of our study, we recommend 1) absolute thresholds of ∼11.98-15.84 Hz and 102.01-104.06 dB, 2) moderate intensities of 74.36 Hz and 126.98 dB for the lower back and 58.65 Hz and 122.37 dB for either side of the thighs and calves, and 3) subjective assessments of vibrotactile stimuli (displeasure, easy to perceive, and level of comfort). The results of this study will be useful for the design of a haptic anti-G suit. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Neurosurgery simulation using non-linear finite element modeling and haptic interaction

    NASA Astrophysics Data System (ADS)

    Lee, Huai-Ping; Audette, Michel; Joldes, Grand R.; Enquobahrie, Andinet

    2012-02-01

    Real-time surgical simulation is becoming an important component of surgical training. To meet the real-time requirement, however, the accuracy of the biomechanical modeling of soft tissue is often compromised due to computing resource constraints. Furthermore, haptic integration presents an additional challenge with its requirement for a high update rate. As a result, most real-time surgical simulation systems employ a linear elasticity model, simplified numerical methods such as the boundary element method or spring-particle systems, and coarse volumetric meshes. However, these systems are not clinically realistic. We present here ongoing work aimed at developing an efficient and physically realistic neurosurgery simulator using a non-linear finite element method (FEM) with haptic interaction. Real-time finite element analysis is achieved by utilizing the total Lagrangian explicit dynamic (TLED) formulation and GPU acceleration of per-node and per-element operations. We employ a virtual coupling method for separating deformable-body simulation and collision detection from haptic rendering, which needs to be updated at a much higher rate than the visual simulation. The system provides accurate biomechanical modeling of soft tissue while retaining real-time performance with haptic interaction. However, our experiments showed that the stability of the simulator depends heavily on the material properties of the tissue and the speed of colliding objects. Hence, additional efforts, including dynamic relaxation, are required to improve the stability of the system.
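    The virtual coupling mentioned above decouples the fast haptic loop from the slower simulation by rendering a spring-damper force toward a simulation-side proxy. A minimal sketch, with illustrative gains and names not taken from the simulator described:

```python
# Minimal virtual-coupling sketch: the haptic loop (e.g., 1 kHz) renders a
# spring-damper force toward a proxy whose position the slower FEM simulation
# (e.g., tens of Hz) updates. Gains, rates, and names are illustrative.

K = 500.0  # coupling stiffness, N/m
B = 5.0    # coupling damping, N*s/m

def coupling_force(device_pos, device_vel, proxy_pos):
    """Force rendered to the user: spring toward the proxy plus damping."""
    return K * (proxy_pos - device_pos) - B * device_vel

# One haptic-rate step using the proxy position left by the last FEM update
proxy = 0.010            # m
pos, vel = 0.012, 0.05   # m, m/s
print(round(coupling_force(pos, vel, proxy), 3))  # -1.25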

  18. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure, in a standardized manner, each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle. A mathematical combinatorial approach is proposed to join these results together into a global assessment. Using haptic-based simulation learning, tooth-preparation exercises assessing enamel and dentine were compared with the same exercises on plastic teeth in manikins. Equivalence of student performance for haptic versus traditional preparation methods was demonstrated, establishing the validity of the haptic solution for these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.

  19. Teaching bovine abdominal anatomy: use of a haptic simulator.

    PubMed

    Kinnison, Tierney; Forrest, Neil David; Frean, Stephen Philip; Baillie, Sarah

    2009-01-01

    Traditional methods of teaching anatomy to undergraduate medical and veterinary students are being challenged and need to adapt to modern concerns and requirements. There is a move away from the use of cadavers to new technologies as a way of complementing the traditional approaches and addressing resource and ethical problems. Haptic (touch) technology, which allows the student to feel a 3D computer-generated virtual environment, provides a novel way to address some of these challenges. To evaluate the practicalities and usefulness of a haptic simulator, first year veterinary students at the Royal Veterinary College, University of London, were taught basic bovine abdominal anatomy using a rectal palpation simulator: "The Haptic Cow." Over two days, 186 students were taught in small groups and 184 provided feedback via a questionnaire. The results were positive; the majority of students considered that the simulator had been useful for appreciating both the feel and location of key internal anatomical structures, had helped with their understanding of bovine abdominal anatomy and 3D visualization, and the tutorial had been enjoyable. The students were mostly in favor of the small group tutorial format, but some requested more time on the simulator. The findings indicate that the haptic simulator is an engaging way of teaching bovine abdominal anatomy to a large number of students in an efficient manner without using cadavers, thereby addressing some of the current challenges in anatomy teaching.

  20. Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart

    PubMed Central

    Kesner, Samuel B.; Howe, Robert D.

    2011-01-01

    Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body, due to tissue motion and the transmission limitations of the catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation procedures. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system. PMID:25285321

  1. Assisting Movement Training and Execution With Visual and Haptic Feedback.

    PubMed

    Ewerton, Marco; Rother, David; Weimar, Jakob; Kollegger, Gerrit; Wiemeyer, Josef; Peters, Jan; Maeda, Guilherme

    2018-01-01

    In the practice of motor skills in general, errors in the execution of movements may go unnoticed when a human instructor is not available. In this case, a computer system or robotic device able to detect movement errors and propose corrections would be of great help. This paper addresses the problem of how to detect such execution errors and how to provide feedback to the human to correct his/her motor skill using a general, principled methodology based on imitation learning. The core idea is to compare the observed skill with a probabilistic model learned from expert demonstrations. The intensity of the feedback is regulated by the likelihood of the model given the observed skill. Based on demonstrations, our system can, for example, detect errors in the writing of characters with multiple strokes. Moreover, by using a haptic device, the Haption Virtuose 6D, we demonstrate a method to generate haptic feedback based on a distribution over trajectories, which could be used as an auxiliary means of communication between an instructor and an apprentice. Additionally, given a performance measurement, the haptic device can help the human discover and perform better movements to solve a given task. In this case, the human first tries a few times to solve the task without assistance. Our framework, in turn, uses a reinforcement learning algorithm to compute haptic feedback, which guides the human toward better solutions.
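    The core idea of regulating feedback intensity by the likelihood of the observed skill under a model learned from demonstrations can be sketched roughly as follows; the Gaussian-per-time-step model, gains, and numbers are illustrative assumptions, not the authors' implementation:

```python
import math

def fit_step(demos_at_t):
    """Mean and standard deviation of expert positions at one time step."""
    n = len(demos_at_t)
    mu = sum(demos_at_t) / n
    var = sum((x - mu) ** 2 for x in demos_at_t) / n
    return mu, math.sqrt(var)

def feedback(observed, mu, sigma, gain=1.0):
    """Corrective signal toward the expert mean, scaled by how unlikely
    the observation is under the learned Gaussian (0 on-model, ->1 off-model)."""
    z = (observed - mu) / sigma
    intensity = 1.0 - math.exp(-0.5 * z * z)
    return gain * intensity * (mu - observed)

# Three hypothetical expert demonstrations observed at the same time step
mu, sigma = fit_step([0.9, 1.0, 1.1])
print(feedback(mu, mu, sigma))          # 0.0 -- on-model, no correction
print(feedback(1.3, mu, sigma) < 0.0)   # True -- pushed back toward the mean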

  2. Control of repulsive force in a virtual environment using an electrorheological haptic master for a surgical robot application

    NASA Astrophysics Data System (ADS)

    Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok

    2014-01-01

    This paper presents the control performance of a new type of four-degrees-of-freedom (4-DOF) haptic master that can be used for robot-assisted minimally invasive surgery (RMIS). By adopting a controllable electrorheological (ER) fluid, the proposed master functions both as a haptic feedback device and as a remote manipulator. In order to verify the efficacy of the proposed master and method, an experiment is conducted with deformable objects representing human organs. Since the use of real human organs for control experiments is difficult due to high cost and ethical concerns, an excellent alternative, a virtual reality environment, is used in this work. In order to embody a human organ in the virtual space, the experiment adopts a volumetric deformable object represented by a shape-retaining chain linked (S-chain) model, which has salient properties such as fast and realistic deformation of elastic objects. In the haptic architecture for RMIS, the desired torque/force and desired position originating from the object of the virtual slave and the operator of the haptic master are transferred to each other. In order to achieve the desired torque/force trajectories, a sliding mode controller (SMC), which is known to be robust to uncertainties, is designed and empirically implemented. Tracking control performance for various torque/force trajectories from the virtual slave is evaluated and presented in the time domain.
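    A generic first-order sliding mode control law of the kind referred to here can be sketched as follows (a textbook SMC with a tanh boundary layer to limit chattering; all gains are illustrative, not those of the ER haptic master):

```python
import math

# Textbook first-order sliding mode control sketch for force tracking.
# Sliding surface s = de + LAM*e; control u = K*tanh(s/PHI), where the tanh
# boundary layer replaces the discontinuous sign(s) to limit chattering.
# All gains are illustrative assumptions.

LAM = 10.0  # surface slope (1/s)
K = 2.0     # switching gain
PHI = 0.1   # boundary-layer width

def smc(error, d_error):
    s = d_error + LAM * error
    return K * math.tanh(s / PHI)

print(round(smc(0.5, 0.0), 3))  # 2.0 -- saturated, far from the surface
print(round(smc(0.0, 0.0), 3))  # 0.0 -- on the sliding surface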

  3. The role of visuohaptic experience in visually perceived depth.

    PubMed

    Ho, Yun-Xian; Serwe, Sascha; Trommershäuser, Julia; Maloney, Laurence T; Landy, Michael S

    2009-06-01

    Berkeley suggested that "touch educates vision," that is, haptic input may be used to calibrate visual cues to improve visual estimation of properties of the world. Here, we test whether haptic input may be used to "miseducate" vision, causing observers to rely more heavily on misleading visual cues. Human subjects compared the depth of two cylindrical bumps illuminated by light sources located at different positions relative to the surface. As in previous work using judgments of surface roughness, we find that observers judge bumps to have greater depth when the light source is located eccentric to the surface normal (i.e., when shadows are more salient). Following several sessions of visual judgments of depth, subjects then underwent visuohaptic training in which haptic feedback was artificially correlated with the "pseudocue" of shadow size and artificially decorrelated with disparity and texture. Although there were large individual differences, almost all observers demonstrated integration of haptic cues during visuohaptic training. For some observers, subsequent visual judgments of bump depth were unaffected by the training. However, for 5 of 12 observers, training significantly increased the weight given to pseudocues, causing subsequent visual estimates of shape to be less veridical. We conclude that haptic information can be used to reweight visual cues, putting more weight on misleading pseudocues, even when more trustworthy visual cues are available in the scene.

  4. Shared control of a medical robot with haptic guidance.

    PubMed

    Xiong, Linfei; Chng, Chin Boon; Chui, Chee Kong; Yu, Peiwu; Li, Yao

    2017-01-01

    Tele-operation in robotic surgery reduces radiation exposure during interventional radiological operations. However, endoscope vision without force feedback on the surgical tool increases the difficulty of precise manipulation and the risk of tissue damage. Shared control of vision and force provides a novel approach to enhanced control with haptic guidance, which could lead to subtle dexterity and better maneuverability during MIS surgery. The paper presents an innovative shared control method for a robotic minimally invasive surgery system, in which vision and haptic feedback are incorporated to provide guidance cues to the clinician during surgery. The incremental potential field (IPF) method is utilized to generate a guidance path based on the anatomy of the tissue and the surgical tool interaction. Haptic guidance is provided at the master end to assist the clinician during tele-operated surgical robotic tasks. The approach has been validated with path-following and virtual tumor-targeting experiments. The experimental results demonstrate that, compared with vision-only guidance, shared control with vision and haptics improved the accuracy and efficiency of surgical robotic manipulation, reducing the tool-position error and execution time. The validation experiments demonstrate that the shared control approach could help the surgical robot system provide stable assistance and precise performance in executing the designated surgical task. The methodology could also be implemented with other surgical robots with different surgical tools and applications.
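    The guidance-force idea can be sketched with a classic potential field: an attractive term pulls the tool toward the planned path and a short-range repulsive term pushes it away from forbidden regions. This is a generic Khatib-style sketch with illustrative gains, not the paper's IPF formulation:

```python
# Generic potential-field guidance sketch (1D, normal to the planned path):
# an attractive spring pulls the tool toward the path, while a short-range
# repulsive term pushes it away from a forbidden region. Gains illustrative.

K_ATT = 50.0  # attractive gain
K_REP = 0.5   # repulsive gain
D0 = 0.02     # m, influence distance of the repulsive field

def guidance_force(dist_to_path, dist_to_obstacle):
    f = -K_ATT * dist_to_path  # spring back toward the path
    if dist_to_obstacle < D0:  # repulsion only inside the influence zone
        f += K_REP * (1.0 / dist_to_obstacle - 1.0 / D0) / dist_to_obstacle**2
    return f

print(round(guidance_force(0.005, 0.05), 3))  # -0.25: pulled back to the path
print(guidance_force(0.0, 0.01) > 0.0)        # True: pushed away from obstacle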

  5. When Neuroscience 'Touches' Architecture: From Hapticity to a Supramodal Functioning of the Human Brain.

    PubMed

    Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C; Pietrini, Pietro; Ricciardi, Emiliano

    2016-01-01

    In the last decades, the rapid growth of functional brain imaging methodologies has allowed cognitive neuroscience to address open questions in philosophy and the social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting 'neuro-architecture' as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. Precisely, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people-environment relationships, and even provide empirical foundations for a renewed evidence-based design theory.

  6. Interactive Learning for Graphic Design Foundations

    ERIC Educational Resources Information Center

    Chu, Sauman; Ramirez, German Mauricio Mejia

    2012-01-01

    One of the biggest problems for students majoring in pre-graphic design is students' inability to apply their knowledge to different design solutions. The purpose of this study is to examine the effectiveness of interactive learning modules in facilitating knowledge acquisition during the learning process and to create interactive learning modules…

  7. Exploring Identity and Multiliteracies through Graphic Narratives

    ERIC Educational Resources Information Center

    King, Alyson E.

    2015-01-01

    In a first-year, university-level communication course that examined issues of race, ethnicity, postcolonialism, diaspora, and coming-of-age using different points of view and modes of communication, students created graphic novel-style auto-ethnographies to reflect on their experiences with diaspora and identity creation. The assignment was an…

  8. Inner-City Children's Graphics Call for Social Justice

    ERIC Educational Resources Information Center

    Lu, Lucia Y.

    2010-01-01

    In a graduate literacy course, the author, as the teacher educator, incorporated "Visual Literacy" into the course. In-service and pre-service teachers from inner-city schools in Georgia and Virginia invited struggling writers to create graphic novels by envisioning life activities, using drawings, and invented spellings as well as…

  9. Using Lotus 1-2-3 for "Non-Stop" Graphic Simulation.

    ERIC Educational Resources Information Center

    Godin, Victor B.; Rao, Ashok

    1988-01-01

    Discusses the use of Lotus 1-2-3 to create non-stop graphic displays of simulation models. Describes a simple application of this technique using the distribution resulting from repeated throws of dice. Lists other software used with this technique. Stresses the advantages of this approach in education. (CW)
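    The spreadsheet technique described above carries over directly to modern languages. The following is a minimal sketch of the same idea, accumulating the distribution of repeated two-dice throws; the throw count and seed are illustrative, not from the article.

```python
# Minimal sketch: accumulate the distribution of totals from repeated
# throws of two dice, the same experiment run "non-stop" in Lotus 1-2-3.
import random

def dice_distribution(n_throws, seed=0):
    rng = random.Random(seed)
    counts = {total: 0 for total in range(2, 13)}  # possible two-dice totals
    for _ in range(n_throws):
        counts[rng.randint(1, 6) + rng.randint(1, 6)] += 1
    return counts
```

With enough throws the histogram approaches the familiar triangular distribution peaking at 7.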

  10. Interactive Mathematica Simulations in Chemical Engineering Courses

    ERIC Educational Resources Information Center

    Falconer, John L.; Nicodemus, Garret D.

    2014-01-01

    Interactive Mathematica simulations with graphical displays of system behavior are an excellent addition to chemical engineering courses. The Manipulate command in Mathematica creates on-screen controls that allow users to change system variables and see the graphical output almost instantaneously. They can be used both in and outside class. More…

  11. Paneling "Matters" in Elementary Students' Graphic Narratives

    ERIC Educational Resources Information Center

    Pantaleo, Sylvia

    2013-01-01

    During a 10-week classroom-based study, 20 fourth grade students participated in a number of interdependent activities that focused on developing their visual meaning-making skills and competencies. As well as reading, responding in writing to and discussing a selection of picturebooks, graphic novels, and magazines, the students created graphic…

  12. A User Study on Tactile Graphic Generation Methods

    ERIC Educational Resources Information Center

    Krufka, S. E.; Barner, K. E.

    2006-01-01

    Methods to automatically convert graphics into tactile representations have been recently investigated, creating either raised-line or relief images. In particular, we briefly review one raised-line method where important features are emphasized. This paper focuses primarily on the effects of such emphasis and on comparing both raised-line and…

  13. A Virtual Reality Simulator Prototype for Learning and Assessing Phaco-sculpting Skills

    NASA Astrophysics Data System (ADS)

    Choi, Kup-Sze

    This paper presents a virtual reality based simulator prototype for learning phacoemulsification in cataract surgery, with a focus on the skills required for making a cross-shaped trench in a cataractous lens with an ultrasound probe during the phaco-sculpting procedure. An immersive virtual environment is created with 3D models of the lens and surgical tools, and a haptic device is used as the 3D user interface. Phaco-sculpting is simulated by interactively deleting the constituent tetrahedra of the lens model. Collisions between the virtual probe and the lens are efficiently identified by partitioning the space containing the lens hierarchically with an octree. The simulator can be programmed to collect real-time quantitative user data for reviewing and assessing trainees' performance in an objective manner. A game-based learning environment can be created on top of the simulator by incorporating gaming elements based on the quantifiable performance metrics.
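    The hierarchical space partitioning mentioned above can be sketched in outline. The class below is a minimal illustration only (point-cloud leaves, a sphere-tipped probe, illustrative depth and leaf-size limits) and makes no claim about the prototype's actual data structures.

```python
# Minimal sketch of octree-based collision queries: the lens mesh is reduced
# to its vertex cloud, and subtrees whose bounding cubes cannot touch the
# probe sphere are pruned. All parameters are illustrative assumptions.

class Octree:
    def __init__(self, points, center, half, depth=0, max_depth=5, leaf_size=8):
        self.center, self.half = center, half
        self.points, self.children = [], []
        if depth == max_depth or len(points) <= leaf_size:
            self.points = points          # leaf node stores its points
            return
        buckets = [[] for _ in range(8)]  # one bucket per octant
        for p in points:
            i = (p[0] > center[0]) | ((p[1] > center[1]) << 1) | ((p[2] > center[2]) << 2)
            buckets[i].append(p)
        q = half / 2
        for i, bucket in enumerate(buckets):
            c = [center[k] + (q if (i >> k) & 1 else -q) for k in range(3)]
            self.children.append(Octree(bucket, c, q, depth + 1, max_depth, leaf_size))

    def query(self, probe, radius):
        """Return points within `radius` of the probe tip."""
        # Prune subtrees whose cube cannot intersect the probe sphere.
        if any(abs(probe[k] - self.center[k]) > self.half + radius for k in range(3)):
            return []
        if not self.children:
            return [p for p in self.points
                    if sum((p[k] - probe[k]) ** 2 for k in range(3)) <= radius ** 2]
        hits = []
        for ch in self.children:
            hits += ch.query(probe, radius)
        return hits
```

Pruning whole octants at once is what keeps per-frame collision checks cheap enough for haptic update rates.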

  14. Vector generator scan converter

    DOEpatents

    Moore, James M.; Leighton, James F.

    1990-01-01

    High printing speeds for graphics data are achieved with a laser printer by transmitting compressed graphics data from a main processor over an I/O (input/output) channel to a vector generator scan converter which reconstructs a full graphics image for input to the laser printer through a raster data input port. The vector generator scan converter includes a microprocessor with associated microcode memory containing a microcode instruction set, a working memory for storing compressed data, vector generator hardware for drawing a full graphic image from vector parameters calculated by the microprocessor, image buffer memory for storing the reconstructed graphics image and an output scanner for reading the graphics image data and inputting the data to the printer. The vector generator scan converter eliminates the bottleneck created by the I/O channel for transmitting graphics data from the main processor to the laser printer, and increases printer speed up to thirty fold.
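    The vector-generator step, expanding vector parameters into a raster image buffer, can be illustrated with Bresenham's line algorithm. The buffer layout and the (x0, y0, x1, y1) vector format below are assumptions for illustration, not details of the patent.

```python
# Minimal sketch: rasterize compressed vector parameters (endpoint pairs)
# into a full image buffer, as a vector generator scan converter would.

def draw_vector(buffer, x0, y0, x1, y1):
    """Rasterize one line segment into a row-major 2D pixel buffer (Bresenham)."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        buffer[y0][x0] = 1
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

def scan_convert(vectors, width, height):
    """Reconstruct a raster image from a list of (x0, y0, x1, y1) vectors."""
    image = [[0] * width for _ in range(height)]
    for v in vectors:
        draw_vector(image, *v)
    return image
```

Transmitting only the endpoint pairs and reconstructing the raster at the printer side is what removes the I/O-channel bottleneck.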

  15. Vector generator scan converter

    DOEpatents

    Moore, J.M.; Leighton, J.F.

    1988-02-05

    High printing speeds for graphics data are achieved with a laser printer by transmitting compressed graphics data from a main processor over an I/O channel to a vector generator scan converter which reconstructs a full graphics image for input to the laser printer through a raster data input port. The vector generator scan converter includes a microprocessor with associated microcode memory containing a microcode instruction set, a working memory for storing compressed data, vector generator hardware for drawing a full graphic image from vector parameters calculated by the microprocessor, image buffer memory for storing the reconstructed graphics image and an output scanner for reading the graphics image data and inputting the data to the printer. The vector generator scan converter eliminates the bottleneck created by the I/O channel for transmitting graphics data from the main processor to the laser printer, and increases printer speed up to thirty fold. 7 figs.

  16. 77 FR 49458 - Certain Mobile Electronic Devices Incorporating Haptics; Amendment of the Complaint and Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-16

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-834] Certain Mobile Electronic Devices.... 1337 in the importation, sale for importation, and sale within the United States after importation of certain mobile electronic devices incorporating haptics, by reason of the infringement of claims of six...

  17. Pattern Perception and Pictures for the Blind

    ERIC Educational Resources Information Center

    Heller, Morton A.; McCarthy, Melissa; Clark, Ashley

    2005-01-01

    This article reviews recent research on perception of tangible pictures in sighted and blind people. Haptic picture naming accuracy is dependent upon familiarity and access to semantic memory, just as in visual recognition. Performance is high when haptic picture recognition tasks do not depend upon semantic memory. Viewpoint matters for the ease…

  18. The Use of Haptic Display Technology in Education

    ERIC Educational Resources Information Center

    Barfield, Woodrow

    2009-01-01

    The experience of "virtual reality" can consist of head-tracked and stereoscopic virtual worlds, spatialized sound, haptic feedback, and to a lesser extent olfactory cues. Although virtual reality systems have been proposed for numerous applications, the field of education is one particular application that seems well-suited for virtual…

  19. Mediating Haptic Exploratory Strategies in Children Who Have Visual Impairment and Intellectual Disabilities

    ERIC Educational Resources Information Center

    McLinden, M.

    2012-01-01

    This article provides a synthesis of literature pertaining to the development of haptic exploratory strategies in children who have visual impairment and intellectual disabilities. The information received through such strategies assumes particular significance for these children, given the restricted information available through their visual…

  20. Do Haptic Representations Help Complex Molecular Learning?

    ERIC Educational Resources Information Center

    Bivall, Petter; Ainsworth, Shaaron; Tibell, Lena A. E.

    2011-01-01

    This study explored whether adding a haptic interface (that provides users with somatosensory information about virtual objects by force and tactile feedback) to a three-dimensional (3D) chemical model enhanced students' understanding of complex molecular interactions. Two modes of the model were compared in a between-groups pre- and posttest…

  1. 78 FR 74142 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-10

    ... dedicated to the IDSM. Still other warrantors have created posters to alert consumers to the existence of.... The graphic design work entails creating pamphlets, brochures, posters, or other materials aimed at...

  2. Flow visualization of CFD using graphics workstations

    NASA Technical Reports Server (NTRS)

    Lasinski, Thomas; Buning, Pieter; Choi, Diana; Rogers, Stuart; Bancroft, Gordon

    1987-01-01

    High performance graphics workstations are used to visualize the fluid flow dynamics obtained from supercomputer solutions of computational fluid dynamic programs. The visualizations can be done independently on the workstation or while the workstation is connected to the supercomputer in a distributed computing mode. In the distributed mode, the supercomputer interactively performs the computationally intensive graphics rendering tasks while the workstation performs the viewing tasks. A major advantage of the workstations is that the viewers can interactively change their viewing position while watching the dynamics of the flow fields. An overview of the computer hardware and software required to create these displays is presented. For complex scenes the workstation cannot create the displays fast enough for good motion analysis. For these cases, the animation sequences are recorded on video tape or 16 mm film a frame at a time and played back at the desired speed. The additional software and hardware required to create these video tapes or 16 mm movies are also described. Photographs illustrating current visualization techniques are discussed. Examples of the use of the workstations for flow visualization through animation are available on video tape.

  3. Audio Haptic Videogaming for Developing Wayfinding Skills in Learners Who are Blind

    PubMed Central

    Sánchez, Jaime; de Borba Campos, Marcia; Espinoza, Matías; Merabet, Lotfi B.

    2014-01-01

    Interactive digital technologies are currently being developed as a novel tool for education and skill development. Audiopolis is an audio and haptic based videogame designed for developing orientation and mobility (O&M) skills in people who are blind. We have evaluated the cognitive impact of videogame play on O&M skills by assessing performance on a series of behavioral tasks carried out in both indoor and outdoor virtual spaces. Our results demonstrate that the use of Audiopolis had a positive impact on the development and use of O&M skills in school-aged learners who are blind. The impact of audio and haptic information on learning is also discussed. PMID:25485312

  4. Social Touch Technology: A Survey of Haptic Technology for Social Touch.

    PubMed

    Huisman, Gijs

    2017-01-01

    This survey provides an overview of work on haptic technology for social touch. Social touch has been studied extensively in psychology and neuroscience. With the development of new technologies, it is now possible to engage in social touch at a distance or engage in social touch with artificial social agents. Social touch research has inspired research into technology mediated social touch, and this line of research has found effects similar to actual social touch. The importance of haptic stimulus qualities, multimodal cues, and contextual factors in technology mediated social touch is discussed. This survey is concluded by reflecting on the current state of research into social touch technology, and providing suggestions for future research and applications.

  5. Enhancing the Performance of Passive Teleoperation Systems via Cutaneous Feedback.

    PubMed

    Pacchierotti, Claudio; Tirmizi, Asad; Bianchini, Gianni; Prattichizzo, Domenico

    2015-01-01

    We introduce a novel method to improve the performance of passive teleoperation systems with force reflection. It consists of integrating kinesthetic haptic feedback provided by common grounded haptic interfaces with cutaneous haptic feedback. The proposed approach can be used on top of any time-domain control technique that ensures a stable interaction by scaling down kinesthetic feedback when this is required to satisfy stability conditions (e.g., passivity) at the expense of transparency. Performance is recovered by providing a suitable amount of cutaneous force through custom wearable cutaneous devices. The viability of the proposed approach is demonstrated through an experiment of perceived stiffness and an experiment of teleoperated needle insertion in soft tissue.
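    The core idea, rendering only the stability-permitted fraction of force kinesthetically and making up the remainder cutaneously, reduces to a simple split. The function below is a minimal sketch, with the scalar `alpha` standing in for whatever scaling the time-domain passivity layer computes.

```python
# Minimal sketch of kinesthetic/cutaneous force splitting. `alpha` is an
# assumed interface to the stability controller, not the paper's notation.

def split_feedback(f_desired, alpha):
    """Split a desired feedback force between kinesthetic and cutaneous channels."""
    alpha = max(0.0, min(1.0, alpha))           # fraction allowed by passivity
    f_kinesthetic = alpha * f_desired           # grounded haptic interface
    f_cutaneous = (1.0 - alpha) * f_desired     # wearable cutaneous device
    return f_kinesthetic, f_cutaneous
```

Because cutaneous stimulation does not feed energy back into the teleoperation loop, the remainder can be rendered without threatening stability.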

  6. Keep an eye on your hands: on the role of visual mechanisms in processing of haptic space

    PubMed Central

    Zuidhoek, Sander; Noordzij, Matthijs L.; Kappers, Astrid M. L.

    2008-01-01

    The present paper reviews research on haptic orientation processing. Central is a task in which a test bar has to be set parallel to a reference bar at another location. Introducing a delay between inspecting the reference bar and setting the test bar leads to a surprising improvement. Moreover, offering visual background information also elevates performance. Interestingly, (congenitally) blind individuals do not show the improvement with time, or show it to a weaker extent, while in parallel to this, they appear to benefit less from spatial imagery processing. Together this strongly points to an important role for visual processing mechanisms in the perception of haptic inputs. PMID:18196305

  7. Learning from vision-to-touch is different than learning from touch-to-vision

    PubMed Central

    Wismeijer, Dagmar A.; Gegenfurtner, Karl R.; Drewing, Knut

    2012-01-01

    We studied whether vision can teach touch to the same extent as touch seems to teach vision. In a 2 × 2 between-participants learning study, we artificially correlated visual gloss cues with haptic compliance cues. In two “natural” tasks, we tested whether visual gloss estimations have an influence on haptic estimations of softness and vice versa. In two “novel” tasks, in which participants were either asked to haptically judge glossiness or to visually judge softness, we investigated how perceptual estimates transfer from one sense to the other. Our results showed that vision does not teach touch as efficiently as touch seems to teach vision. PMID:23181012

  8. Factors Influencing Undergraduate Students' Acceptance of a Haptic Interface for Learning Gross Anatomy

    ERIC Educational Resources Information Center

    Yeom, Soonja; Choi-Lundberg, Derek L.; Fluck, Andrew Edward; Sale, Arthur

    2017-01-01

    Purpose: This study aims to evaluate factors influencing undergraduate students' acceptance of a computer-aided learning resource using the Phantom Omni haptic stylus to enable rotation, touch and kinaesthetic feedback and display of names of three-dimensional (3D) human anatomical structures on a visual display. Design/methodology/approach: The…

  9. Comparing Tactile Maps and Haptic Digital Representations of a Maritime Environment

    ERIC Educational Resources Information Center

    Simonnet, Mathieu; Vieilledent, Steephane; Jacobson, R. Daniel; Tisseau, Jacques

    2011-01-01

    A map exploration and representation exercise was conducted with participants who were totally blind. Representations of maritime environments were presented either with a tactile map or with a digital haptic virtual map. We assessed the knowledge of spatial configurations using a triangulation technique. The results revealed that both types of…

  10. 77 FR 20847 - Certain Mobile Electronic Devices Incorporating Haptics; Institution of Investigation Pursuant to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-834] Certain Mobile Electronic Devices Incorporating Haptics; Institution of Investigation Pursuant to 19 U.S.C. 1337 AGENCY: U.S. International Trade.... International Trade Commission on February 7, 2012, and an amended complaint was filed with the U.S...

  11. Feel, Imagine and Learn!--Haptic Augmented Simulation and Embodied Instruction in Physics Learning

    ERIC Educational Resources Information Center

    Han, In Sook

    2010-01-01

    The purpose of this study was to investigate the potentials and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included haptic augmented educational simulation as an instructional tool to provide perceptual experiences as well as further instruction to activate those previous…

  12. Haptic Tracking Permits Bimanual Independence

    ERIC Educational Resources Information Center

    Rosenbaum, David A.; Dawson, Amanda A.; Challis, John H.

    2006-01-01

    This study shows that in a novel task--bimanual haptic tracking--neurologically normal human adults can move their 2 hands independently for extended periods of time with little or no training. Participants lightly touched buttons whose positions were moved either quasi-randomly in the horizontal plane by 1 or 2 human drivers (Experiment 1), in…

  13. Unpacking Students' Conceptualizations through Haptic Feedback

    ERIC Educational Resources Information Center

    Magana, A. J.; Balachandran, S.

    2017-01-01

    While it is clear that the use of computer simulations has a beneficial effect on learning when compared to instruction without computer simulations, there is still room for improvement to fully realize their benefits for learning. Haptic technologies can fulfill the educational potential of computer simulations by adding the sense of touch.…

  14. User Acceptance of a Haptic Interface for Learning Anatomy

    ERIC Educational Resources Information Center

    Yeom, Soonja; Choi-Lundberg, Derek; Fluck, Andrew; Sale, Arthur

    2013-01-01

    Visualizing the structure and relationships in three dimensions (3D) of organs is a challenge for students of anatomy. To provide an alternative way of learning anatomy engaging multiple senses, we are developing a force-feedback (haptic) interface for manipulation of 3D virtual organs, using design research methodology, with iterations of system…

  15. Supramodality Effects in Visual and Haptic Spatial Processes

    ERIC Educational Resources Information Center

    Cattaneo, Zaira; Vecchi, Tomaso

    2008-01-01

    In this article, the authors investigated unimodal and cross-modal processes in spatial working memory. A number of locations had to be memorized within visual or haptic matrices according to different experimental conditions known to be critical in accounting for the effects of perception on imagery. Results reveal that some characteristics of…

  16. Teaching Bovine Abdominal Anatomy: Use of a Haptic Simulator

    ERIC Educational Resources Information Center

    Kinnison, Tierney; Forrest, Neil David; Frean, Stephen Philip; Baillie, Sarah

    2009-01-01

    Traditional methods of teaching anatomy to undergraduate medical and veterinary students are being challenged and need to adapt to modern concerns and requirements. There is a move away from the use of cadavers to new technologies as a way of complementing the traditional approaches and addressing resource and ethical problems. Haptic (touch)…

  17. Influence of Stimulus Symmetry and Complexity upon Haptic Scanning Strategies During Detection, Learning and Recognition Tasks.

    ERIC Educational Resources Information Center

    Locher, Paul J.; Simmons, Roger W.

    Two experiments were conducted to investigate the perceptual processes involved in haptic exploration of randomly generated shapes. Experiment one required subjects to detect symmetrical or asymmetrical characteristics of individually presented plastic shapes, also varying in complexity. Scanning time for both symmetrical and asymmetrical shapes…

  18. What Aspects of Vision Facilitate Haptic Processing?

    ERIC Educational Resources Information Center

    Millar, Susanna; Al-Attar, Zainab

    2005-01-01

    We investigate how vision affects haptic performance when task-relevant visual cues are reduced or excluded. The task was to remember the spatial location of six landmarks that were explored by touch in a tactile map. Here, we use specially designed spectacles that simulate residual peripheral vision, tunnel vision, diffuse light perception, and…

  19. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    NASA Astrophysics Data System (ADS)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has been increasingly utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user’s interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the range of applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, the sensory receptors within the skin that sense tactile stimuli and electric currents are described, along with several factors that influence how electric signals are transmitted to the brain through human skin.

  20. EMG-based visual-haptic biofeedback: a tool to improve motor control in children with primary dystonia.

    PubMed

    Casellato, Claudia; Pedrocchi, Alessandra; Zorzi, Giovanna; Vernisse, Lea; Ferrigno, Giancarlo; Nardocci, Nardo

    2013-05-01

    New insights suggest that dystonic motor impairments could also involve a deficit of sensory processing. In this framework, biofeedback, making covert physiological processes more overt, could be useful. The present work proposes an innovative integrated setup which provides the user with an electromyogram (EMG)-based visual-haptic biofeedback during upper limb movements (spiral tracking tasks), to test whether augmented sensory feedback can induce motor control improvement in patients with primary dystonia. The ad hoc developed real-time control algorithm synchronizes the haptic loop with the EMG reading; the brachioradialis EMG values were used to modify visual and haptic features of the interface: the higher the EMG level, the higher the virtual table friction, and the background color shifted proportionally from green to red. From recordings on dystonic and healthy subjects, statistical results showed that biofeedback has a significant impact, correlated with the local impairment, on dystonic muscular control. These tests pointed out the effectiveness of biofeedback paradigms in gaining better specific-muscle voluntary motor control. The flexible tool developed here shows promising prospects for clinical applications and sensorimotor rehabilitation.
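    The EMG-to-feedback mapping described above can be sketched as a simple proportional law. The normalization range and friction limits below are illustrative assumptions, not the study's parameters.

```python
# Minimal sketch of EMG-driven visual-haptic biofeedback: higher EMG raises
# virtual table friction and shifts the background color from green to red.

def emg_to_feedback(emg, emg_max=1.0, mu_min=0.1, mu_max=1.0):
    """Map a raw EMG level to (friction coefficient, RGB background color)."""
    level = max(0.0, min(1.0, emg / emg_max))       # normalized activation
    friction = mu_min + level * (mu_max - mu_min)   # haptic channel
    color = (int(255 * level), int(255 * (1 - level)), 0)  # green -> red
    return friction, color
```

Both channels vary continuously with muscle activation, so the user receives the same information through touch and vision simultaneously.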

  1. Topographic modelling of haptic properties of tissue products

    NASA Astrophysics Data System (ADS)

    Rosen, B.-G.; Fall, A.; Rosen, S.; Farbrot, A.; Bergström, P.

    2014-03-01

    The way a product or material feels when touched, its haptics, has been shown to be a property that plays an important role when consumers determine the quality of products. For tissue products in constant touch with the skin, "softness" becomes a primary quality parameter. In the present work, the relationship between topography and the feeling of the surface has been investigated for commercial tissues with varying degrees of texture, from low-textured crepe tissue to highly textured embossed and air-dried tissue products. A trained sensory panel was used to grade perceived haptic "roughness". The technique used to characterize the topography was digital light projection (DLP). By the use of multivariate statistics, strong correlations between perceived roughness and topography were found, with a predictability of above 90 percent even though highly textured products were included. Characterization was made using areal ISO 25178-2 topography parameters in combination with non-contacting topography measurement. The best prediction ability was obtained when combining haptic properties with the topography parameters auto-correlation length (Sal), peak material volume (Vmp), core roughness depth (Sk) and the maximum height of the surface (Sz).
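    The multivariate step can be illustrated with an ordinary-least-squares fit from the four named topography parameters to panel roughness scores. The data below are synthetic placeholders, and the model form (a plain linear regression) is an assumption; the paper does not specify its multivariate method in this abstract.

```python
# Minimal sketch: predict perceived roughness from four topography
# parameters (Sal, Vmp, Sk, Sz) with an ordinary least-squares fit.
import numpy as np

def fit_roughness_model(X, y):
    """X: (n_samples, 4) topography parameters; y: panel roughness scores."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])   # append intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef                                    # 4 weights + intercept

def predict_roughness(coef, x):
    """Predicted roughness score for one parameter vector x."""
    return float(np.dot(coef[:-1], x) + coef[-1])
```

A real study would cross-validate such a fit against held-out panel scores before quoting a predictability figure.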

  2. Using postural synergies to animate a low-dimensional hand avatar in haptic simulation.

    PubMed

    Mulatto, Sara; Formaglio, Alessandro; Malvezzi, Monica; Prattichizzo, Domenico

    2013-01-01

    A technique to animate a realistic hand avatar with 20 DoFs based on the biomechanics of the human hand is presented. The animation does not use any sensor glove or advanced tracker with markers. The proposed approach is based on the knowledge of a set of kinematic constraints on the model of the hand, referred to as postural synergies, which makes it possible to represent the hand posture with fewer variables than the number of joints of the hand model. This low-dimensional set of parameters is estimated from direct measurement of the motion of the thumb and index finger, tracked using two haptic devices. A kinematic inversion algorithm has been developed which takes synergies into account and estimates the kinematic configuration of the whole hand, i.e., also of the fingers whose fingertips are not directly tracked by the two haptic devices. The hand skin is deformable, and its deformation is computed using a linear vertex blending technique. The proposed synergy-based animation of the hand avatar involves only algebraic computations and is suitable for real-time implementation, as required in haptics.
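    The synergy representation amounts to a linear map from a few synergy variables z to all joint angles, q = q_mean + S z; the kinematic inversion then recovers z from the few joints that are directly tracked. The sketch below assumes such a linear synergy model, with a random matrix standing in for a synergy basis learned from grasp data.

```python
# Minimal sketch of synergy-based posture reconstruction: estimate the
# low-dimensional synergy vector z from a few tracked joints, then expand
# it back to the full 20-DoF joint vector. Linear model is an assumption.
import numpy as np

def reconstruct_posture(S, q_mean, q_measured, measured_idx):
    """Estimate all joints from the few joints that are directly tracked.

    S: (20, k) synergy basis; q_mean: (20,) mean posture;
    measured_idx: indices of the tracked joints.
    """
    S_m = S[measured_idx, :]                            # rows for tracked joints
    residual = q_measured - q_mean[measured_idx]
    z, *_ = np.linalg.lstsq(S_m, residual, rcond=None)  # least-squares inversion
    return q_mean + S @ z                               # full 20-DoF posture
```

Because only a small linear system is solved per frame, the reconstruction is cheap enough for a haptic update loop.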

  3. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-06-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3-25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension.

  4. Haptic guidance of overt visual attention.

    PubMed

    List, Alexandra; Iordanescu, Lucica; Grabowecky, Marcia; Suzuki, Satoru

    2014-11-01

    Research has shown that information accessed from one sensory modality can influence perceptual and attentional processes in another modality. Here, we demonstrated a novel crossmodal influence of haptic-shape information on visual attention. Participants visually searched for a target object (e.g., an orange) presented among distractor objects, fixating the target as quickly as possible. While searching for the target, participants held (never viewed and out of sight) an item of a specific shape in their hands. In two experiments, we demonstrated that the time for the eyes to reach a target (a measure of overt visual attention) was reduced when the shape of the held item (e.g., a sphere) was consistent with the shape of the visual target (e.g., an orange), relative to when the held shape was unrelated to the target (e.g., a hockey puck) or when no shape was held. This haptic-to-visual facilitation occurred despite the fact that the held shapes were not predictive of the visual targets' shapes, suggesting that the crossmodal influence occurred automatically, reflecting shape-specific haptic guidance of overt visual attention.

  5. Human-arm-and-hand-dynamic model with variability analyses for a stylus-based haptic interface.

    PubMed

    Fu, Michael J; Cavuşoğlu, M Cenk

    2012-12-01

    Haptic interface research benefits from accurate human arm models for control and system design. The literature contains many human arm dynamic models but lacks detailed variability analyses. Without accurate measurements, variability is modeled in a very conservative manner, leading to less than optimal controller and system designs. This paper not only presents models for human arm dynamics but also develops inter- and intrasubject variability models for a stylus-based haptic device. Data from 15 human subjects (nine male, six female, ages 20-32) were collected using a Phantom Premium 1.5a haptic device for system identification. In this paper, grip-force-dependent models were identified for 1-3-N grip forces in the three spatial axes. Also, variability due to human subjects and grip-force variation was modeled as both structured and unstructured uncertainties. For both forms of variability, the maximum variation, as well as the 95% and 67% confidence interval limits, were examined. All models were in the frequency domain with force as input and position as output. The identified models enable precise controllers targeted to a subset of possible human operator dynamics.
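    As an illustration of such a frequency-domain force-to-position model, a second-order mass-spring-damper response is sketched below. The parameter values are illustrative placeholders, not the identified ones from the paper.

```python
# Minimal sketch of a frequency-domain arm model: magnitude of the
# force-to-position transfer function X(jw)/F(jw) = 1 / (k - m*w^2 + j*b*w)
# for a mass-spring-damper arm. Parameter values are illustrative.
import math

def arm_response(omega, m=1.0, b=10.0, k=300.0):
    """Magnitude of the force-to-position response at angular frequency omega."""
    denom = complex(k - m * omega ** 2, b * omega)
    return abs(1.0 / denom)
```

Identification then amounts to fitting m, b, and k (per grip force and axis) so this curve matches the measured response, with variability bounds drawn around the fitted parameters.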

  6. Surgical virtual reality - highlights in developing a high performance surgical haptic device.

    PubMed

    Custură-Crăciun, D; Cochior, D; Constantinoiu, S; Neagu, C

    2013-01-01

    Just as simulators are a standard in aviation and the aerospace sciences, we expect surgical simulators to soon become a standard in medical applications. These will correctly instruct future doctors in surgical techniques without the need for hands-on patient instruction. Using virtual reality to digitally transpose surgical procedures changes surgery in a revolutionary manner, by offering possibilities for implementing new, much more efficient learning methods, by allowing the practice of new surgical techniques, and by improving surgeons' abilities and skills. Perfecting haptic devices has opened the door to a series of opportunities in the fields of research, industry, nuclear science and medicine. Concepts purely theoretical at first, such as telerobotics, telepresence or telerepresentation, have become a practical reality as computing techniques, telecommunications and haptic devices evolved, with virtual reality taking a new leap. In the field of surgery, barriers and controversies still remain regarding the implementation and generalization of surgical virtual simulators. These obstacles remain connected to the high costs of this not yet fully developed technology, especially in the domain of haptic devices.

  7. Obstacle Crossing Differences Between Blind and Blindfolded Subjects After Haptic Exploration.

    PubMed

    Forner-Cordero, Arturo; Garcia, Valéria D; Rodrigues, Sérgio T; Duysens, Jacques

    2016-01-01

    Little is known about the ability of blind people to cross obstacles after haptically exploring their size and position. Long-term absence of vision may affect spatial cognition in the blind, while their extensive experience with the use of haptic information for guidance may lead to compensation strategies. Seven blind and 7 sighted participants (with vision available and blindfolded) walked along a flat pathway and crossed an obstacle after a haptic exploration. Blind and blindfolded subjects used different strategies to cross the obstacle. After the first 20 trials, the blindfolded subjects reduced the distance between the foot and the obstacle at the toe-off instant, while the blind behaved like the subjects with full vision. Blind and blindfolded participants showed larger foot clearance than participants with vision. At foot landing, the hip was further behind the foot in the blindfolded condition, while there were no differences between the blind and the vision conditions. For several parameters of the obstacle crossing task, blind people were more similar to subjects with full vision, indicating that the blind subjects were able to compensate for the lack of vision.

  8. The Role of Graphic and Sanitized Violence in the Enjoyment of Television Dramas

    ERIC Educational Resources Information Center

    Weaver, Andrew J.; Wilson, Barbara J.

    2009-01-01

    This experiment explores the relationship between television violence and viewer enjoyment. Over 400 participants were randomly assigned to one of 15 conditions that were created by editing five TV programs into three versions each: a graphically violent version, a sanitized violent version, and a nonviolent version. After viewing, participants…

  9. Diary of a Conversion--Lotus 1-2-3 to Symphony 1.1.

    ERIC Educational Resources Information Center

    Dunnewin, Larry

    1986-01-01

    Describes the uses of Lotus 1-2-3 (a spreadsheet-graphics-database program created by Lotus Development Corporation) and Symphony 1.1 (a refinement and expansion of Symphony 1.01 providing memory efficiency, speed, ease of use, greater file compatibility). Spreadsheet and graphics capabilities, the use of windows, database environment, and…

  10. An interactive interface for NCAR Graphics

    NASA Technical Reports Server (NTRS)

    Buzbee, Bill; Lackman, Bob; Alpert, Ethan

    1994-01-01

    The NCAR Graphics package has been a valuable research tool for over 20 years. As a low level Fortran library, however, it was difficult to use for nonprogramming researchers. With this grant and NSF support, an interactive interface has been created which greatly facilitates use of the package by researchers of diverse computer skill levels.

  11. Comparing Graphical and Verbal Representations of Measurement Error in Test Score Reports

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Zapata-Rivera, Diego; Hegarty, Mary

    2014-01-01

    Research has shown that many educators do not understand the terminology or displays used in test score reports and that measurement error is a particularly challenging concept. We investigated graphical and verbal methods of representing measurement error associated with individual student scores. We created four alternative score reports, each…

  12. Getting the Bigger Picture: Children's Utilization of Graphics and Text

    ERIC Educational Resources Information Center

    Norman, Rebecca R.; Roberts, Kathryn L.

    2015-01-01

    This study examined 30 second graders' patterns of attention to graphics (e.g., maps, diagrams, photographs, illustrations) and their illustration extensions (e.g., captions, labels) in two informational texts, and how students processed these items (e.g., creating narrative or evaluating). Results indicate that students do tend to study different…

  13. Data visualization, bar naked: A free tool for creating interactive graphics.

    PubMed

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
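The core problem the tool addresses, namely that many different distributions can produce the same bar graph, can be shown with invented data; the two groups below are illustrative, not from the paper:

```python
import numpy as np

# Two hypothetical study groups: a bar graph of group means would look
# identical for both, yet the underlying distributions differ completely.
group_a = np.array([4.9, 5.0, 5.1, 5.0, 5.0])   # tightly clustered
group_b = np.array([1.0, 1.2, 5.0, 8.8, 9.0])   # widely spread

mean_a, mean_b = group_a.mean(), group_b.mean()  # both 5.0: same bar height
```

A univariate scatterplot, box plot, or violin plot of the raw points, of the kind the tool generates, makes the difference between the groups visible immediately.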

  14. Computer graphics testbed to simulate and test vision systems for space applications

    NASA Technical Reports Server (NTRS)

    Cheatham, John B.; Wu, Chris K.; Lin, Y. H.

    1991-01-01

    A system was developed for displaying computer graphics images of space objects, and its use was demonstrated as a testbed for evaluating vision systems for space applications. In order to evaluate vision systems, it is desirable to control all factors involved in creating the images processed by the vision system. Considerable time and expense are involved in building accurate physical models of space objects; precise location of the model relative to the viewer and accurate placement of the light source require additional effort. As part of this project, graphics models of space objects such as the Solarmax satellite were created in which the user can control the light direction and the relative position of the object and the viewer. The work is also aimed at providing control of hue, shading, noise and shadows for use in demonstrating and testing image processing techniques. The simulated camera data can provide XYZ coordinates, pitch, yaw, and roll for the models. A physical model is also being used to compare camera images with the graphics images.

  15. Reducing the motor response in haptic parallel matching eliminates the typically observed gender difference.

    PubMed

    van Mier, Hanneke I

    2016-01-01

    When making two bars haptically parallel to each other, large deviations have been observed, most likely caused by the bias of a hand-centered egocentric reference frame. A consistent finding is that women show significantly larger deviations than men when performing this task. It has been suggested that this difference might be due to the fact that women are more egocentrically oriented than men or are less efficient in overcoming the egocentric bias of the hand. If this is indeed the case, reducing the bias of the egocentric reference frame should eliminate the above-mentioned gender difference. This was investigated in the current study. Sixty participants (30 men, 30 women) were instructed to haptically match (task HP) the orientation of a test bar with the dominant hand to the orientation of a reference bar that was perceived with the non-dominant hand. In a haptic visual task (task HV), in which only the reference bar and exploring hand were out of view, no motor response was required, but participants had to "match" the perceived orientation by verbally naming the parallel orientation that was read out on a test protractor. Both females and males performed better in the HV task than in the HP task. Significant gender effects were only found in the haptic parallelity task (HP), corroborating the idea that women perform at the same level as men when the egocentric bias of the hand is reduced.

  16. Enhanced visuo-haptic integration for the non-dominant hand.

    PubMed

    Yalachkov, Yavor; Kaiser, Jochen; Doehrmann, Oliver; Naumer, Marcus J

    2015-07-21

    Visuo-haptic integration contributes essentially to object shape recognition. Although there has been a considerable advance in elucidating the neural underpinnings of multisensory perception, it is still unclear whether seeing an object and exploring it with the dominant hand elicits the same brain response as compared to the non-dominant hand. Using fMRI to measure brain activation in right-handed participants, we found that for both left- and right-hand stimulation the left lateral occipital complex (LOC) and anterior cerebellum (aCER) were involved in visuo-haptic integration of familiar objects. These two brain regions were then further investigated in another study, where unfamiliar, novel objects were presented to a different group of right-handers. Here the left LOC and aCER were more strongly activated by bimodal than unimodal stimuli only when the left but not the right hand was used. A direct comparison indicated that the multisensory gain of the fMRI activation was significantly higher for the left than the right hand. These findings are in line with the principle of "inverse effectiveness", implying that processing of bimodally presented stimuli is particularly enhanced when the unimodal stimuli are weak. This applies also when right-handed subjects see and simultaneously touch unfamiliar objects with their non-dominant left hand. Thus, the fMRI signal in the left LOC and aCER induced by visuo-haptic stimulation is dependent on which hand was employed for haptic exploration. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Towards a Teleoperated Needle Driver Robot with Haptic Feedback for RFA of Breast Tumors under Continuous MRI1

    PubMed Central

    Kokes, Rebecca; Lister, Kevin; Gullapalli, Rao; Zhang, Bao; MacMillan, Alan; Richard, Howard; Desai, Jaydev P.

    2009-01-01

    Objective The purpose of this paper is to explore the feasibility of developing a MRI-compatible needle driver system for radiofrequency ablation (RFA) of breast tumors under continuous MRI imaging while being teleoperated by a haptic feedback device from outside the scanning room. The developed needle driver prototype was designed and tested for both tumor targeting capability as well as RFA. Methods The single degree-of-freedom (DOF) prototype was interfaced with a PHANToM haptic device controlled from outside the scanning room. Experiments were performed to demonstrate MRI-compatibility and position control accuracy with hydraulic actuation, along with an experiment to determine the PHANToM’s ability to guide the RFA tool to a tumor nodule within a phantom breast tissue model while continuously imaging within the MRI and receiving force feedback from the RFA tool. Results Hydraulic actuation is shown to be a feasible actuation technique for operation in an MRI environment. The design is MRI-compatible in all aspects except for force sensing in the directions perpendicular to the direction of motion. Experiments confirm that the user is able to detect healthy vs. cancerous tissue in a phantom model when provided with both visual (imaging) feedback and haptic feedback. Conclusion The teleoperated 1-DOF needle driver system presented in this paper demonstrates the feasibility of implementing a MRI-compatible robot for RFA of breast tumors with haptic feedback capability. PMID:19303805

  18. When Neuroscience ‘Touches’ Architecture: From Hapticity to a Supramodal Functioning of the Human Brain

    PubMed Central

    Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C.; Pietrini, Pietro; Ricciardi, Emiliano

    2016-01-01

    In the last decades, the rapid growth of functional brain imaging methodologies allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting ‘neuro-architecture’ as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus risen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. Precisely, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people–environment relationships, and even provide empirical foundations for a renewed evidence-based design theory. PMID:27375542

  19. Modeling and Design of an Electro-Rheological Fluid Based Haptic System for Tele-Operation of Space Robots

    NASA Technical Reports Server (NTRS)

    Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph

    2000-01-01

    For many years, the robotics community sought to develop robots that could eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that there are some tasks humans can perform significantly better, but, due to associated hazards, distance, physical limitations and other causes, only robots can be employed to perform them. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome, and usually allow only a limited operator workspace. In this paper a novel haptic interface is presented that enables human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using constrained ElectroRheological Fluids (ERF), which change viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment are reflected to the user using this ERF device, in which the system viscosity changes proportionally to the force to be transmitted. In this paper, we present the results of our modeling, simulation, and initial testing of such an ERF-based haptic device.
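The force-reflection scheme in the last paragraph, in which the fluid's effective damping is raised in proportion to the force to be transmitted, can be sketched as follows; the constants and function name are hypothetical, not from the paper:

```python
def erf_resistive_force(f_remote, velocity, b0=0.5, alpha=0.3, b_max=5.0):
    """Sketch of ERF-based force reflection: the applied electric field raises
    the fluid's effective damping in proportion to the force measured at the
    remote end-effector, so the operator feels b(F) * v resisting motion.
    b0 (baseline damping), alpha (gain) and b_max (saturation) are
    hypothetical device constants in N*s/m units."""
    b = min(b0 + alpha * abs(f_remote), b_max)   # commanded damping
    return -b * velocity                          # resistive force on operator
```

Because the device is a brake rather than a motor, it can only resist the operator's motion, which is one reason such passive interfaces are attractive for safety.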

  20. Human haptic perception is interrupted by explorative stops of milliseconds

    PubMed Central

    Grunwald, Martin; Muniyandi, Manivannan; Kim, Hyun; Kim, Jung; Krause, Frank; Mueller, Stephanie; Srinivasan, Mandayam A.

    2014-01-01

    Introduction: The explorative scanning movements of the hands have been compared to those of the eyes. The visual process is known to be composed of alternating phases of saccadic eye movements and fixation pauses. Descriptive results suggest that during the haptic exploration of objects short movement pauses occur as well. The goal of the present study was to detect these “explorative stops” (ES) during one-handed and two-handed haptic explorations of various objects and patterns, and to measure their duration. Additionally, the associations between the following variables were analyzed: (a) between mean exploration time and duration of ES, (b) between certain stimulus features and ES frequency, and (c) the duration of ES during the course of exploration. Methods: Five different Experiments were used. The first two Experiments were classical recognition tasks of unknown haptic stimuli (A) and of common objects (B). In Experiment C space-position information of angle legs had to be perceived and reproduced. For Experiments D and E the PHANToM haptic device was used for the exploration of virtual (D) and real (E) sunken reliefs. Results: In each Experiment we observed explorative stops of different average durations. For Experiment A: 329.50 ms, Experiment B: 67.47 ms, Experiment C: 189.92 ms, Experiment D: 186.17 ms and Experiment E: 140.02 ms. Significant correlations were observed between exploration time and the duration of the ES. Also, ES occurred more frequently, but not exclusively, at defined stimulus features like corners, curves and the endpoints of lines. However, explorative stops do not occur every time a stimulus feature is explored. Conclusions: We assume that ES are a general aspect of human haptic exploration processes. We have tried to interpret the occurrence and duration of ES with respect to the Hypotheses-Rebuild-Model and the Limited Capacity Control System theory. PMID:24782797
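Detecting the "explorative stops" described above amounts to finding intervals where hand speed stays near zero for longer than a minimum duration. The sketch below uses an invented speed threshold and a synthetic 1 kHz trace; the study's actual detection criteria are not specified in the abstract:

```python
import numpy as np

def explorative_stops(speed, fs, speed_thresh=0.005, min_ms=20.0):
    """Find intervals where speed (m/s) stays below speed_thresh for at
    least min_ms milliseconds, given sampling rate fs (Hz).
    Returns a list of (start_index, duration_ms) pairs."""
    below = speed < speed_thresh
    stops, i, n = [], 0, len(below)
    while i < n:
        if below[i]:
            j = i
            while j < n and below[j]:       # extend the below-threshold run
                j += 1
            dur_ms = (j - i) * 1000.0 / fs
            if dur_ms >= min_ms:
                stops.append((i, dur_ms))
            i = j
        else:
            i += 1
    return stops

# Hypothetical 1 kHz trace: 300 ms of movement, a 150 ms pause, then movement.
fs = 1000
speed = np.concatenate([np.full(300, 0.05), np.zeros(150), np.full(300, 0.05)])
stops = explorative_stops(speed, fs)
```

On this trace the detector reports a single stop of 150 ms, in the range of the mean ES durations reported across the five experiments.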

  1. VSHC -- VAXstation VWS hardcopy

    NASA Astrophysics Data System (ADS)

    Huckle, H. E.; Clayton, C. A.

    VSHC works by running a detached process at boot time which executes a .EXE file that creates a permanent mailbox and redefines UISPRINT_DESTINATION to that mailbox. The program then enters an infinite loop that includes a read on that mailbox. When a hardcopy is initiated, sixel graphics commands are sent to UISPRINT_DESTINATION and thus go to the mailbox. The program reads those graphics commands from the mailbox and interprets them into equivalent Canon commands, using a `State Machine' technique to determine how far it has got, i.e. whether it is at the start of a plot, the end of a plot, the middle of a plot, the next plot, etc. It spools the file of Canon graphics commands thus created (in VSHC_SCRATCH:) to a queue pointed at by the logical name VSHC_QUEUE. UISPRINT_DESTINATION can be mysteriously reset to its default value of CSA0:, so every few minutes an AST timeout occurs to reset UISPRINT_DESTINATION.
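The state-machine technique the abstract describes, tracking whether the interpreter is at the start, middle, or end of a plot, can be sketched as a toy classifier. The record markers below are invented for illustration; a real sixel stream delimits plots with DCS/ST escape sequences rather than literal keywords:

```python
from enum import Enum, auto

class PlotState(Enum):
    IDLE = auto()      # waiting for a hardcopy to arrive in the mailbox
    IN_PLOT = auto()   # translating graphics commands into printer commands

def run_state_machine(records):
    """Classify each mailbox record as start/middle/end of a plot,
    remembering 'how far it has got' between records."""
    state, events = PlotState.IDLE, []
    for rec in records:
        if state is PlotState.IDLE and rec == "START":
            state = PlotState.IN_PLOT
            events.append("start of plot")
        elif state is PlotState.IN_PLOT and rec == "END":
            state = PlotState.IDLE
            events.append("end of plot")
        elif state is PlotState.IN_PLOT:
            events.append("middle of plot")
    return events
```

Keeping the position in the stream as explicit state is what lets the converter handle back-to-back plots arriving through the same mailbox.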

  2. SPARQLGraph: a web-based platform for graphically querying biological Semantic Web databases.

    PubMed

    Schweiger, Dominik; Trajanoski, Zlatko; Pabinger, Stephan

    2014-08-15

    Semantic Web has established itself as a framework for using and sharing data across applications and database boundaries. Here, we present a web-based platform for querying biological Semantic Web databases in a graphical way. SPARQLGraph offers an intuitive drag & drop query builder, which converts the visual graph into a query and executes it on a public endpoint. The tool integrates several publicly available Semantic Web databases, including the databases of the just recently released EBI RDF platform. Furthermore, it provides several predefined template queries for answering biological questions. Users can easily create and save new query graphs, which can also be shared with other researchers. This new graphical way of creating queries for biological Semantic Web databases considerably facilitates usability as it removes the requirement of knowing specific query languages and database structures. The system is freely available at http://sparqlgraph.i-med.ac.at.

  3. Writing a Scientific Paper II. Communication by Graphics

    NASA Astrophysics Data System (ADS)

    Sterken, C.

    2011-07-01

    This paper discusses facets of visual communication by way of images, graphs, diagrams and tabular material. Design types and elements of graphical images are presented, along with advice on how to create graphs, and on how to read graphical illustrations. This is done in astronomical context, using case studies and historical examples of good and bad graphics. Design types of graphs (scatter and vector plots, histograms, pie charts, ternary diagrams and three-dimensional surface graphs) are explicated, as well as the major components of graphical images (axes, legends, textual parts, etc.). The basic features of computer graphics (image resolution, vector images, bitmaps, graphical file formats and file conversions) are explained, as well as concepts of color models and of color spaces (with emphasis on aspects of readability of color graphics by viewers suffering from color-vision deficiencies). Special attention is given to the verity of graphical content, and to misrepresentations and errors in graphics and associated basic statistics. Dangers of dot joining and curve fitting are discussed, with emphasis on the perception of linearity, the issue of nonsense correlations, and the handling of outliers. Finally, the distinction between data, fits and models is illustrated.

  4. The Easy Way to Create Computer Slide Shows.

    ERIC Educational Resources Information Center

    Anderson, Mary Alice

    1995-01-01

    Discusses techniques for creating computer slide shows. Topics include memory; format; color use; HyperCard and CD-ROM; font styles and sizes; graphs and graphics; the slide show option; special effects; and tips for effective presentation. (Author/AEF)

  5. Computerized Fortune Cookies--a Classroom Treat.

    ERIC Educational Resources Information Center

    Reissman, Rose

    1996-01-01

    Discusses the use of fortune cookie fortunes in a middle school class combined with computer graphics, desktop publishing, and word processing technology to create writing assignments, games, and discussions. Topics include cultural contexts, and students creating their own fortunes. (LRW)

  6. A review of simulation platforms in surgery of the temporal bone.

    PubMed

    Bhutta, M F

    2016-10-01

    Surgery of the temporal bone is a high-risk activity in an anatomically complex area. Simulation enables rehearsal of such surgery. The traditional simulation platform is the cadaveric temporal bone, but in recent years other simulation platforms have been created, including plastic and virtual reality platforms. To undertake a review of simulation platforms for temporal bone surgery, specifically assessing their educational value in terms of validity and in enabling transition to surgery. Systematic qualitative review. Search of the Pubmed, CINAHL, BEI and ERIC databases. Assessment of reported outcomes in terms of educational value. A total of 49 articles were included, covering cadaveric, animal, plastic and virtual simulation platforms. Cadaveric simulation is highly rated as an educational tool, but there may be a ceiling effect on educational outcomes after drilling 8-10 temporal bones. Animal models show significant anatomical variation from man. Plastic temporal bone models offer much potential, but at present lack sufficient anatomical or haptic validity. Similarly, virtual reality platforms lack sufficient anatomical or haptic validity, but with technological improvements they are advancing rapidly. At present, cadaveric simulation remains the best platform for training in temporal bone surgery. Technological advances enabling improved materials or modelling mean that in the future plastic or virtual platforms may become comparable to cadaveric platforms, and also offer additional functionality including patient-specific simulation from CT data. © 2015 John Wiley & Sons Ltd.

  7. Parylene-coated ionic liquid-carbon nanotube actuators for user-safe haptic devices.

    PubMed

    Bubak, Grzegorz; Gendron, David; Ceseracciu, Luca; Ansaldo, Alberto; Ricci, Davide

    2015-07-22

    Simple fabrication, high power-to-weight and power-to-volume ratios, and the ability to operate in open air at low voltage make ionic electroactive polymer actuators highly attractive for haptic applications. Whenever direct tactile stimulation of the skin is involved, electrical and chemical insulation as well as long-term stability of the actuator are required. Because of its inherent physicochemical properties, such as high dielectric strength, resistance to solvents, and biological inactivity, Parylene C meets the requirements for making biocompatible actuators. We have studied the displacement and the generated force of Parylene-coated carbon nanotube actuators as well as the encapsulation quality. A 2 μm coating creates an effective electrical insulation of the actuators without altering the blocking force at frequencies from 50 mHz to 1 Hz. Moreover, the generated strain is preserved at higher frequencies (from 0.5 to 5 Hz). We employed a simple mechanical model to explain the relation between the key parameters (flexural stiffness, displacement, and force) for uncoated and coated actuators. In addition, we demonstrated that our Parylene-coated actuators are not damaged by rinsing in liquid media such as 2-propanol or water. In conclusion, our results indicate that Parylene C encapsulated actuators are safe to touch and can be used in contact with human skin and in biomedical applications in direct contact with tissues and physiological fluids.

  8. Multimodal Interaction with Speech, Gestures and Haptic Feedback in a Media Center Application

    NASA Astrophysics Data System (ADS)

    Turunen, Markku; Hakulinen, Jaakko; Hella, Juho; Rajaniemi, Juha-Pekka; Melto, Aleksi; Mäkinen, Erno; Rantala, Jussi; Heimonen, Tomi; Laivo, Tuuli; Soronen, Hannu; Hansen, Mervi; Valkama, Pellervo; Miettinen, Toni; Raisamo, Roope

    We demonstrate interaction with a multimodal media center application. The mobile phone-based interface includes speech and gesture input and haptic feedback. The setup resembles our long-term public pilot study, in which a living room environment containing the application was constructed inside a local media museum, allowing visitors to freely test the system.

  9. Grounded Learning Experience: Helping Students Learn Physics through Visuo-Haptic Priming and Instruction

    ERIC Educational Resources Information Center

    Huang, Shih-Chieh Douglas

    2013-01-01

    In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation…

  10. Study of Co-Located and Distant Collaboration with Symbolic Support via a Haptics-Enhanced Virtual Reality Task

    ERIC Educational Resources Information Center

    Yeh, Shih-Ching; Hwang, Wu-Yuin; Wang, Jin-Liang; Zhan, Shi-Yi

    2013-01-01

    This study intends to investigate how multi-symbolic representations (text, digits, and colors) could effectively enhance the completion of co-located/distant collaborative work in a virtual reality context. Participants' perceptions and behaviors were also studied. A haptics-enhanced virtual reality task was developed to conduct…

  11. Haptic device for a ventricular shunt insertion simulator.

    PubMed

    Panchaphongsaphak, Bundit; Stutzer, Diego; Schwyter, Etienne; Bernays, René-Ludwig; Riener, Robert

    2006-01-01

    In this paper we propose a new one-degree-of-freedom haptic device that can be used to simulate ventricular shunt insertion procedures. The device is used together with the BRAINTRAIN training simulator developed for neuroscience education, neurological data visualization and surgical planning. The design of the haptic device is based on a push-pull cable concept. The rendered forces produced by a linear motor connected at one end of the cable are transferred to the user via a sliding mechanism at the end-effector located at the other end of the cable. The end-effector provides a range of movement of up to 12 cm. The force is controlled by an open-loop impedance algorithm and can reach up to 15 N.
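An open-loop impedance law of the kind the abstract mentions can be sketched as a force proportional to insertion depth, saturated at the stated 15 N maximum; the stiffness value is a hypothetical tissue parameter, not taken from the paper:

```python
def render_shunt_force(depth_mm, k=1.2, f_max=15.0):
    """Open-loop impedance sketch for a 1-DOF shunt-insertion simulator:
    resistive force grows linearly with insertion depth (k in N/mm, a
    hypothetical tissue stiffness) and saturates at the motor's force
    limit f_max (15 N per the abstract)."""
    if depth_mm <= 0.0:
        return 0.0                      # tool not yet in contact
    return min(k * depth_mm, f_max)
```

"Open-loop" here means the commanded force depends only on the measured position, with no force sensor closing the loop, which keeps the cable-driven mechanism simple.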

  12. A novel shape-changing haptic table-top display

    NASA Astrophysics Data System (ADS)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are developed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit is proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit was capable of extracting texture data from 2D pictures for rendering the shape-changing surface and realizing 3D modeling. The preliminary evaluation results demonstrated the feasibility of the proposed system.

  13. Virtual reality robotic telesurgery simulations using MEMICA haptic system

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Mavroidis, Constantinos; Bouzit, Mourad; Dolgin, Benjamin; Harm, Deborah L.; Kopchok, George E.; White, Rodney

    2001-01-01

    The authors conceived a haptic mechanism called MEMICA (Remote Mechanical Mirroring using Controlled stiffness and Actuators) that can enable the design of high-dexterity, rapid-response, and large-workspace haptic systems. The development of novel MEMICA gloves and virtual reality models is being explored to allow simulation of telesurgery and other applications. The MEMICA gloves are being designed to provide intuitive mirroring of the conditions at a virtual site where a robot simulates the presence of a human operator. The key components of MEMICA are miniature electrically controlled stiffness (ECS) elements and electrically controlled force and stiffness (ECFS) actuators that are based on the use of Electro-Rheological Fluids (ERF). In this paper the design of the MEMICA system and initial experimental results are presented.

  14. Inertial constraints on limb proprioception are independent of visual calibration.

    PubMed

    Riley, M A; Turvey, M T

    2001-04-01

    When the coincidence of a limb's spatial axes and inertial eigenvectors is broken, haptic proprioception of the limb's position conforms to the eigenvectors. Additionally, when prisms break the coincidence between an arm's visual and actual positions, haptic proprioception is shifted toward the visual-spatial direction. In 3 experiments, variation of the arm's mass distribution was combined with prism adaptation to investigate the hypothesis that the proprioceptive effects of inertial and visual manipulations are additive. This hypothesis was supported across manipulations of plane of motion, body posture, proprioceptive target, and proprioceptive experience during prism adaptation. Haptic proprioception seems to depend on local, physical reference frames that are relative to the physical reference frames for the body's environmental position and orientation.

  15. Development of Quasi-3DOF upper limb rehabilitation system using ER brake: PLEMO-P1

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Fukushima, K.; Furusho, J.; Ozawa, T.

    2009-02-01

    In recent years, many researchers have studied the potential of robotics technology to assist and quantify motor function in neuro-rehabilitation. Several kinds of haptic devices have been developed and their efficacy evaluated in clinical tests, for example, upper limb training for patients with spasticity after stroke. However, almost all of these devices are active-type (motor-driven) haptic devices, which basically require a high-cost safety system compared with passive-type (brake-based) devices. In this study, we developed a new practical haptic device, 'PLEMO-P1', which adopts ER brakes as its force generators. In this paper, the mechanism of PLEMO-P1 and its software for reaching rehabilitation are described.

  16. The pits and falls of graphical presentation

    PubMed Central

    Sperandei, Sandro

    2014-01-01

    Graphics are powerful tools to communicate research results and to gain information from data. However, researchers should be careful when deciding which data to plot and the type of graphic to use, as well as other details. The consequence of bad decisions in these features varies from making research results unclear to distortions of these results, through the creation of “chartjunk” with useless information. This paper is not another tutorial about “good graphics” and “bad graphics”. Instead, it presents guidelines for graphic presentation of research results and some uncommon but useful examples to communicate basic and complex data types, especially multivariate model results, which are commonly presented only in tables. In the end, there are no answers here, just ideas meant to inspire others on how to create their own graphics. PMID:25351349

  17. Creating, Storing, and Dumping Low and High Resolution Graphics on the Apple IIe Microcomputer System.

    ERIC Educational Resources Information Center

    Fletcher, Richard K., Jr.

    This description of procedures for dumping high and low resolution graphics using the Apple IIe microcomputer system focuses on two special hardware configurations that are commonly used in schools--the Apple Dot Matrix Printer with the Apple Parallel Interface Card, and the Imagewriter Printer with the Apple Super Serial Interface Card. Special…

  18. Hospice Comics: Representations of Patient and Family Experience of Illness and Death in Graphic Novels.

    PubMed

    Czerwiec, M K; Huang, Michelle N

    2017-06-01

    Non-fiction graphic novels about illness and death created by patients and their loved ones have much to teach all readers. However, the bond of empathy made possible in the comic form may have special lessons for healthcare providers who read these texts and are open to the insights they provide.

  19. Guidance from the Graphical User Interface (GUI) Experience: What GUI Teaches about Technology Access.

    ERIC Educational Resources Information Center

    National Council on Disability, Washington, DC.

    This report investigates the use of the graphical user interface (GUI) in computer programs, the problems it creates for individuals with visual impairments or blindness, and advocacy efforts concerning this issue, which have been targeted primarily at Microsoft, producer of Windows. The report highlights the concerns of individuals with visual…

  20. A Graphical Journey of Innovative Organic Architectures that Have Improved Our Lives

    ERIC Educational Resources Information Center

    McGrath, Nicholas A.; Brichacek, Matthew; Njardarson, Jon T.

    2010-01-01

    A new free graphical teaching tool that highlights the beautiful organic architectures of the top selling pharmaceuticals is detailed on two posters. In addition to the multitude of teaching and data-mining opportunities these posters offer, they were also created to emphasize the central role organic chemists play in the development of new…

  1. NASA's Hybrid Reality Lab: One Giant Leap for Full Dive

    NASA Technical Reports Server (NTRS)

    Delgado, Francisco J.; Noyes, Matthew

    2017-01-01

    This presentation demonstrates how NASA is using consumer VR headsets, game engine technology, and NVIDIA GPUs to create highly immersive future training systems augmented with extremely realistic haptic feedback, sound, and additional sensory information, and how these can be used to improve the engineering workflow. Included in this presentation are an environment simulation of the ISS, where users can interact with virtual objects, handrails, and tracked physical objects while inside VR; integration of consumer VR headsets with the Active Response Gravity Offload System; and a space habitat architectural evaluation tool. Attendees will learn how the best elements of real and virtual worlds can be combined into a hybrid reality environment with tangible engineering and scientific applications.

  2. [Computer graphic display of retinal examination results. Software improving the quality of documenting fundus changes].

    PubMed

    Jürgens, Clemens; Grossjohann, Rico; Czepita, Damian; Tost, Frank

    2009-01-01

    Graphic documentation of retinal examination results in clinical ophthalmological practice is often done with pictures or in handwritten form. Popular software products used to describe changes in the fundus do not differ much from simple graphics programs that let the user insert, scale, and edit basic graphic elements such as a circle, rectangle, arrow, or text. Displaying the results of retinal examinations in a unified way is therefore difficult to achieve, so we devised and implemented modern software tools for this purpose. We created a computer program that makes it possible to form graphs of the fundus quickly and intuitively and to archive them digitally or print them. Especially for the needs of ophthalmological clinics, a set of standard digital symbols used to document the results of retinal examinations was developed and installed in a library of graphic symbols. These symbols are divided into the following categories: preoperative, postoperative, neovascularization, and retinopathy of prematurity. The appropriate symbol can be selected with a click of the mouse and dragged and dropped onto the canvas of the fundus. Current forms of documenting the results of retinal examinations are unsatisfactory because they are time consuming and imprecise, and unequivocal interpretation is difficult or in some cases impossible. Using the developed computer program, a sketch of the fundus can be created much more quickly than by hand drawing. Additionally, the quality of the medical documentation is enhanced by a system of well-described and standardized symbols. (1) Graphic symbols used to document the results of retinal examinations are a part of everyday clinical practice. (2) The designed computer program allows quick and intuitive graphical creation of fundus sketches that can be either digitally archived or printed.

  3. TIGER: A graphically interactive grid system for turbomachinery applications

    NASA Technical Reports Server (NTRS)

    Shih, Ming-Hsin; Soni, Bharat K.

    1992-01-01

    A numerical grid generation algorithm for the flow field about turbomachinery geometries is presented. A graphical user interface was developed with the FORMS Library to create an interactive, user-friendly working environment. This customized algorithm reduces the man-hours required to generate a grid for a turbomachinery geometry, compared with the use of general-purpose grid generation software. Bezier curves are utilized both interactively and automatically to achieve grid line smoothness and orthogonality. Graphical user interactions are provided in the algorithm, allowing the user to design and manipulate the grid lines with a mouse.
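
    As a minimal sketch of the curve machinery the abstract mentions, the snippet below evaluates a cubic Bezier curve and resamples points along it, the basic operation behind smoothing a grid line through chosen control points. The control points and sample count are illustrative assumptions; TIGER's actual smoothing scheme is not detailed in the abstract.

```python
# Sketch: evaluating a cubic Bezier curve, the primitive used for
# grid-line smoothing. Control points here are hypothetical examples.

def bezier3(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    s = 1.0 - t
    return tuple(
        s**3 * a + 3 * s**2 * t * b + 3 * s * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# A grid line can be resampled as a polyline along the smooth curve:
line = [bezier3((0, 0), (1, 2), (3, 2), (4, 0), i / 10) for i in range(11)]
```

    The curve interpolates its first and last control points, so the smoothed grid line still starts and ends at the original boundary nodes; interior control points pull the line toward orthogonality without moving the endpoints.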

  4. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.

    PubMed

    Danion, Frederic; Mathew, James; Flanagan, J Randall

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.
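
    The nonrigid mapping described above, a target behaving as a mass attached to the hand by a spring, can be sketched with a simple point-mass integration. The mass, stiffness, and damping values below are illustrative assumptions, not parameters from the study (damping is added for numerical stability).

```python
# Sketch of the nonrigid hand-target mapping: a point mass coupled to
# the hand position by a spring, integrated with semi-implicit Euler.
# All parameter values are illustrative, not taken from the paper.

def simulate_target(hand_pos, dt=0.001, m=0.1, k=40.0, c=1.0):
    """Integrate 1-D target motion given a sequence of hand positions."""
    x, v = hand_pos[0], 0.0          # target starts at the hand, at rest
    trajectory = []
    for h in hand_pos:
        f = k * (h - x) - c * v      # spring toward hand, viscous damping
        v += (f / m) * dt            # semi-implicit Euler step
        x += v * dt
        trajectory.append(x)
    return trajectory

# With a stationary hand, the target simply stays at the hand position:
traj = simulate_target([0.2] * 1000)
```

    With an oscillating hand input, the same loop produces the lag and overshoot that make the complex mapping harder to track than the rigid one.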

  6. Should drivers be operating within an automation-free bandwidth? Evaluating haptic steering support systems with different levels of authority.

    PubMed

    Petermeijer, Sebastiaan M; Abbink, David A; de Winter, Joost C F

    2015-02-01

    The aim of this study was to compare continuous versus bandwidth haptic steering guidance in terms of lane-keeping behavior, aftereffects, and satisfaction. An important human factors question is whether operators should be supported continuously or only when tolerance limits are exceeded. We aimed to clarify this issue for haptic steering guidance by investigating costs and benefits of both approaches in a driving simulator. Thirty-two participants drove five trials, each with a different level of haptic support: no guidance (Manual); guidance outside a 0.5-m bandwidth (Band1); a hysteresis version of Band1, which guided back to the lane center once triggered (Band2); continuous guidance (Cont); and Cont with double feedback gain (ContS). Participants performed a reaction time task while driving. Toward the end of each trial, the guidance was unexpectedly disabled to investigate aftereffects. All four guidance systems prevented large lateral errors (>0.7 m). Cont and especially ContS yielded smaller lateral errors and higher time to line crossing than Manual, Band1, and Band2. Cont and ContS yielded short-lasting aftereffects, whereas Band1 and Band2 did not. Cont yielded higher self-reported satisfaction and faster reaction times than Band1. Continuous and bandwidth guidance both prevent large driver errors. Continuous guidance yields improved performance and satisfaction over bandwidth guidance at the cost of aftereffects and variability in driver torque (indicating human-automation conflicts). The presented results are useful for designers of haptic guidance systems and support critical thinking about the costs and benefits of automation support systems.
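
    The two support philosophies compared above differ only in when the corrective torque is active. A minimal sketch, with illustrative gains (the 0.5 m band matches the abstract; the gain values do not come from the paper):

```python
# Sketch: continuous guidance applies corrective torque proportional to
# lateral error at all times; bandwidth guidance stays silent inside a
# tolerance band around the lane center. Gains are illustrative.

def continuous_torque(lateral_error, gain=1.0):
    """Corrective steering torque, active at all times."""
    return -gain * lateral_error

def bandwidth_torque(lateral_error, band=0.5, gain=1.0):
    """Torque only once the error leaves the +/- band (cf. Band1)."""
    if abs(lateral_error) <= band:
        return 0.0                   # automation-free bandwidth
    excess = lateral_error - band if lateral_error > 0 else lateral_error + band
    return -gain * excess
```

    The design trade-off in the study follows directly from this structure: the continuous law corrects small errors (better lane keeping, but drivers adapt to it and show aftereffects), while the bandwidth law leaves small errors to the driver.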

  7. Researching Haptics in Higher Education: The Complexity of Developing Haptics Virtual Learning Systems and Evaluating Its Impact on Students' Learning

    ERIC Educational Resources Information Center

    San Diego, Jonathan P.; Cox, Margaret J.; Quinn, Barry F. A.; Newton, Jonathan Tim; Banerjee, Avijit; Woolford, Mark

    2012-01-01

    hapTEL, an interdisciplinary project funded by two UK research councils from 2007 to 2011, involves a large interdisciplinary team (with undergraduate and post-graduate student participants) which has been developing and evaluating a virtual learning system within an HE healthcare education setting, working on three overlapping strands. Strand 1…

  8. Effects of a Haptic Augmented Simulation on K-12 Students' Achievement and Their Attitudes Towards Physics

    ERIC Educational Resources Information Center

    Civelek, Turhan; Ucar, Erdem; Ustunel, Hakan; Aydin, Mehmet Kemal

    2014-01-01

    The current research aims to explore the effects of a haptic augmented simulation on students' achievement and their attitudes towards Physics in an immersive virtual reality environment (VRE). A quasi-experimental post-test design was employed utilizing experiment and control groups. The participants were 215 students from a K-12 school in…

  9. Physics-based approach to haptic display

    NASA Technical Reports Server (NTRS)

    Brown, J. Michael; Colgate, J. Edward

    1994-01-01

    This paper addresses the implementation of complex multiple degree of freedom virtual environments for haptic display. We suggest that a physics based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.

  10. Neodymium:YAG laser cutting of intraocular lens haptics.

    PubMed

    Gorn, R A; Steinert, R F

    1985-11-01

    Neodymium:YAG laser cutting of polymethylmethacrylate and polypropylene anterior chamber and posterior chamber intraocular lens haptics was studied in terms of ease of transection and physical structure of the cut areas as seen by scanning electron microscopy. A marked difference was discovered, with the polymethylmethacrylate cutting easily along transverse planes, whereas the polypropylene resisted cutting along longitudinal fibers. Clinical guidelines are presented.

  11. A Comparison of the Effects of Depth Rotation on Visual and Haptic Three-Dimensional Object Recognition

    ERIC Educational Resources Information Center

    Lawson, Rebecca

    2009-01-01

    A sequential matching task was used to compare how the difficulty of shape discrimination influences the achievement of object constancy for depth rotations across haptic and visual object recognition. Stimuli were nameable, 3-dimensional plastic models of familiar objects (e.g., bed, chair) and morphs midway between these endpoint shapes (e.g., a…

  12. Touch influences perceived gloss

    PubMed Central

    Adams, Wendy J.; Kerrigan, Iona S.; Graf, Erich W.

    2016-01-01

    Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Visual cues to material qualities such as gloss have recently received attention, but how they interact with haptic (touch) information has been largely overlooked. Here, we show that touch modulates gloss perception: objects that feel slippery are perceived as glossier (shinier). Participants explored virtual objects that varied in look and feel. A discrimination paradigm (Experiment 1) revealed that observers integrate visual gloss with haptic information. Observers could easily detect an increase in glossiness when it was paired with a decrease in friction. In contrast, increased glossiness coupled with decreased slipperiness produced a small perceptual change: the visual and haptic changes counteracted each other. Subjective ratings (Experiment 2) reflected a similar interaction: slippery objects were rated as glossier and vice versa. The sensory system treats visual gloss and haptic friction as correlated cues to surface material. Although friction is not a perfect predictor of gloss, the visual system appears to know and use a probabilistic relationship between these variables to bias perception, a sensible strategy given the ambiguity of visual cues to gloss. PMID:26915492
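
    Cue integration of this kind is commonly modeled as reliability-weighted averaging of the two sensory estimates. The sketch below shows that standard model; mapping friction onto an implied gloss estimate, and all numeric values, are illustrative assumptions rather than quantities from the paper.

```python
# Sketch: inverse-variance (reliability) weighted combination of a visual
# gloss estimate and a gloss estimate implied by haptic friction.
# All values are illustrative, not fitted to the study's data.

def integrate_cues(visual_gloss, haptic_gloss, var_visual, var_haptic):
    """Combine two noisy estimates, weighting each by its reliability."""
    w_v = (1.0 / var_visual) / (1.0 / var_visual + 1.0 / var_haptic)
    return w_v * visual_gloss + (1.0 - w_v) * haptic_gloss

# A slippery surface (implying high gloss haptically) pulls the combined
# estimate above the purely visual one:
estimate = integrate_cues(0.4, 0.8, var_visual=0.01, var_haptic=0.04)
```

    When the two cues move in opposite directions, the weighted sum changes little, which is exactly the counteraction the discrimination experiment reports.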

  13. Visualizing Without Vision at the Microscale: Students With Visual Impairments Explore Cells With Touch

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Minogue, James; Oppewal, Tom; Cook, Michelle P.; Broadwell, Bethany

    2006-12-01

    Science instruction is typically highly dependent on visual representations of scientific concepts that are communicated through textbooks, teacher presentations, and computer-based multimedia materials. Little is known about how students with visual impairments access and interpret these types of visually-dependent instructional materials. This study explored the efficacy of new haptic (simulated tactile feedback and kinesthetics) instructional technology for teaching cell morphology and function to middle and high school students with visual impairments. The study examined students' prior experiences learning about the cell and cell functions in classroom instruction, as well as how haptic feedback technology impacted students' awareness of the 3-D nature of an animal cell, the morphology and function of cell organelles, and students' interest in the haptic technology as an instructional tool. Twenty-one students with visual impairment participated in the study. Students explored a tactile model of the cell with a haptic point probe that allowed them to feel the cell and its organelles. Results showed that students made significant gains in their ability to identify cell organelles and found the technology to be highly interesting as an instructional tool. The need for additional adaptive technology for students with visual impairments is discussed.

  14. Simulation and curriculum design: a global survey in dental education.

    PubMed

    Perry, S; Burrow, M F; Leung, W K; Bridges, S M

    2017-12-01

    Curriculum reforms are being driven by globalization and international standardization. Although new information technologies such as dental haptic virtual reality (VR) simulation systems have provided potential new possibilities for clinical learning in dental curricula, infusion into curricula requires careful planning. This study aimed to identify current patterns in the role and integration of simulation in dental degree curricula internationally. An original internet survey was distributed by invitation to clinical curriculum leaders in dental schools in Asia, Europe, North America, and Oceania (Australia and New Zealand). The results (N = 62) showed Asia, Europe and Oceania tended towards integrated curriculum designs with North America having a higher proportion of traditional curricula. North America had limited implementation of haptic VR simulation technology but reported the highest number of scheduled simulation hours. Australia and New Zealand were the most likely regions to incorporate haptic VR simulation technology. This survey indicated considerable variation in curriculum structure with regionally-specific preferences being evident in terms of curriculum structure, teaching philosophies and motivation for incorporation of VR haptic simulation into curricula. This study illustrates the need for an improved evidence base on dental simulations to inform curriculum designs and psychomotor skill learning in dentistry. © 2017 Australian Dental Association.

  15. Finite Element Methods for real-time Haptic Feedback of Soft-Tissue Models in Virtual Reality Simulators

    NASA Technical Reports Server (NTRS)

    Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)

    2001-01-01

    We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models of high-resolution 3D data (>10,000 nodes), haptic real time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused in the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real time computations, we propose parallel processing of a Jacobi preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can effectively be achieved using reconfigurable computing systems such as field programmable gate arrays (FPGA), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
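
    The proposed solver, a Jacobi-preconditioned conjugate gradient method applied to the stiffness system K u = f, can be sketched in a few lines. Dense Python lists stand in for the sparse FEM matrices and the FPGA parallelism of a real implementation; the 2x2 system at the end is an illustrative example only.

```python
# Sketch: conjugate gradient with a Jacobi (diagonal) preconditioner
# for a symmetric positive definite system K u = f, as named above.

def jacobi_pcg(K, f, tol=1e-10, max_iter=100):
    """Solve K u = f for SPD K, preconditioning with M = diag(K)."""
    n = len(f)
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    matvec = lambda A, x: [dot(row, x) for row in A]
    u = [0.0] * n
    r = f[:]                                        # residual (u starts at 0)
    z = [r[i] / K[i][i] for i in range(n)]          # Jacobi step M^-1 r
    p, rz = z[:], dot(r, z)
    for _ in range(max_iter):
        Kp = matvec(K, p)
        alpha = rz / dot(p, Kp)
        u = [ui + alpha * pi for ui, pi in zip(u, p)]
        r = [ri - alpha * kpi for ri, kpi in zip(r, Kp)]
        if dot(r, r) < tol:
            break
        z = [r[i] / K[i][i] for i in range(n)]
        rz_new = dot(r, z)
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return u

u = jacobi_pcg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

    The inner products and the matrix-vector product dominate the cost and are exactly the operations the authors propose to parallelize in hardware.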

  16. Intact haptic priming in normal aging and Alzheimer's disease: evidence for dissociable memory systems.

    PubMed

    Ballesteros, Soledad; Reales, José Manuel

    2004-01-01

    This study is the first to report complete priming in Alzheimer's disease (AD) patients and older control subjects for objects presented haptically. To investigate possible dissociations between implicit and explicit objects representations, young adults, Alzheimer's patients, and older controls performed a speeded object naming task followed by a recognition task. Similar haptic priming was exhibited by the three groups, although young adults responded faster than the two older groups. Furthermore, there was no difference in performance between the two healthy groups. On the other hand, younger and older healthy adults did not differ on explicit recognition while, as expected, AD patients were highly impaired. The double dissociation suggests that different memory systems mediate both types of memory tasks. The preservation of intact haptic priming in AD provides strong support to the idea that object implicit memory is mediated by a memory system that is different from the medial-temporal diencephalic system underlying explicit memory, which is impaired early in AD. Recent imaging and behavioral studies suggest that the implicit memory system may depend on extrastriate areas of the occipital cortex although somatosensory cortical mechanisms may also be involved.

  17. Collision detection and modeling of rigid and deformable objects in laparoscopic simulator

    NASA Astrophysics Data System (ADS)

    Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru

    2015-03-01

    Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated into virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at no less than 30 fps. Our current laparoscopic simulator detects the collision between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation response. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tool) and deformable (organ) objects and to perform the subsequent collision response, with haptic feedback, in real time.
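
    The multi-point idea above, testing several samples along the rigid shaft instead of only the tip, can be sketched as follows. A sphere stands in for the triangular-mesh organ model, and the sample count is an illustrative choice; the paper's actual detection scheme against the mesh is not reproduced here.

```python
# Sketch: multi-point collision test on a rigid tool rod. Sample points
# along the shaft are each tested against the organ surface (a sphere
# here as an illustrative stand-in for the organ mesh).

def rod_collisions(tip, base, organ_center, organ_radius, samples=8):
    """Return the sample points along the rod that penetrate the organ."""
    hits = []
    for i in range(samples + 1):
        t = i / samples
        # Interpolate from the instrument base to the tip.
        p = tuple(b + t * (a - b) for a, b in zip(tip, base))
        dist = sum((pi - ci) ** 2 for pi, ci in zip(p, organ_center)) ** 0.5
        if dist < organ_radius:      # inside the organ surface
            hits.append(p)
    return hits

hits = rod_collisions((0.0, 0.0, 0.0), (0.0, 0.0, 4.0), (0.0, 0.0, 0.0), 1.0)
```

    A shaft-only contact (rod brushing the organ while the tip is clear) is exactly the case a tip-only test misses and this sampling catches.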

  18. Control of an ER haptic master in a virtual slave environment for minimally invasive surgery applications

    NASA Astrophysics Data System (ADS)

    Han, Young-Min; Choi, Seung-Bok

    2008-12-01

    This paper presents the control performance of an electrorheological (ER) fluid-based haptic master device connected to a virtual slave environment that can be used for minimally invasive surgery (MIS). An already developed haptic joint featuring controllable ER fluid and a spherical joint mechanism is adopted for the master system. Medical forceps and an angular position measuring device are devised and integrated with the joint to establish the MIS master system. In order to embody a human organ in virtual space, a volumetric deformable object is used. The virtual object is then mathematically formulated by a shape-retaining chain-linked (S-chain) model. After evaluating the reflection force, computation time and compatibility with real-time control, the haptic architecture for MIS is established by incorporating the virtual slave with the master device so that the reflection force for the object of the virtual slave and the desired position for the master operator are transferred to each other. In order to achieve the desired force trajectories, a sliding mode controller is formulated and then experimentally realized. Tracking control performances for various force trajectories are evaluated and presented in the time domain.
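
    A sliding mode control law of the family named above can be sketched for force-trajectory tracking. The sliding surface, gains, and boundary layer width below are illustrative assumptions; the paper's actual plant model and controller derivation are not reproduced.

```python
# Sketch: sliding mode control for force tracking. The control drives a
# sliding surface s (a weighted sum of force error and its rate) toward
# zero; a saturated sign function (boundary layer) limits chattering.
# Gains lam, K, and layer width phi are illustrative values.

def smc_torque(force_error, force_error_rate, lam=5.0, K=2.0, phi=0.1):
    """Sliding mode control input with a boundary layer."""
    s = force_error_rate + lam * force_error      # sliding surface
    sat = max(-1.0, min(1.0, s / phi))            # saturated sign(s)
    return -K * sat
```

    Outside the boundary layer the control switches at full authority toward the surface; inside it the law degrades to a proportional controller, trading a small tracking error for smooth actuation, which matters for an ER-fluid joint.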

  19. 6-DoF Haptic Rendering Using Continuous Collision Detection between Points and Signed Distance Fields.

    PubMed

    Xu, Hongyi; Barbic, Jernej

    2017-01-01

    We present an algorithm for fast continuous collision detection between points and signed distance fields, and demonstrate how to robustly use it for 6-DoF haptic rendering of contact between objects with complex geometry. Continuous collision detection is often needed in computer animation, haptics, and virtual reality applications, but has so far only been investigated for polygon (triangular) geometry representations. We demonstrate how to robustly and continuously detect intersections between points and level sets of the signed distance field. We suggest using an octree subdivision of the distance field for fast traversal of distance field cells. We also give a method to resolve continuous collisions between point clouds organized into a tree hierarchy and a signed distance field, enabling rendering of contact between rigid objects with complex geometry. We investigate and compare two 6-DoF haptic rendering methods now applicable to point-versus-distance field contact for the first time: continuous integration of penalty forces, and a constraint-based method. An experimental comparison to discrete collision detection demonstrates that the continuous method is more robust and can correctly resolve collisions even under high velocities and during complex contact.
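
    The core query above, finding the earliest time at which a moving point crosses the zero level set of a signed distance field, can be sketched with interval bisection over one simulation step. Bisection stands in for the paper's exact traversal of distance-field cells (it assumes a single sign change over the step), and the spherical SDF is an illustrative stand-in for complex geometry.

```python
# Sketch: continuous collision detection between a point moving from p0
# to p1 and the zero level set of a signed distance field (SDF).

def sphere_sdf(p, center=(0.0, 0.0, 0.0), radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside."""
    return sum((a - c) ** 2 for a, c in zip(p, center)) ** 0.5 - radius

def first_contact(p0, p1, sdf, iters=40):
    """Earliest t in [0, 1] with sdf(lerp(p0, p1, t)) <= 0, else None."""
    lerp = lambda t: tuple(a + t * (b - a) for a, b in zip(p0, p1))
    if sdf(p0) <= 0.0:
        return 0.0                   # already in contact at step start
    if sdf(p1) > 0.0:
        return None                  # no sign change over this step
    lo, hi = 0.0, 1.0                # invariant: sdf(lo) > 0 >= sdf(hi)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sdf(lerp(mid)) > 0.0:
            lo = mid
        else:
            hi = mid
    return hi

t = first_contact((3.0, 0.0, 0.0), (0.0, 0.0, 0.0), sphere_sdf)
```

    Unlike a discrete test at the endpoints only, this catches the contact even when the point would tunnel through a thin feature within one step, which is the robustness property the paper's comparison highlights.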

  20. Passive haptics in a knee arthroscopy simulator: is it valid for core skills training?

    PubMed

    McCarthy, Avril D; Moody, Louise; Waterworth, Alan R; Bickerstaff, Derek R

    2006-01-01

    Previous investigation of a cost-effective virtual reality arthroscopic training system, the Sheffield Knee Arthroscopy Training System (SKATS), indicated the desirability of including haptic feedback. A formal task analysis confirmed the importance of knee positioning as a core skill for trainees learning to navigate the knee arthroscopically. The system cost and existing limb interface, which permits knee positioning, would be compromised by the addition of commercial active haptic devices available currently. The validation results obtained when passive haptic feedback (resistance provided by physical structures) is provided indicate that SKATS has construct, predictive and face validity for navigation and triangulation training. When tested using SKATS, experienced surgeons (n = 11) performed significantly faster, located significantly more pathologies, and showed significantly shorter arthroscope path lengths than a less experienced surgeon cohort (n = 12). After SKATS training sessions, novices (n = 3) showed significant improvements in: task completion time, shorter arthroscope path lengths, shorter probe path lengths, and fewer arthroscope tip contacts. Main improvements occurred after the first two practice sessions, indicating rapid familiarization and a training effect. Feedback from questionnaires completed by orthopaedic surgeons indicates that the system has face validity for its remit of basic arthroscopic training.
